Sample records for development effort estimation

  1. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is an automated software-development estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
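
    For context, a minimal sketch of the kind of parametric computation such a tool automates, using the published Basic COCOMO organic-mode equations (COSTMODL incorporates COCOMO variants among its algorithms; the coefficients below are textbook Basic COCOMO values, not COSTMODL's calibrated ones):

    ```python
    # Basic COCOMO, organic mode: the effort/schedule/staffing arithmetic a
    # parametric estimator automates. Coefficients are Boehm's textbook values.

    def basic_cocomo_organic(ksloc: float) -> tuple[float, float, float]:
        effort = 2.4 * ksloc ** 1.05      # effort in person-months
        schedule = 2.5 * effort ** 0.38   # calendar schedule in months
        staffing = effort / schedule      # average full-time staff
        return effort, schedule, staffing

    effort, months, staff = basic_cocomo_organic(32.0)  # a 32 KSLOC product
    print(f"effort ~ {effort:.0f} PM, schedule ~ {months:.1f} mo, staff ~ {staff:.1f}")
    ```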

  2. FPA Depot - Web Application

    NASA Technical Reports Server (NTRS)

    Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam

    2011-01-01

    Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that coding accounts for only 30% to 35% of the total effort of a software project [8]. Because of this disadvantage, Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot, which uses function points instead of LOC to better estimate the hours needed to develop each piece of software. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
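
    As a rough illustration of the function point counting that the FPA Depot automates, the sketch below tallies unadjusted function points from the five standard IFPUG component types using their customary complexity weights (the weights are the standard published values; the component counts are invented for illustration and are not from the FPA Depot):

    ```python
    # Unadjusted function point (UFP) tally with standard IFPUG weights,
    # indexed by complexity (simple, average, complex).

    WEIGHTS = {
        "external_input":     (3, 4, 6),
        "external_output":    (4, 5, 7),
        "external_inquiry":   (3, 4, 6),
        "internal_file":      (7, 10, 15),
        "external_interface": (5, 7, 10),
    }
    COMPLEXITY = {"simple": 0, "average": 1, "complex": 2}

    def unadjusted_fp(counts):
        """counts: iterable of (component_type, complexity, how_many)."""
        return sum(WEIGHTS[t][COMPLEXITY[c]] * n for t, c, n in counts)

    ufp = unadjusted_fp([
        ("external_input",  "average", 12),
        ("external_output", "simple",   7),
        ("internal_file",   "complex",  3),
    ])
    print(ufp)  # 12*4 + 7*4 + 3*15 = 121
    ```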

  3. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
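
    The abstract does not spell out the '2cee' algorithm, but the flavor of calibrating while estimating can be sketched as an exhaustive search over the coefficients of a COCOMO-style power law scored against historical projects; everything below (the data, the MMRE error metric, the grid bounds) is an assumed stand-in, not the paper's procedure:

    ```python
    import numpy as np

    # Hypothetical historical projects: (KSLOC, actual effort in person-months).
    hist = np.array([[10, 24], [46, 130], [22, 58], [91, 310], [15, 40]], float)

    def mmre(a, b, data):
        """Mean magnitude of relative error for effort = a * size**b."""
        pred = a * data[:, 0] ** b
        return np.mean(np.abs(pred - data[:, 1]) / data[:, 1])

    # Exhaustive search over the model-parameter space, as the abstract describes.
    grid_a = np.linspace(0.5, 5.0, 91)
    grid_b = np.linspace(0.8, 1.4, 61)
    err, a, b = min((mmre(a, b, hist), a, b) for a in grid_a for b in grid_b)
    print(f"calibrated: effort = {a:.2f} * KSLOC^{b:.2f} (MMRE {err:.1%})")
    ```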

  4. Estimating Software Effort Hours for Major Defense Acquisition Programs

    ERIC Educational Resources Information Center

    Wallshein, Corinne C.

    2010-01-01

    Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…

  5. Technology Estimating 2: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.

    2014-01-01

    As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low-maturity technology research and development, where the Technology Readiness Level is less than TRL 6. NASA's Technology Roadmap consists of 14 technology areas. The focus of this continuing Technology Estimating effort included four Technology Areas (TA): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This research report continues the development of technology estimating efforts completed during 2013-2014, and addresses the refinement of parameters selected and recommended for use in the estimating process, where the parameters developed are applicable to Cost Estimating Relationships (CERs) used in the parametric cost estimating analysis. This research addresses the architecture for administration of the Technology Cost and Scheduling Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.

  6. A study of fault prediction and reliability assessment in the SEL environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Patnaik, Debabrata

    1986-01-01

    An empirical study on estimation and prediction of faults, prediction of fault detection and correction effort, and reliability assessment in the Software Engineering Laboratory (SEL) environment is presented. Fault estimation using empirical relationships and fault prediction using a curve-fitting method are investigated. Relationships between debugging efforts (fault detection and correction effort) in different test phases are provided, in order to make an early estimate of future debugging effort. The study concludes with fault analysis, application of a reliability model, and analysis of a normalized metric for reliability assessment and reliability monitoring during software development.

  7. Advanced methods of structural and trajectory analysis for transport aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1995-01-01

    This report summarizes the efforts in two areas: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of trajectory optimization. The majority of the effort was spent in the structural weight area. A draft of 'Analytical Fuselage and Wing Weight Estimation of Transport Aircraft', resulting from this research, is included as an appendix.

  8. Ask Pete, software planning and estimation through project characterization

    NASA Technical Reports Server (NTRS)

    Kurtz, T.

    2001-01-01

    Ask Pete was developed by NASA to provide a tool for integrating the estimation and planning activities for a software development effort. It incorporates COCOMO II estimating with NASA's software development practices and IV&V criteria to characterize a project. This characterization is then used to generate estimates and tailored planning documents.

  9. Earth resources data analysis program, phase 3

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Tasks were performed in two areas: (1) systems analysis and (2) algorithmic development. The major effort in the systems analysis task was the development of a recommended approach to the monitoring of resource utilization data for the Large Area Crop Inventory Experiment (LACIE). Other efforts included participation in various studies concerning the LACIE Project Plan, the utility of the GE Image 100, and the specifications for a special purpose processor to be used in the LACIE. In the second task, the major effort was the development of improved algorithms for estimating proportions of unclassified remotely sensed data. Also, work was performed on optimal feature extraction and optimal feature extraction for proportion estimation.

  10. Cost and schedule estimation study report

    NASA Technical Reports Server (NTRS)

    Condon, Steve; Regardie, Myrna; Stark, Mike; Waligora, Sharon

    1993-01-01

    This report describes the analysis performed and the findings of a study of the software development cost and schedule estimation models used by the Flight Dynamics Division (FDD), Goddard Space Flight Center. The study analyzes typical FDD projects, focusing primarily on those developed since 1982. The study reconfirms the standard SEL effort estimation model that is based on size adjusted for reuse; however, guidelines for the productivity and growth parameters in the baseline effort model have been updated. The study also produced a schedule prediction model based on empirical data that varies depending on application type. Models for the distribution of effort and schedule by life-cycle phase are also presented. Finally, this report explains how to use these models to plan SEL projects.
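
    A minimal sketch of the size-adjusted-for-reuse idea behind the SEL effort model: reused code is counted at a fraction of the weight of new code, and effort follows a power law in the adjusted size. The 20% reuse weight is the commonly cited SEL convention, and the productivity coefficients are illustrative placeholders, not the updated FDD guidelines reported here:

    ```python
    # SEL-style effort model: size adjusted for reuse, then a power law.
    REUSE_WEIGHT = 0.2   # commonly cited SEL weighting for reused code (assumed)
    A, B = 1.48, 0.98    # illustrative productivity calibration (assumed)

    def adjusted_ksloc(new_ksloc: float, reused_ksloc: float) -> float:
        return new_ksloc + REUSE_WEIGHT * reused_ksloc

    def sel_effort(new_ksloc: float, reused_ksloc: float) -> float:
        """Effort in staff-months from a power law in adjusted size."""
        return A * adjusted_ksloc(new_ksloc, reused_ksloc) ** B

    # 40 KSLOC of new code plus 60 KSLOC reused counts as 52 KSLOC adjusted.
    print(round(sel_effort(40.0, 60.0), 1))
    ```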

  11. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
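
    A minimal sketch of the analogy idea evaluated in the paper: predict a new project's effort as the mean effort of its k most similar historical projects in a normalized feature space (the feature set and data below are invented for illustration; the paper's model also involves clustering, which is not shown):

    ```python
    import numpy as np

    # Hypothetical historical projects: (KSLOC, team experience score,
    # requirements volatility score); y is actual effort in person-months.
    X = np.array([[12, 3, 2], [45, 4, 3], [30, 2, 4], [80, 5, 2], [20, 3, 3]], float)
    y = np.array([30.0, 120.0, 95.0, 200.0, 55.0])

    def knn_effort(x_new, X, y, k=2):
        lo, hi = X.min(axis=0), X.max(axis=0)
        Xn = (X - lo) / (hi - lo)                # normalize features to [0, 1]
        xn = (np.asarray(x_new, float) - lo) / (hi - lo)
        d = np.linalg.norm(Xn - xn, axis=1)      # Euclidean distance
        nearest = np.argsort(d)[:k]              # k most analogous projects
        return y[nearest].mean()                 # analogy-based estimate

    print(knn_effort([35, 3, 3], X, y, k=2))
    ```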

  12. A shuttle and space station manipulator system for assembly, docking, maintenance, cargo handling and spacecraft retrieval (preliminary design). Volume 3: Concept analysis. Part 2: Development program

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A preliminary estimate is presented of the resources required to develop the basic general-purpose walking boom manipulator system. It is assumed that the necessary full-scale zero-g test facilities will be available on a no-cost basis. A four-year development effort is also assumed, and it is phased with an estimated shuttle development program since the shuttle will be developed prior to the space station. Based on delivery of one qualification unit and one flight unit, and without including any ground support equipment or flight test support, it is estimated (within approximately ±25%) that a total of 3551 man-months of effort and $17,387,000 are required.

  13. Developing and validating a highway construction project cost estimation tool.

    DOT National Transportation Integrated Search

    2004-01-01

    In May 2002, Virginia's Commonwealth Transportation Commissioner tasked his Chief of Technology, Research & Innovation with leading an effort to develop a definitive, consistent, and well-documented approach for estimating the cost of delivering cons...

  14. DEVELOPMENT AND APPLICATION OF POPULATION MODELS TO SUPPORT EPA'S ECOLOGICAL RISK ASSESSMENT PROCESSES FOR PESTICIDES

    EPA Science Inventory

    As part of a broader exploratory effort to develop ecological risk assessment approaches to estimate potential chemical effects on non-target populations, we describe an approach for developing simple population models to estimate the extent to which acute effects on individual...

  15. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple, fully validated cost models that provide estimation uncertainty with the cost estimate. Based on the COCOMO variable set. Use machine learning techniques to determine: a) the minimum number of cost drivers required for NASA domain-based cost models; b) the minimum number of data records required; and c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: a) tasks funded by PA&E Cost Analysis; b) the IV&V Effort Estimation Task; and c) NASA SEPG activities.
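
    One way to read "minimum number of cost drivers required" is as a feature-subset search: score every subset of up to three candidate drivers by leave-one-out error; the size and membership of the best-scoring subset indicate how few drivers suffice. A sketch under that interpretation (the data, the linear model on log-effort, and the subset sizes are all assumptions):

    ```python
    import itertools
    import numpy as np

    # Synthetic records: 5 candidate cost drivers; target is log(effort).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 5))
    y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(scale=0.2, size=30)

    def loo_error(cols):
        """Leave-one-out mean absolute error of an OLS fit on those drivers."""
        errs = []
        for i in range(len(y)):
            m = np.ones(len(y), bool); m[i] = False
            A = np.column_stack([np.ones(m.sum()), X[m][:, cols]])
            coef, *_ = np.linalg.lstsq(A, y[m], rcond=None)
            errs.append(abs(np.concatenate([[1.0], X[i, cols]]) @ coef - y[i]))
        return np.mean(errs)

    err, cols = min((loo_error(c), c) for r in range(1, 4)
                    for c in itertools.combinations(range(5), r))
    print("best driver subset:", cols, "LOO error:", round(err, 3))
    ```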

  16. Estimates of wood energy demand for residential use in Alaska: an update

    Treesearch

    Jean M. Daniels; Michael D. Paruszkiewicz

    2016-01-01

    Efforts to amend the Tongass National Forest Land Management Plan have necessitated the development of several management scenarios to assist with planning efforts. One scenario focuses on increasing the utilization of sawmill residues and low-grade material as feedstock for expanding biomass energy markets. The development of a biomass industry is viewed as a solution...

  17. Development of collision dynamics models to estimate the results of full-scale rail vehicle impact tests : Tufts University Master's Thesis

    DOT National Transportation Integrated Search

    2000-11-01

    In an effort to study occupant survivability in train collisions, analyses and tests were conducted to understand and improve the crashworthiness of rail vehicles. A collision dynamics model was developed in order to estimate the rigid body motion of...

  18. Statistical control in hydrologic forecasting.

    Treesearch

    H.G. Wilm

    1950-01-01

    With rapidly growing development and uses of water, a correspondingly great demand has developed for advance estimates of the volumes or rates of flow which are supplied by streams. Therefore much attention is being devoted to hydrologic forecasting, and numerous methods have been tested in efforts to make increasingly reliable estimates of future supplies.

  19. Materials characterization efforts for ablative materials

    NASA Technical Reports Server (NTRS)

    Tytula, Thomas P.; Schad, Kristin C.; Swann, Myles H.

    1992-01-01

    Experimental efforts to develop a new procedure to measure char depth in carbon phenolic nozzle material are described. Using a Shore Type D durometer, hardness profiles were mapped across post-fired sample blocks and specimens from a fired rocket nozzle. Linear regression was used to estimate the char depth. Results are compared to those obtained from computed tomography in a comparative experiment. There was no significant difference in the depth estimates obtained by the two methods.

  20. Nuclear Cryogenic Propulsion Stage Affordable Development Strategy

    NASA Technical Reports Server (NTRS)

    Doughty, Glen E.; Gerrish, H. P.; Kenny, R. J.

    2014-01-01

    The development of nuclear power for space use in nuclear thermal propulsion (NTP) systems will involve significant expenditures of funds and require major technology development efforts. The development effort must be economically viable yet sufficient to validate the systems designed. Efforts are underway within the National Aeronautics and Space Administration's (NASA) Nuclear Cryogenic Propulsion Stage Project (NCPS) to study what a viable program would entail. The study will produce an integrated schedule, cost estimate and technology development plan. This will include the evaluation of various options for test facilities, types of testing and use of the engine, components, and technology developed. A "Human Rating" approach will also be developed and factored into the schedule, budget and technology development approach.

  1. The role of physical habitat and sampling effort on estimates of benthic macroinvertebrate taxonomic richness at basin and site scales.

    PubMed

    Silva, Déborah R O; Ligeiro, Raphael; Hughes, Robert M; Callisto, Marcos

    2016-06-01

    Taxonomic richness is one of the most important measures of biological diversity in ecological studies, including those with stream macroinvertebrates. However, it is impractical to measure the true richness of any site directly by sampling. Our objective was to evaluate the effect of sampling effort on estimates of macroinvertebrate family and Ephemeroptera, Plecoptera, and Trichoptera (EPT) genera richness at two scales: basin and stream site. In addition, we tried to determine which environmental factors at the site scale most influenced the amount of sampling effort needed. We sampled 39 sites in the Cerrado biome (neotropical savanna). In each site, we obtained 11 equidistant samples of the benthic assemblage and multiple physical habitat measurements. The observed basin-scale richness achieved a consistent estimation from Chao 1, Jack 1, and Jack 2 richness estimators. However, at the site scale, there was a constant increase in the observed number of taxa with increased number of samples. Models that best explained the slope of site-scale sampling curves (representing the necessity of greater sampling effort) included metrics that describe habitat heterogeneity, habitat structure, anthropogenic disturbance, and water quality, for both macroinvertebrate family and EPT genera richness. Our results demonstrate the importance of considering basin- and site-scale sampling effort in ecological surveys and that taxa accumulation curves and richness estimators are good tools for assessing sampling efficiency. The physical habitat explained a significant amount of the sampling effort needed. Therefore, future studies should explore the possible implications of physical habitat characteristics when developing sampling objectives, study designs, and calculating the needed sampling effort.
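
    For readers unfamiliar with the estimators named here, a minimal sketch of Chao 1 and first-order jackknife richness estimates (the formulas are the standard published ones; the counts are invented):

    ```python
    import numpy as np

    # abundances[i] = individuals of taxon i pooled across all samples.
    abundances = np.array([14, 9, 7, 5, 3, 2, 2, 1, 1, 1, 1])

    s_obs = int((abundances > 0).sum())   # observed richness
    f1 = int((abundances == 1).sum())     # singleton taxa
    f2 = int((abundances == 2).sum())     # doubleton taxa
    chao1 = s_obs + f1**2 / (2 * f2)      # Chao 1 estimator

    # First-order jackknife uses incidence across the m samples taken:
    m = 11                                # samples per site (as in the study design)
    q1 = 4                                # taxa found in exactly one sample (assumed)
    jack1 = s_obs + q1 * (m - 1) / m

    print(round(chao1, 1), round(jack1, 1))   # 15.0 14.6
    ```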

  2. Effort Drivers Estimation for Brazilian Geographically Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio

    To meet the requirements of today’s fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams due to cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and have possible differences in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.

  3. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.

  4. Improving PAGER's real-time earthquake casualty and loss estimation toolkit: a challenge

    USGS Publications Warehouse

    Jaiswal, K.S.; Wald, D.J.

    2012-01-01

    We describe the on-going developments of PAGER’s loss estimation models, and discuss value-added web content that can be generated related to exposure, damage and loss outputs for a variety of PAGER users. These developments include identifying vulnerable building types in any given area, estimating earthquake-induced damage and loss statistics by building type, and developing visualization aids that help locate areas of concern for improving post-earthquake response efforts. While detailed exposure and damage information is highly useful and desirable, significant improvements are still necessary in order to improve underlying building stock and vulnerability data at a global scale. Existing efforts with the GEM’s GED4GEM and GVC consortia will help achieve some of these objectives. This will benefit PAGER especially in regions where PAGER’s empirical model is less-well constrained; there, the semi-empirical and analytical models will provide robust estimates of damage and losses. Finally, we outline some of the challenges associated with rapid casualty and loss estimation that we experienced while responding to recent large earthquakes worldwide.

  5. Tourism Dependence in Rural America: Estimates and Effects

    Treesearch

    Donald B.K. English; David W. Marcouiller; H. Ken Cordell

    2000-01-01

    Recreation and tourism development continue to play an important role in reshaping rural America. Efforts to evaluate the effects of such development are complicated because residents and nonrecreation visitors also use the businesses that are affected by recreation and tourism visitors. We present a method for estimating jobs and income in nonmetropolitan counties that...

  6. A study on nonlinear estimation of submaximal effort tolerance based on the generalized MET concept and the 6MWT in pulmonary rehabilitation

    PubMed Central

    Szczegielniak, Jan; Łuniewski, Jacek; Stanisławski, Rafał; Bogacz, Katarzyna; Krajczy, Marcin; Rydel, Marek

    2018-01-01

    Background The six-minute walk test (6MWT) is considered to be a simple and inexpensive tool for the assessment of functional tolerance of submaximal effort. The aim of this work was 1) to provide background on the nonlinear nature of the energy expenditure process due to physical activity, 2) to compare the results/scores of the submaximal treadmill exercise test and those of the 6MWT in pulmonary patients and 3) to develop nonlinear mathematical models relating the two. Methods The study group included patients with COPD. All patients were subjected to a submaximal exercise test and a 6MWT. To develop an optimal mathematical solution and compare the results of the exercise test and the 6MWT, least squares and genetic algorithms were employed to estimate parameters of polynomial expansion and piecewise linear models. Results Mathematical analysis enabled the construction of nonlinear models for estimating the MET result of the submaximal exercise test based on average walk velocity (or distance) in the 6MWT. Conclusions Submaximal effort tolerance in COPD patients can be effectively estimated from new, rehabilitation-oriented, nonlinear models based on the generalized MET concept and the 6MWT. PMID:29425213
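
    A minimal sketch of the polynomial-model branch of the fitting described (the paired data are invented; the genetic-algorithm and piecewise-linear variants are not shown):

    ```python
    import numpy as np

    # Hypothetical pairs: 6MWT average walk velocity (km/h) vs. treadmill METs.
    velocity = np.array([2.8, 3.2, 3.6, 4.0, 4.4, 4.8, 5.2])
    mets     = np.array([2.1, 2.6, 3.3, 4.2, 5.3, 6.7, 8.4])

    # A second-order polynomial captures the nonlinear energy-expenditure trend.
    model = np.poly1d(np.polyfit(velocity, mets, deg=2))

    print(model)                  # fitted polynomial coefficients
    print(round(model(4.2), 2))   # estimated METs for a 4.2 km/h walker
    ```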

  7. Early detection monitoring for larval dreissenid mussels: How much plankton sampling is enough?

    USGS Publications Warehouse

    Counihan, Timothy D.; Bollens, Stephen M.

    2017-01-01

    The development of quagga and zebra mussel (dreissenids) monitoring programs in the Pacific Northwest provides a unique opportunity to evaluate a regional invasive species detection effort early in its development. Recent studies suggest that the ecological and economic costs of a dreissenid infestation in the Pacific Northwest of the USA would be significant. Consequently, efforts are underway to monitor for the presence of dreissenids. However, assessments of whether these efforts provide for early detection are lacking. We use information collected from 2012 to 2014 to characterize the development of larval dreissenid monitoring programs in the states of Idaho, Montana, Oregon, and Washington in the context of introduction and establishment risk. We also estimate the effort needed for high-probability detection of rare planktonic taxa in four Columbia and Snake River reservoirs and assess whether the current level of effort provides for early detection. We found that the effort expended to monitor for dreissenid mussels increased substantially from 2012 to 2014, that efforts were distributed across risk categories ranging from high to very low, and that substantial gaps in our knowledge of both introduction and establishment risk exist. The estimated volume of filtered water required to fully census planktonic taxa or to provide high-probability detection of rare taxa was high for the four reservoirs examined. We conclude that the current level of effort expended does not provide for high-probability detection of larval dreissenids or other planktonic taxa when they are rare in these reservoirs. We discuss options to improve early detection capabilities.
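
    The effort question has a simple closed form under the textbook assumption of independent samples with a constant per-sample detection probability p: reaching overall detection probability P requires n = ln(1 - P) / ln(1 - p) samples. A sketch with illustrative numbers (the 0.05 per-tow probability is assumed, not estimated from the study):

    ```python
    import math

    def samples_needed(p_per_sample: float, target_prob: float) -> int:
        """Samples so that P(detect at least once) >= target_prob, assuming
        independent samples with constant detection probability."""
        return math.ceil(math.log(1 - target_prob) / math.log(1 - p_per_sample))

    # If one plankton tow detects rare larvae with probability 0.05, then
    # detection with 95% probability requires:
    print(samples_needed(0.05, 0.95))  # 59 tows
    ```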

  8. Advances in hypersonic vehicle synthesis with application to studies of advanced thermal protection system

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1995-01-01

    This report summarizes the work entitled 'Advances in Hypersonic Vehicle Synthesis with Application to Studies of Advanced Thermal Protection Systems.' The effort was in two areas: (1) development of advanced methods of trajectory and propulsion system optimization; and (2) development of advanced methods of structural weight estimation. The majority of the effort was spent in the trajectory area.

  9. Software Effort Estimation Accuracy: A Comparative Study of Estimations Based on Software Sizing and Development Methods

    ERIC Educational Resources Information Center

    Lafferty, Mark T.

    2010-01-01

    The number of project failures and those projects completed over cost and over schedule has been a significant issue for software project managers. Among the many reasons for failure, inaccuracy in software estimation--the basis for project bidding, budgeting, planning, and probability estimates--has been identified as a root cause of a high…

  10. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…

  11. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.

  12. Cost Estimation Techniques for C3I System Software.

    DTIC Science & Technology

    1984-07-01

    ... development man-months have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours ... development schedule: 1.23, 1.00, 1.10 ... Detailed Model: The final codification of the COCOMO regressions was the development of separate effort ... regardless of the software structure level being estimated: DEVC -- the expected development computer (maxi, midi, mini, micro); MODE -- the expected ...

  13. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.

  14. SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often-used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zilog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.

  15. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1993-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures, which consequently necessitates thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.

  16. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Astrophysics Data System (ADS)

    Scott, Elaine P.

    1993-12-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures, which consequently necessitates thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.

  17. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model-enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.

  18. Development and validation of an automated operational modal analysis algorithm for vibration-based monitoring and tensile load estimation

    NASA Astrophysics Data System (ADS)

    Rainieri, Carlo; Fabbrocino, Giovanni

    2015-08-01

    In the last few decades large research efforts have been devoted to the development of methods for automated detection of damage and degradation phenomena at an early stage. Modal-based damage detection techniques are well-established methods, whose effectiveness for Level 1 (existence) and Level 2 (location) damage detection is demonstrated by several studies. The indirect estimation of tensile loads in cables and tie-rods is another attractive application of vibration measurements. It provides interesting opportunities for cheap and fast quality checks in the construction phase, as well as for safety evaluations and structural maintenance over the structure lifespan. However, the lack of automated modal identification and tracking procedures has been for long a relevant drawback to the extensive application of the above-mentioned techniques in the engineering practice. An increasing number of field applications of modal-based structural health and performance assessment are appearing after the development of several automated output-only modal identification procedures in the last few years. Nevertheless, additional efforts are still needed to enhance the robustness of automated modal identification algorithms, control the computational efforts and improve the reliability of modal parameter estimates (in particular, damping). This paper deals with an original algorithm for automated output-only modal parameter estimation. Particular emphasis is given to the extensive validation of the algorithm based on simulated and real datasets in view of continuous monitoring applications. The results point out that the algorithm is fairly robust and demonstrate its ability to provide accurate and precise estimates of the modal parameters, including damping ratios. As a result, it has been used to develop systems for vibration-based estimation of tensile loads in cables and tie-rods. Promising results have been achieved for non-destructive testing as well as continuous monitoring purposes. They are documented in the last sections of the paper.

  19. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.

  20. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. Another effort was related to the development of the experimental techniques. Initial experiments required a resistance heater placed between two samples. The design was modified such that the heater was placed on the surface of only one sample, as would be necessary in the analysis of built-up structures. Experiments using the modified technique were conducted on the composite sample used previously at different temperatures. The results were within 5 percent of those found using two samples. Finally, an initial heat transfer analysis, including conduction, convection and radiation components, was completed on a titanium sandwich structural sample. Experiments utilizing this sample are currently being designed and will be used to first estimate the material's effective thermal conductivity and later to determine the properties associated with each individual heat transfer component.

  1. The Volpe Center's inter-regional auto trip model : a review of methodology, assumptions, and application

    DOT National Transportation Integrated Search

    1994-10-31

    The Volpe Center first estimated an inter-regional auto trip model as part of its effort to assess the market feasibility of maglev for the National Maglev Initiative (NMI). The original intent was to develop a direct demand model for estimating inte...

  2. Integrated Land-Use, Transportation and Environmental Modeling: The Vermont Integrated Land-Use and Transportation Carbon Estimator

    DOT National Transportation Integrated Search

    2012-05-01

    The Vermont Integrated Land-Use and Transportation Carbon Estimator (VILTCE) project is part of a larger effort to develop environmental metrics related to travel, and to integrate these tools into a travel model under UVM TRC Signature Project No. 1...

  3. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  4. Earth resources data analysis program, phase 2

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The efforts and findings of the Earth Resources Data Analysis Program are summarized. Results of a detailed study of the needs of EOD with respect to an applications development system (ADS) for the analysis of remotely sensed data, including an evaluation of four existing systems with respect to these needs, are described. Recommendations as to possible courses for EOD to follow to obtain a viable ADS are presented. Algorithmic development comprised of several subtasks is discussed. These subtasks include the following: (1) two algorithms for multivariate density estimation; (2) a data smoothing algorithm; (3) a method for optimally estimating prior probabilities of unclassified data; and (4) further applications of the modified Cholesky decomposition in various calculations. Little effort was expended on task 3; however, two reports were reviewed.

  5. Battery Calendar Life Estimator Manual: Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2012-10-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.

  6. Battery Life Estimator Manual: Linear Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2009-08-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.
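
    A minimal sketch of the kind of degradation-model fit such a manual standardizes, assuming square-root-of-time capacity fade and an 80%-of-rated end-of-life criterion; the data, the t**0.5 fade form, and the criterion are all assumptions for illustration, not the BLE manual's prescribed procedure:

    ```python
    import numpy as np

    # Hypothetical capacity measurements (fraction of rated) vs. storage time.
    t_weeks  = np.array([0, 4, 8, 16, 32, 52], float)
    capacity = np.array([1.000, 0.985, 0.979, 0.970, 0.958, 0.946])

    # Least-squares fit of C(t) = 1 - k * sqrt(t):
    # minimizing sum((1 - C_i - k*sqrt(t_i))**2) gives the closed form below.
    k = np.sum(np.sqrt(t_weeks) * (1 - capacity)) / np.sum(t_weeks)

    # Calendar life = time until capacity falls to 80% of rated.
    life_weeks = ((1 - 0.80) / k) ** 2
    print(f"k = {k:.4f} /sqrt(week); calendar life ~ {life_weeks / 52:.1f} years")
    ```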

  7. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors that affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.

  8. Estimating rates of local species extinction, colonization and turnover in animal communities

    USGS Publications Warehouse

    Nichols, James D.; Boulinier, T.; Hines, J.E.; Pollock, K.H.; Sauer, J.R.

    1998-01-01

    Species richness has been identified as a useful state variable for conservation and management purposes. Changes in richness over time provide a basis for predicting and evaluating community responses to management, to natural disturbance, and to changes in factors such as community composition (e.g., the removal of a keystone species). Probabilistic capture-recapture models have been used recently to estimate species richness from species count and presence-absence data. These models do not require the common assumption that all species are detected in sampling efforts. We extend this approach to the development of estimators useful for studying the vital rates responsible for changes in animal communities over time; rates of local species extinction, turnover, and colonization. Our approach to estimation is based on capture-recapture models for closed animal populations that permit heterogeneity in detection probabilities among the different species in the sampled community. We have developed a computer program, COMDYN, to compute many of these estimators and associated bootstrap variances. Analyses using data from the North American Breeding Bird Survey (BBS) suggested that the estimators performed reasonably well. We recommend estimators based on probabilistic modeling for future work on community responses to management efforts as well as on basic questions about community dynamics.

  9. Phase I Forest Area Estimation Using Landsat TM and Iterative Guided Spectral Class Rejection: Assessment of Possible Training Data Protocols

    Treesearch

    John A. Scrivani; Randolph H. Wynne; Christine E. Blinn; Rebecca F. Musy

    2001-01-01

    Two methods of training data collection for automated image classification were tested in Virginia as part of a larger effort to develop an objective, repeatable, and low-cost method to provide forest area classification from satellite imagery. The derived forest area estimates were compared to estimates derived from a traditional photo-interpreted, double sample. One...

  10. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS?s Scientific Manuscript database

    Recently, an instrument (TEMPO(TM)) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or Petrifilm(TM) for estimating the t...

  11. Enabling Software Acquisition Improvement: Government and Industry Software Development Team Acquisition Model

    DTIC Science & Technology

    2010-04-30

    ... previous and current complex SW development efforts, the program offices will have a source of objective lessons learned and metrics that can be applied ...

  12. Develop metrics of tire debris on Texas highways : technical report.

    DOT National Transportation Integrated Search

    2017-05-01

    This research effort estimated the amount, characteristics, costs, and safety implications of tire debris on Texas highways. The metrics developed by this research are based on several sources of data, including a statewide survey of debris removal p...

  13. Linear-Quadratic-Gaussian Regulator Developed for a Magnetic Bearing

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.

    2002-01-01

    Linear-Quadratic-Gaussian (LQG) control is a modern state-space technique for designing optimal dynamic regulators. It enables us to trade off regulation performance and control effort, and to take into account process and measurement noise. The Structural Mechanics and Dynamics Branch at the NASA Glenn Research Center has developed an LQG control for a fault-tolerant magnetic bearing suspension rig to optimize system performance and to reduce the sensor and processing noise. The LQG regulator consists of an optimal state-feedback gain and a Kalman state estimator. The first design step is to seek a state-feedback law that minimizes the cost function of regulation performance, which is measured by a quadratic performance criterion with user-specified weighting matrices, and to define the tradeoff between regulation performance and control effort. The next design step is to derive a state estimator using a Kalman filter because the optimal state feedback cannot be implemented without full state measurement. Since the Kalman filter is an optimal estimator when dealing with Gaussian white noise, it minimizes the asymptotic covariance of the estimation error.
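
    A minimal sketch of the two design steps the abstract walks through: an LQR state-feedback gain from one algebraic Riccati equation and a Kalman estimator gain from its dual. The double-integrator plant and the weighting/noise matrices below are placeholders, not the magnetic-bearing model:

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Placeholder plant: double integrator x' = Ax + Bu, y = Cx.
    A = np.array([[0.0, 1.0], [0.0, 0.0]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])

    # Step 1: state feedback u = -Kx minimizing the integral of x'Qx + u'Ru;
    # Q and R set the tradeoff between regulation performance and control effort.
    Q = np.diag([10.0, 1.0])
    R = np.array([[0.1]])
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)          # LQR gain

    # Step 2: Kalman estimator gain from the dual Riccati equation, with
    # assumed process (W) and measurement (V) noise covariances.
    W = np.diag([1e-3, 1e-3])
    V = np.array([[1e-4]])
    S = solve_continuous_are(A.T, C.T, W, V)
    L = S @ C.T @ np.linalg.inv(V)           # Kalman gain

    print("LQR gain K:", K.ravel(), " Kalman gain L:", L.ravel())
    ```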

  14. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  15. An ecosystem-based assessment of hairtail (Trichiurus lepturus) harvested by multi-gears and management implications in Korean waters

    NASA Astrophysics Data System (ADS)

    Kang, Hee Joong; Zhang, Chang Ik; Lee, Eun Ji; Seo, Young Il

    2015-06-01

    Hairtail (Trichiurus lepturus) has been traditionally harvested by multiple gear types in the Yellow Sea and the East China Sea, but not in the East Sea (Sea of Japan), in Korean waters. Six different fishery types (offshore stownet, offshore longline, large pair-trawl, large purse seine, large otter trawl, and offshore angling fisheries) target the hairtail stock, accounting for about 90% of the total annual catch. We attempted to develop an ecosystem-based fisheries assessment approach, which determines the optimal allocation of catch quotas and fishing efforts for major fisheries. We conducted standardization of fishing effort for the six types of hairtail fisheries using a general linear model (GLM), and then estimated maximum sustainable yield (MSY) and maximum economic yield (MEY). MSY and MEY for the hairtail stock were estimated as 100,151 mt and 97,485 mt, respectively. In addition, we carried out an ecosystem-based risk analysis to obtain a species risk index (SRI), which was applied to adjusting the optimal proportion of fishing effort for the six hairtail fisheries as a penalty or an incentive. As a result, fishing effort ratios were adjusted by SRI for the six fishery types. Also, the total allowable catch (TAC) was estimated as 97,485 mt, and the maximum net profit at TAC by the hairtail fisheries was estimated as 778 billion won (USD 765 million).
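
    For a sense of the MSY arithmetic, a sketch under the assumption of a Schaefer surplus-production model, where MSY = rK/4 and the effort producing it is E_MSY = r/(2q); the parameter values are illustrative, and the paper's actual procedure (GLM effort standardization feeding its own production model) is richer than this:

    ```python
    # Schaefer surplus-production arithmetic with assumed parameters
    # (not estimates for the Korean hairtail stock).
    r = 0.9        # intrinsic population growth rate (1/yr)
    K = 450_000    # carrying capacity (mt)
    q = 2.0e-6     # catchability coefficient

    msy = r * K / 4.0        # maximum sustainable yield (mt/yr)
    e_msy = r / (2.0 * q)    # standardized fishing effort at MSY

    print(f"MSY ~ {msy:,.0f} mt/yr at effort ~ {e_msy:,.0f} standardized units")
    ```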

  16. An investigation of hydraulic conductivity estimation in a ground-water flow study of Northern Long Valley, New Jersey

    USGS Publications Warehouse

    Hill, Mary C.

    1985-01-01

    The purpose of this study was to develop a methodology for investigating the aquifer characteristics and water supply potential of an aquifer system. In particular, the geohydrology of northern Long Valley, New Jersey, was investigated. Geohydrologic data were collected and analyzed to characterize the site. Analysis was accomplished by interpreting the available data and by using a numerical simulation of the water-table aquifer. Special attention was given to the estimation of hydraulic conductivity values and hydraulic conductivity structure, which together define the hydraulic conductivity of the modeled aquifer. Hydraulic conductivity and all other aspects of the system were first estimated using the trial-and-error method of calibration. The estimation was then improved by using a least-squares method to estimate hydraulic conductivity values and by refining the parameter structure. These efforts improved the calibration of the model far more than a preceding period of similar effort using the trial-and-error method of calibration. In addition, the proposed method provides statistical information on the reliability of estimated hydraulic conductivity values, calculated heads, and calculated flows. The methodology developed and applied in this work proved to be of substantial value in the evaluation of the aquifer considered.
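
    The least-squares idea can be illustrated with a toy one-dimensional Darcy model standing in for the study's numerical aquifer simulation; all names and values below are hypothetical:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    q = 0.5                          # specific discharge (m/day), assumed known
    L = np.array([100.0, 150.0])     # lengths of two conductivity zones (m)
    h0 = 50.0                        # upstream head (m)

    def heads(K):
        """Predicted heads at the downstream edge of each zone (Darcy's law)."""
        return h0 - np.cumsum(q * L / K)

    K_true = np.array([5.0, 2.0])
    h_obs = heads(K_true) + np.random.default_rng(1).normal(0.0, 0.2, 2)

    fit = least_squares(lambda K: heads(K) - h_obs, x0=[1.0, 1.0])
    print("estimated zone conductivities:", fit.x)
    ```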

  17. The Demand for Scientific and Technical Manpower in Selected Energy-Related Industries, 1970-85: A Methodology Applied to a Selected Scenario of Energy Output. A Summary.

    ERIC Educational Resources Information Center

    Gutmanis, Ivars; And Others

    The primary purpose of the study was to develop and apply a methodology for estimating the need for scientists and engineers by specialty in energy and energy-related industries. The projections methodology was based on the Case 1 estimates by the National Petroleum Council of the results of "maximum efforts" to develop domestic fuel sources by…

  18. Predictors and Impact of Self-Reported Suboptimal Effort on Estimates of Prevalence of HIV-Associated Neurocognitive Disorders.

    PubMed

    Levine, Andrew J; Martin, Eileen; Sacktor, Ned; Munro, Cynthia; Becker, James

    2017-06-01

    Prevalence estimates of HIV-associated neurocognitive disorders (HAND) may be inflated, because they are derived from cohort studies in which participants may apply suboptimal effort on neurocognitive testing. Additionally, fluctuating HAND severity over time may be related to inconsistent effort. To address these hypotheses, we characterized effort in the Multicenter AIDS Cohort Study. After neurocognitive testing, 935 participants (525 HIV- and 410 HIV+) completed the visual analog effort scale (VAES), rating their effort from 0% to 100%. Those reporting <100% then indicated the reason(s) for suboptimal effort. K-means cluster analysis established 3 groups: high (mean = 97%), moderate (79%), and low effort (51%). Rates of HAND and other characteristics were compared between the groups. Linear regression examined the predictors of VAES score. Data from 57 participants who completed the VAES at 2 visits were analyzed to characterize the longitudinal relationship between effort and HAND severity. Fifty-two percent of participants reported suboptimal effort (<100%), with no difference between serostatus groups. Common reasons included "tired" (43%) and "distracted" (36%). The lowest effort group had higher rates of asymptomatic neurocognitive impairment and minor neurocognitive disorder diagnoses (25% and 33%) than the moderate (23% and 15%) and high (12% and 9%) effort groups. Predictors of suboptimal effort were self-reported memory impairment, African American race, and cocaine use. Change in effort between baseline and follow-up correlated with change in HAND severity. Suboptimal effort thus appears to inflate estimated HAND prevalence and to explain fluctuation of severity over time. A simple modification of study protocols to optimize effort is indicated by these results.
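
    The clustering step is straightforward to reproduce in outline; the sketch below groups simulated effort scores (not MACS data) into three clusters with scikit-learn:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Simulated VAES scores for 935 participants (illustrative only)
    rng = np.random.default_rng(0)
    vaes = np.concatenate([rng.normal(97, 3, 400),
                           rng.normal(79, 6, 300),
                           rng.normal(51, 10, 235)]).clip(0, 100)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vaes.reshape(-1, 1))
    for label in range(3):
        print(f"cluster {label}: mean effort = {vaes[km.labels_ == label].mean():.0f}%")
    ```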

  19. Earthquake Loss Estimates in Near Real-Time

    NASA Astrophysics Data System (ADS)

    Wyss, Max; Wang, Rongjiang; Zschau, Jochen; Xia, Ye

    2006-10-01

    Near-real-time loss estimates after major earthquakes are rapidly becoming more useful to rescue teams. The difference in the quality of data available in highly developed compared with developing countries dictates that different approaches be used to maximize mitigation efforts. In developed countries, extensive information from tax and insurance records, together with accurate census figures, furnishes detailed data on the fragility of buildings and on the number of people at risk. For example, these data are exploited by the loss-estimation method used in the Hazards U.S. Multi-Hazard (HAZUS-MH) software program (http://www.fema.gov/plan/prevent/hazus/). However, in developing countries, the population at risk is estimated from inferior data sources, and the fragility of the building stock often is derived empirically, using past disastrous earthquakes for calibration [Wyss, 2004].

  20. Disentangling sampling and ecological explanations underlying species-area relationships

    USGS Publications Warehouse

    Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Alpizar-Jara, R.; Flather, C.H.

    2002-01-01

    We used a probabilistic approach to address the influence of sampling artifacts on the form of species-area relationships (SARs). We developed a model in which the increase in observed species richness is a function of sampling effort exclusively. We assumed that effort depends on area sampled, and we generated species-area curves under that model. These curves can be realistic looking. We then generated SARs from avian data, comparing SARs based on counts with those based on richness estimates. We used an approach to estimation of species richness that accounts for species detection probability and, hence, for variation in sampling effort. The slopes of SARs based on counts are steeper than those of curves based on estimates of richness, indicating that the former partly reflect failure to account for species detection probability. SARs based on estimates reflect ecological processes exclusively, not sampling processes. This approach permits investigation of ecologically relevant hypotheses. The slope of SARs is not influenced by the slope of the relationship between habitat diversity and area. In situations in which not all of the species are detected during sampling sessions, approaches to estimation of species richness integrating species detection probability should be used to investigate the rate of increase in species richness with area.

  1. An Algorithm for Efficient Maximum Likelihood Estimation and Confidence Interval Determination in Nonlinear Estimation Problems

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick Charles

    1985-01-01

    An algorithm for maximum likelihood (ML) estimation is developed with an efficient method for approximating the sensitivities. The algorithm was developed for airplane parameter estimation problems but is well suited for most nonlinear, multivariable, dynamic systems. The ML algorithm relies on a new optimization method referred to as a modified Newton-Raphson with estimated sensitivities (MNRES). MNRES determines sensitivities by using slope information from local surface approximations of each output variable in parameter space. The fitted surface allows sensitivity information to be updated at each iteration with a significant reduction in computational effort. MNRES determines the sensitivities with less computational effort than using either a finite-difference method or integrating the analytically determined sensitivity equations. MNRES eliminates the need to derive sensitivity equations for each new model, thus eliminating algorithm reformulation with each new model and providing flexibility to use model equations in any format that is convenient. A random search technique for determining the confidence limits of ML parameter estimates is applied to nonlinear estimation problems for airplanes. The confidence intervals obtained by the search are compared with Cramer-Rao (CR) bounds at the same confidence level. It is observed that the degree of nonlinearity in the estimation problem is an important factor in the relationship between CR bounds and the error bounds determined by the search technique. The CR bounds were found to be close to the bounds determined by the search when the degree of nonlinearity was small. Beale's measure of nonlinearity is developed in this study for airplane identification problems; it is used to empirically correct confidence levels for the parameter confidence limits. The primary utility of the measure, however, was found to be in predicting the degree of agreement between Cramer-Rao bounds and search estimates.
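
    The core of such an estimator is a Newton-type iteration driven by output sensitivities. The sketch below uses plain finite-difference sensitivities and a toy scalar model; MNRES replaces the finite differences with slopes from local surface approximations, a refinement not reproduced here. All names and data are illustrative:

    ```python
    import numpy as np

    t = np.linspace(0.0, 5.0, 50)

    def model(theta):
        a, b = theta
        return a * (1.0 - np.exp(-b * t))      # toy step-response model

    rng = np.random.default_rng(2)
    y = model([2.0, 1.5]) + rng.normal(0.0, 0.05, t.size)

    theta = np.array([1.0, 1.0])
    for _ in range(20):
        r = y - model(theta)
        # output sensitivities dy/dtheta via forward differences
        J = np.column_stack([(model(theta + dp) - model(theta)) / 1e-6
                             for dp in np.eye(2) * 1e-6])
        theta = theta + np.linalg.solve(J.T @ J, J.T @ r)   # Gauss-Newton step

    sigma2 = (r @ r) / (t.size - theta.size)
    cov = sigma2 * np.linalg.inv(J.T @ J)   # Cramer-Rao covariance approximation
    print("estimates:", theta, "CR standard errors:", np.sqrt(np.diag(cov)))
    ```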

  2. A GIS-based framework for evaluating investments in fire management: Spatial allocation of recreation values

    Treesearch

    Kenneth A. Baerenklau; Armando González-Cabán; Catrina I. Páez; Edgard Chávez

    2009-01-01

    The U.S. Forest Service is responsible for developing tools to facilitate effective and efficient fire management on wildlands and urban-wildland interfaces. Existing GIS-based fire modeling software only permits estimation of the costs of fire prevention and mitigation efforts as well as the effects of those efforts on fire behavior. This research demonstrates how the...

  3. Apparent annual survival estimates of tropical songbirds better reflect life history variation when based on intensive field methods

    USGS Publications Warehouse

    Martin, Thomas E.; Riordan, Margaret M.; Repin, Rimi; Mouton, James C.; Blake, William M.

    2017-01-01

    Aim: Adult survival is central to theories explaining latitudinal gradients in life history strategies. Life history theory predicts higher adult survival in tropical than north temperate regions given lower fecundity and parental effort. Early studies were consistent with this prediction, but standard-effort netting studies in recent decades suggested that apparent survival rates in temperate and tropical regions strongly overlap. Such results do not fit with life history theory. Targeted marking and resighting of breeding adults yielded higher survival estimates in the tropics, but this approach is thought to overestimate survival because it does not sample social and age classes with lower survival. We compared the effect of field methods on tropical survival estimates and their relationships with life history traits. Location: Sabah, Malaysian Borneo. Time period: 2008–2016. Major taxon: Passeriformes. Methods: We used standard-effort netting and resighted individuals of all social and age classes of 18 tropical songbird species over 8 years. We compared apparent survival estimates between these two field methods with differing analytical approaches. Results: Estimated detection and apparent survival probabilities from standard-effort netting were similar to those from other tropical studies that used standard-effort netting. Resighting data verified that a high proportion of individuals that were never recaptured in standard-effort netting remained in the study area, and many were observed breeding. Across all analytical approaches, addition of resighting yielded substantially higher survival estimates than did standard-effort netting alone. These apparent survival estimates were higher than for temperate zone species, consistent with latitudinal differences in life histories. Moreover, apparent survival estimates from addition of resighting, but not from standard-effort netting alone, were correlated with parental effort as measured by egg temperature across species. Main conclusions: Inclusion of resighting showed that standard-effort netting alone can negatively bias apparent survival estimates and obscure life history relationships across latitudes and among tropical species.

  4. Hydrogen from coal cost estimation guidebook

    NASA Technical Reports Server (NTRS)

    Billings, R. E.

    1981-01-01

    In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters typical of coal gasification applications was developed. Using these parameters, a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal under a variety of circumstances.

  5. Standard area diagrams for aiding severity estimation scientometrics, pathosystems and methodological trends in the last 25 years

    USDA-ARS?s Scientific Manuscript database

    Standard area diagrams (SADs) have long been used as a tool to aid the estimation of plant disease severity, an essential variable in phytopathometry. Formal validation of SADs was not considered prior to the early 1990s, when considerable effort began to be invested developing SADs and assessing th...

  6. Advancing Models and Data for Characterizing Exposures to Chemicals in Consumer Products

    EPA Science Inventory

    EPA’s Office of Research and Development (ORD) is leading several efforts to develop data and methods for estimating population chemical exposures related to the use of consumer products. New curated chemical, ingredient, and product use information are being collected fro...

  7. A GLOBAL INVENTORY OF VOLATILE ORGANIC COMPOUND EMISSIONS FROM ANTHROPOGENIC SOURCES

    EPA Science Inventory

    As part of an effort to assess the potential impacts associated with global climate change, the U.S. Environmental Protection Agency's Office of Research and Development is supporting global atmospheric chemistry research by developing global scale estimates of volatile organic c...

  8. Recent advances in estimating protein and energy requirements of ruminants

    USDA-ARS?s Scientific Manuscript database

    Considerable efforts have been made in gathering scientific data and developing feeding systems for ruminant animals in the last 50 years. Future endeavours should target the assessment, interpretation, and integration of the accumulated knowledge to develop nutrition models in a holistic and pragma...

  9. Congestion Mitigation and Air Quality (CMAQ) Improvement Program: Cost-Effectiveness Tables Development and Methodology

    DOT National Transportation Integrated Search

    2015-05-01

    This document presents summary and detailed findings from a research effort to develop estimates of the cost-effectiveness of a range of project types funded under the Congestion Mitigation and Air Quality (CMAQ) Improvement Program. In this study, c...

  10. Prioritizing earthquake and tsunami alerting efforts

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Allen, S.; Aranha, M. A.; Chung, A. I.; Hellweg, M.; Henson, I. H.; Melgar, D.; Neuhauser, D. S.; Nof, R. N.; Strauss, J. A.

    2015-12-01

    The timeline of hazards associated with earthquakes ranges from seconds for the strong shaking at the epicenter, to minutes for strong shaking at more distant locations in big quakes, to tens of minutes for a local tsunami. Earthquake and tsunami warning systems must therefore include very fast initial alerts, while also taking advantage of available time in bigger and tsunami-generating quakes. At the UC Berkeley Seismological Laboratory we are developing a suite of algorithms to provide the fullest possible information about earthquake shaking and tsunami inundation from seconds to minutes after a quake. The E-larmS algorithm uses the P-wave to rapidly detect an earthquake and issue a warning, currently reaching test users in as little as 3 sec after the origin time. Development of a new waveform detector may lead to even faster alerts. G-larmS uses permanent deformation estimates from GNSS stations to estimate the geometry and extent of the rupture underway, providing more accurate ground shaking estimates in big (M>~7) earthquakes. It performed well in the M6.0 2014 Napa earthquake. T-larmS is a new algorithm designed to extend alert capabilities to tsunami inundation. Rapid estimates of source characteristics for subduction zone events can be used to warn not only of the shaking hazard but also of the local tsunami inundation hazard. These algorithms are being developed, implemented, and tested with a focus on the western US, but are also now being tested in other parts of the world, including Israel, Turkey, Korea, and Chile. Beta users in the Bay Area are receiving the alerts and beginning to implement automated actions. They also provide feedback on user needs, which has led to the development of the MyEEW smartphone app, which allows beta users to receive the alerts on their cell phones. All these efforts feed into our ongoing assessment of directions and priorities for future development and implementation efforts.

  11. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G.; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  12. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which consequently necessitates accurate knowledge of the thermal properties, boundary conditions, and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.

  13. Space shuttle propulsion estimation development verification, volume 1

    NASA Technical Reports Server (NTRS)

    Rogers, Robert M.

    1989-01-01

    The results of the Propulsion Estimation Development Verification effort are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as in PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the dynamics and measurement models used for this application, the program description, and the program operation are presented.

  14. Acoustic fatigue life prediction for nonlinear structures with multiple resonant modes

    NASA Technical Reports Server (NTRS)

    Miles, R. N.

    1992-01-01

    This report documents an effort to develop practical and accurate methods for estimating the fatigue lives of complex aerospace structures subjected to intense random excitations. The emphasis of the current program is to construct analytical schemes for performing fatigue life estimates for structures that exhibit nonlinear vibration behavior and that have numerous resonant modes contributing to the response.

  15. Multivariate Density Estimation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1983-01-01

    Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is called a thunderstorm data analysis.
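
    For a flavor of the approach, a multivariate Gaussian kernel density estimate takes only a few lines with SciPy, whose default bandwidth selector is in fact Scott's rule; the data here are synthetic:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)
    # 500 synthetic two-dimensional observations, shape (2, 500)
    data = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 2.0]], 500).T

    kde = gaussian_kde(data)          # bandwidth chosen by Scott's rule
    print(kde([[0.0], [0.0]]))        # density estimate at the origin
    ```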

  16. Air/Superfund national technical guidance study series, Volume 2. Estimation of baseline air emission at Superfund sites. Interim report(Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    This volume is one in a series of manuals prepared for EPA to assist its Remedial Project Managers in assessing the air contaminant pathway and developing input data for risk assessment. The manual provides guidance on developing baseline-emission estimates (BEEs) from hazardous waste sites. BEEs are defined as emission rates estimated for a site in its undisturbed state. Specifically, the manual is intended to: present a protocol for selecting the appropriate level of effort to characterize baseline air emissions; assist site managers in designing an approach for BEEs; describe useful technologies for developing site-specific BEEs; and help site managers select the appropriate technologies for generating site-specific BEEs.

  17. ESP: Economics of Shipyard Painting, Bid Estimating Transfer Study

    DTIC Science & Technology

    1993-11-10

    During Phase I of the "Economics of Shipyard Painting" project, it became evident that detail... an SP-3 panel directive to establish a 2nd phase of the "Economics of Shipyard Painting" focused on applying the detailed data collected in Phase I to... bid-stage estimating. During Phase II, a program was developed that worked in tandem with the detailed data collection effort laid out in Phase I

  18. The EGM2008 Global Gravitational Model

    NASA Astrophysics Data System (ADS)

    Pavlis, N. K.; Holmes, S. A.; Kenyon, S. C.; Factor, J. K.

    2008-12-01

    The development of a new Earth Gravitational Model (EGM) to degree 2160 has been completed. This model, designated EGM2008, is the product of the final re-iteration of our modelling and estimation approach. Our multi-year effort has produced several Preliminary Gravitational Models (PGM) of increasingly improved performance. One of these models (PGM2007A) was provided for evaluation to an independent Evaluation Working Group, sponsored by the International Association of Geodesy (IAG). In an effort to address certain shortcomings of PGM2007A, we have considered the feedback that we received from this Working Group. As part of this effort, EGM2008 incorporates an improved version of our 5'x5' global gravity anomaly database and has benefited from the latest GRACE-based satellite-only solutions (e.g., ITG-GRACE03S). EGM2008 incorporates an improved ocean-wide set of altimetry-derived gravity anomalies that were estimated using PGM2007B (a variant of PGM2007A) and its associated Dynamic Ocean Topography (DOT) model as reference models in a "Remove-Compute-Restore" fashion. For the Least Squares Collocation estimation of our final global 5'x5' area-mean gravity anomaly database, we have consistently used PGM2007B as our reference model to degree 2160. We have developed and used a formulation that predicts area-mean gravity anomalies that are effectively band-limited to degree 2160, thereby minimizing aliasing effects during the harmonic analysis process. We have also placed special emphasis on the refinement and "calibration" of the error estimates that accompany our final combination solution, EGM2008. We present the main aspects of the model's development and evaluation. This evaluation was accomplished primarily through the comparison of various model-derived quantities with independent data and models (e.g., geoid undulations derived from GPS positioning and spirit levelling, astronomical deflections of the vertical, etc.). We will also present comparisons of our model-implied Dynamic Ocean Topography with other contemporary estimates (e.g., from ECCO).
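
    The tie between the maximum degree and the 5'x5' grid is simple arithmetic: a degree-2160 spherical harmonic expansion resolves half-wavelengths of 180 degrees / 2160 = 5 arc-minutes and carries roughly (2161)^2, about 4.7 million, coefficients:

    ```python
    n_max = 2160
    print(f"half-wavelength resolution: {180.0 / n_max * 60:.0f} arc-minutes")  # 5
    print(f"number of coefficients: ~{(n_max + 1) ** 2:,}")                     # ~4,669,921
    ```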

  19. LCoE Analysis of Surge-Mode WEC

    DOE Data Explorer

    Bill Staby

    2017-03-07

    Spreadsheet providing estimates of reductions in Levelized Cost of Energy (LCoE) for a surge-mode wave energy converter (WEC). The reductions are achieved through adoption of the advanced control strategies developed during this research effort.
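
    For reference, the levelized cost of energy that such a spreadsheet computes is the ratio of discounted lifetime cost to discounted lifetime energy; a minimal sketch with invented inputs:

    ```python
    def lcoe(capex, opex_per_yr, energy_per_yr_mwh, rate, years):
        """Levelized cost of energy in $/MWh."""
        disc = [(1 + rate) ** -t for t in range(1, years + 1)]
        costs = capex + opex_per_yr * sum(disc)      # discounted lifetime cost
        energy = energy_per_yr_mwh * sum(disc)       # discounted lifetime energy
        return costs / energy

    # Hypothetical WEC project: $4M capex, $150k/yr O&M, 3,500 MWh/yr, 7%, 20 yr
    print(f"${lcoe(4e6, 1.5e5, 3500, 0.07, 20):,.0f}/MWh")
    ```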

  20. METHODS DEVELOPMENT AT THE NEAR LAB ECOLOGICAL RESEARCH AREA (NLERA) LOCATED IN THE NEUSE RIVER BASIN.

    EPA Science Inventory

    This task supports the Agency's efforts to develop proper risk assessment tools to address ecological and, eventually, human exposures. The Agency needs to be able to identify, measure, and estimate ecosystem exposure to multiple stressors. The research under this task suppor...

  1. The potential supply of organ donors. An assessment of the efficacy of organ procurement efforts in the United States.

    PubMed

    Evans, R W; Orians, C E; Ascher, N L

    1992-01-08

    To estimate the potential supply of organ donors and to measure the efficiency of organ procurement efforts in the United States. A geographic database has been developed consisting of multiple cause of death and sociodemographic data compiled by the National Center for Health Statistics. All deaths are evaluated as to their potential for organ donation. Two classes of potential donors are identified: class 1 estimates are restricted to causes of death involving significant head trauma only, and class 2 estimates include class 1 estimates as well as deaths in which brain death was less probable. Over 23,000 people are currently awaiting a kidney, heart, liver, heart-lung, pancreas, or lung transplantation. Donor supply is inadequate, and the number of donors remained unchanged at approximately 4000 annually for 1986 through 1989, with a modest 9.1% increase in 1990. Between 6900 and 10,700 potential donors are available annually (eg, 28.5 to 43.7 per million population). Depending on the class of donor considered, organ procurement efforts are between 37% and 59% efficient. Efficiency greatly varies by state and organ procurement organization. Many more organ donors are available than are being accessed through existing organ procurement efforts. Realistically, it may be possible to increase by 80% the number of donors available in the United States (up to 7300 annually). It is conceivable, although unlikely, that the supply of donor organs could achieve a level to meet demand.

  2. Sampling effort and estimates of species richness based on prepositioned area electrofisher samples

    USGS Publications Warehouse

    Bowen, Z.H.; Freeman, Mary C.

    1998-01-01

    Estimates of species richness based on electrofishing data are commonly used to describe the structure of fish communities. One electrofishing method for sampling riverine fishes that has become popular in the last decade is the prepositioned area electrofisher (PAE). We investigated the relationship between sampling effort and fish species richness at seven sites in the Tallapoosa River system, USA, based on 1,400 PAE samples collected during 1994 and 1995. First, we estimated species richness at each site using the first-order jackknife and compared observed values for species richness and jackknife estimates of species richness to estimates based on historical collection data. Second, we used a permutation procedure and nonlinear regression to examine rates of species accumulation. Third, we used regression to predict the number of PAE samples required to collect the jackknife estimate of species richness at each site during 1994 and 1995. We found that jackknife estimates of species richness generally were less than or equal to estimates based on historical collection data. The relationship between PAE electrofishing effort and species richness in the Tallapoosa River was described by a positive asymptotic curve, as found in other studies using different electrofishing gears in wadable streams. Results from nonlinear regression analyses indicated that rates of species accumulation were variable among sites and between years. Across sites and years, predictions of sampling effort required to collect jackknife estimates of species richness suggested that doubling sampling effort (to 200 PAEs) would typically increase observed species richness by not more than six species. However, sampling effort beyond about 60 PAE samples typically increased observed species richness by < 10%. We recommend using historical collection data in conjunction with a preliminary sample size of at least 70 PAE samples to evaluate estimates of species richness in medium-sized rivers. Seventy PAE samples should provide enough information to describe the relationship between sampling effort and species richness and thus facilitate evaluation of sampling effort.
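
    The first-order jackknife used here has a closed form, S_jack = S_obs + f1(n - 1)/n, where f1 is the number of species detected in exactly one of n samples. A sketch on a hypothetical incidence matrix:

    ```python
    import numpy as np

    # rows = PAE samples, columns = species; 1 = detected (toy incidence matrix)
    rng = np.random.default_rng(3)
    incidence = (rng.random((70, 25)) < rng.uniform(0.02, 0.4, 25)).astype(int)

    n = incidence.shape[0]
    detections = incidence.sum(axis=0)
    S_obs = int((detections > 0).sum())
    f1 = int((detections == 1).sum())        # species seen in exactly one sample
    S_jack = S_obs + f1 * (n - 1) / n
    print(f"S_obs = {S_obs}, first-order jackknife = {S_jack:.1f}")
    ```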

  3. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing noiselike microwave radiometer signals.

  4. Validation of PV-RPM Code in the System Advisor Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.

  5. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-with-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting the development process along with the estimation method itself.

  6. Limited sampling hampers “big data” estimation of species richness in a tropical biodiversity hotspot

    PubMed Central

    Engemann, Kristine; Enquist, Brian J; Sandel, Brody; Boyle, Brad; Jørgensen, Peter M; Morueta-Holme, Naia; Peet, Robert K; Violle, Cyrille; Svenning, Jens-Christian

    2015-01-01

    Macro-scale species richness studies often use museum specimens as their main source of information. However, such datasets are often strongly biased due to variation in sampling effort in space and time. These biases may strongly affect diversity estimates and may, thereby, obstruct solid inference on the underlying diversity drivers, as well as mislead conservation prioritization. In recent years, this has resulted in an increased focus on developing methods to correct for sampling bias. In this study, we use sample-size-correcting methods to examine patterns of tropical plant diversity in Ecuador, one of the most species-rich and climatically heterogeneous biodiversity hotspots. Species richness estimates were calculated based on 205,735 georeferenced specimens of 15,788 species using the Margalef diversity index, the Chao estimator, the second-order Jackknife and Bootstrapping resampling methods, and Hill numbers and rarefaction. Species richness was heavily correlated with sampling effort, and only rarefaction was able to remove this effect, and we recommend this method for estimation of species richness with “big data” collections. PMID:25692000

  7. Limited sampling hampers "big data" estimation of species richness in a tropical biodiversity hotspot.

    PubMed

    Engemann, Kristine; Enquist, Brian J; Sandel, Brody; Boyle, Brad; Jørgensen, Peter M; Morueta-Holme, Naia; Peet, Robert K; Violle, Cyrille; Svenning, Jens-Christian

    2015-02-01

    Macro-scale species richness studies often use museum specimens as their main source of information. However, such datasets are often strongly biased due to variation in sampling effort in space and time. These biases may strongly affect diversity estimates and may, thereby, obstruct solid inference on the underlying diversity drivers, as well as mislead conservation prioritization. In recent years, this has resulted in an increased focus on developing methods to correct for sampling bias. In this study, we use sample-size-correcting methods to examine patterns of tropical plant diversity in Ecuador, one of the most species-rich and climatically heterogeneous biodiversity hotspots. Species richness estimates were calculated based on 205,735 georeferenced specimens of 15,788 species using the Margalef diversity index, the Chao estimator, the second-order Jackknife and Bootstrapping resampling methods, and Hill numbers and rarefaction. Species richness was heavily correlated with sampling effort, and only rarefaction was able to remove this effect, and we recommend this method for estimation of species richness with "big data" collections.
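
    Among the estimators named above, the Chao estimator is the easiest to illustrate: in bias-corrected abundance form, Chao1 = S_obs + f1(f1 - 1)/(2(f2 + 1)), with f1 and f2 the singleton and doubleton counts. A sketch on invented specimen counts per species:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    counts = rng.geometric(0.02, 500)    # specimens per species, toy data

    S_obs = counts.size
    f1 = int((counts == 1).sum())        # singletons
    f2 = int((counts == 2).sum())        # doubletons
    chao1 = S_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))   # bias-corrected Chao1
    print(f"Chao1 estimate: {chao1:.0f} species")
    ```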

  8. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.

    2013-07-01

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimate of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach, there is an unquantifiable uncertainty in the ultimate glass volume projections due to model prediction uncertainties that has to be considered along with other system uncertainties such as waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.

  9. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that sound decisions can be made. Because of the long lead times required to develop space hardware, cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in the conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance, and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.
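
    The weight-cost relationship described is commonly captured as a power-law cost estimating relationship fitted in log space; the sketch below uses invented data points:

    ```python
    import numpy as np

    weight = np.array([450, 900, 1500, 2200, 4000])   # kg, hypothetical
    cost = np.array([38, 61, 95, 120, 205])           # $M, hypothetical

    # Fit cost = a * weight^b by ordinary least squares on log-transformed data
    b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
    a = np.exp(log_a)
    print(f"CER: cost = {a:.2f} * weight^{b:.2f}")
    print(f"predicted cost for a 3000 kg system: {a * 3000**b:.0f} $M")
    ```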

  10. Estimating the Success of OD Applications.

    ERIC Educational Resources Information Center

    Golembiewski, Robert T.; And Others

    1982-01-01

    Organizational development (OD) and its future are discussed. Examines database implications about OD's applications. Reports an effort to transcend the limitations of the literature, based on a very intensive search for OD applications in both business and government contexts. (CT)

  11. Inferring invasive species abundance using removal data from management actions

    USGS Publications Warehouse

    Davis, Amy J.; Hooten, Mevin B.; Miller, Ryan S.; Farnsworth, Matthew L.; Lewis, Jesse S.; Moxcey, Michael; Pepin, Kim M.

    2016-01-01

    Evaluation of the progress of management programs for invasive species is crucial for demonstrating impacts to stakeholders and for strategic planning of resource allocation. Estimates of abundance before and after management activities can serve as a useful metric of population management programs. However, many methods of estimating population size are too labor-intensive and costly to implement, posing restrictive levels of burden on operational programs. Removal models are a reliable method for estimating abundance before and after management using data from the removal activities exclusively, thus requiring no work in addition to management. We developed a Bayesian hierarchical model to estimate abundance from removal data accounting for varying levels of effort, and used simulations to assess the conditions under which reliable population estimates are obtained. We applied this model to estimate site-specific abundance of an invasive species, feral swine (Sus scrofa), using removal data from aerial gunning in 59 site/time-frame combinations (480–19,600 acres) throughout Oklahoma and Texas, USA. Simulations showed that abundance estimates were generally accurate when effective removal rates (removal rate accounting for total effort) were above 0.40. However, when abundances were small (<50), the effective removal rate needed to accurately estimate abundance was considerably higher (0.70). Based on our post-validation method, 78% of our site/time frame estimates were accurate. To use this modeling framework, it is important to have multiple removals (more than three) within a time frame during which demographic changes are minimized (i.e., a closed population; ≤3 months for feral swine). Our results show that the probability of accurately estimating abundance from this model improves with increased sampling effort (8+ flight hours across the 3-month window is best) and increased removal rate. Based on the inverse relationship between inaccurate abundances and inaccurate removal rates, we suggest auxiliary information that could be collected and included in the model as covariates (e.g., habitat effects, differences between pilots) to improve accuracy of removal rates and hence abundance estimates.
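
    The underlying removal likelihood can be illustrated without the hierarchical and effort-covariate structure of the paper's model: with a constant per-pass removal probability p, the pass counts follow a multinomial whose likelihood can be maximized by brute force over abundance N. The counts below are invented:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.special import gammaln

    removals = np.array([120, 71, 44, 25])   # animals removed per pass (toy)
    R = removals.sum()
    passes = np.arange(removals.size)

    def nll(p, N):
        q = p * (1 - p) ** passes            # P(an animal is removed on pass j)
        return -(gammaln(N + 1) - gammaln(N - R + 1)
                 + (removals * np.log(q)).sum()
                 + (N - R) * np.log(1 - q.sum()))

    # Profile the likelihood over integer N, optimizing p at each N
    candidates = []
    for N in range(R, 3 * R):
        res = minimize_scalar(nll, bounds=(1e-4, 0.999), method="bounded", args=(N,))
        candidates.append((res.fun, N, res.x))
    _, N_hat, p_hat = min(candidates)
    print(f"abundance estimate N = {N_hat}, removal probability p = {p_hat:.2f}")
    ```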

  12. Renewing U.S. Mathematics; Critical Resource for the Future.

    DTIC Science & Technology

    1986-04-30

    developed tell us how to get a good estimate of the appropriate levels. Table 10 contains the numbers, which total $180 million per year. Since FY... direct costs speaks for itself. Scientists in other fields would look at these numbers and ask how on earth research was getting done. Who was paying... mathematics funding are making an effort to redress the problem. However, recent budget constraints severely threaten their efforts. The issue is how to reach

  13. ESPC Coupled Global Prediction System - Develop and Test Coupled Physical Parameterizations: NAVGEM/CICE/HYCOM

    DTIC Science & Technology

    2013-09-30

    the Study of Environmental Arctic Change (SEARCH) Sea Ice Outlook (SIO) effort. The SIO is an international effort to provide a community-wide... summary of the expected September Arctic sea ice minimum. Monthly reports released throughout the summer synthesize community estimates of the current... state and expected minimum of sea ice. Along with the backbone components of this system (NAVGEM/HYCOM/CICE), other data models have been used to

  14. A Modeling Approach to Global Land Surface Monitoring with Low Resolution Satellite Imaging

    NASA Technical Reports Server (NTRS)

    Hlavka, Christine A.; Dungan, Jennifer; Livingston, Gerry P.; Gore, Warren J. (Technical Monitor)

    1998-01-01

    The effects of changing land use/land cover on global climate and ecosystems due to greenhouse gas emissions and changing energy and nutrient exchange rates are being addressed by federal programs such as NASA's Mission to Planet Earth (MTPE) and by international efforts such as the International Geosphere-Biosphere Program (IGBP). The quantification of these effects depends on accurate estimates of the global extent of critical land cover types such as fire scars in tropical savannas and ponds in Arctic tundra. To address the requirement for accurate areal estimates, methods for producing regional to global maps with satellite imagery are being developed. The only practical way to produce maps over large regions of the globe is with data of coarse spatial resolution, such as Advanced Very High Resolution Radiometer (AVHRR) weather satellite imagery at 1.1 km resolution or European Remote-Sensing Satellite (ERS) radar imagery at 100 m resolution. The accuracy of pixel counts as areal estimates is in doubt, especially for highly fragmented cover types such as fire scars and ponds. Efforts to improve areal estimates from coarse resolution maps have involved regression of apparent area from coarse data versus that from fine resolution in sample areas, but it has proven difficult to acquire sufficient fine scale data to develop the regression. A method for computing accurate estimates from coarse resolution maps using little or no fine data is therefore needed.
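
    The regression-correction idea mentioned above reads directly as code: calibrate apparent area from the coarse map against fine-resolution reference areas in sample regions, then apply the fitted line to coarse-only estimates. Values below are invented:

    ```python
    import numpy as np

    coarse_km2 = np.array([12.0, 30.0, 55.0, 80.0, 140.0])  # apparent area, toy
    fine_km2 = np.array([20.0, 44.0, 76.0, 104.0, 176.0])   # reference area, toy

    b, a = np.polyfit(coarse_km2, fine_km2, 1)   # fine = a + b * coarse
    print(f"corrected area for a 65 km^2 coarse estimate: {a + b * 65:.0f} km^2")
    ```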

  15. ALTERNATIVE APPROACH TO ESTIMATING CANCER ...

    EPA Pesticide Factsheets

    The alternative approach for estimating cancer potency from inhalation exposure to asbestos seeks to improve the methods developed by USEPA (1986). This effort seeks to modify the current approach for estimating cancer potency for lung cancer and mesothelioma to account for the current scientific consensus that cancer risk from asbestos depends both on mineral type and on particle size distribution. In brief, epidemiological exposure-response data for lung cancer and mesothelioma in asbestos workers are combined with estimates of the mineral type(s) and particle size distribution at each exposure location in order to estimate potency factors that are specific to a selected set of mineral type and size

  16. Creating an effort tracking tool to improve therapeutic cancer clinical trials workload management and budgeting.

    PubMed

    James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy

    2011-11-01

    Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, the University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.

  17. Transient Inverse Calibration of Site-Wide Groundwater Model to Hanford Operational Impacts from 1943 to 1996--Alternative Conceptual Model Considering Interaction with Uppermost Basalt Confined Aquifer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vermeul, Vincent R.; Cole, Charles R.; Bergeron, Marcel P.

    2001-08-29

    The baseline three-dimensional transient inverse model for the estimation of site-wide scale flow parameters, including their uncertainties, using data on the transient behavior of the unconfined aquifer system over the entire historical period of Hanford operations, has been modified to account for the effects of basalt intercommunication between the Hanford unconfined aquifer and the underlying upper basalt confined aquifer. Both the baseline and alternative conceptual models (ACM-1) considered only the groundwater flow component and corresponding observational data in the 3-D transient inverse calibration efforts. Subsequent efforts will examine both groundwater flow and transport. Comparisons of goodness-of-fit measures and parameter estimation results for the ACM-1 transient inverse calibrated model with those from previous site-wide groundwater modeling efforts illustrate that the new 3-D transient inverse model approach will strengthen the technical defensibility of the final model(s) and provide the ability to incorporate uncertainty in predictions related to both conceptual model and parameter uncertainty. These results, however, indicate that additional improvements to the conceptual model framework are required. An investigation was initiated at the end of this basalt inverse modeling effort to determine whether facies-based zonation would improve specific yield parameter estimation results (ACM-2). A description of the justification and methodology used to develop this zonation is discussed.

  18. An investigation of TNAV equipped aircraft in a simulated en route metering environment

    NASA Technical Reports Server (NTRS)

    Groce, J. L.; Izumi, K. H.; Markham, C. H.; Schwab, R. W.; Taylor, J. A.

    1986-01-01

    This document presents the results of an effort to estimate how often a TNAV (Time Navigation) equipped aircraft could be given a TNAV clearance in the En Route Metering (ERM) system as a function of the percentage of arriving traffic which is TNAV equipped. A fast-time simulation of Denver Stapleton international arrival traffic in the Denver Air Route Traffic Control Center route structure, including en route metering operations, was used to develop data on estimated conflicts, clearance communications and fuel usage for traffic mixes of 25, 50, 75 and 100% TNAV equipped. This study supports an overall effort by NASA to assess the benefits and required technology for using TNAV-equipped aircraft in the ERM environment.

  19. Prevalence of Gestational Diabetes and Risk of Progression to Type 2 Diabetes: a Global Perspective.

    PubMed

    Zhu, Yeyi; Zhang, Cuilin

    2016-01-01

    Despite the increasing epidemic of diabetes mellitus affecting populations at different life stages, the global burden of gestational diabetes mellitus (GDM) is not well assessed. Systematically synthesized data on global prevalence estimates of GDM are lacking, particularly among developing countries. The hyperglycemic intrauterine environment as exemplified in pregnancies complicated by GDM might not only reflect but also fuel the epidemic of type 2 diabetes mellitus (T2DM). We comprehensively reviewed available data in the past decade in an attempt to estimate the contemporary global prevalence of GDM by country and region. We reviewed the risk of progression from GDM to T2DM as well. Synthesized data demonstrate wide variations in both prevalence estimates of GDM and the risk of progression from GDM to T2DM. Direct comparisons of GDM burden across countries or regions are challenging given the great heterogeneity in screening approaches, diagnostic criteria, and underlying population characteristics. In this regard, collaborative efforts to estimate global GDM prevalence would be a large but important leap forward. Such efforts may have substantial public health implications in terms of informing health policy makers and healthcare providers for disease burden and for developing more targeted and effective diabetes prevention and management strategies globally.

  20. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment, developed using this methodology, are presented.
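
    A minimal version of such a procedure calibrates a power-law effort model against historical project data and applies it to a new project; the data below are invented, not SEL records:

    ```python
    import numpy as np

    kloc = np.array([12, 25, 40, 66, 90, 130])        # delivered KSLOC, toy data
    effort = np.array([31, 72, 125, 220, 310, 480])   # staff-months, toy data

    # Calibrate effort = a * KLOC^b by least squares in log space
    b, log_a = np.polyfit(np.log(kloc), np.log(effort), 1)
    a = np.exp(log_a)
    print(f"effort = {a:.2f} * KLOC^{b:.2f} staff-months")

    # Apply the calibrated model to a new 50-KSLOC project
    print(f"estimate for 50 KSLOC: {a * 50**b:.0f} staff-months")
    ```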

  1. Combining inferences from models of capture efficiency, detectability, and suitable habitat to classify landscapes for conservation of threatened bull trout

    USGS Publications Warehouse

    Peterson, J.; Dunham, J.B.

    2003-01-01

    Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult-to-sample species, and models of species presence may produce biased predictions. We present a Bayesian approach that combines sampling and model-based inferences for estimating species presence. The accuracy and cost-effectiveness of this approach were compared to those of sampling surveys and predictive models for estimating the presence of the threatened bull trout (Salvelinus confluentus) via simulation with existing models and empirical sampling data. Simulations indicated that a sampling-only approach would be the most effective and would result in the lowest presence and absence misclassification error rates for three thresholds of detection probability. When sampling effort was considered, however, the combined approach resulted in the lowest error rates per unit of sampling effort. Hence, lower probability-of-detection thresholds can be specified with the combined approach, resulting in lower misclassification error rates and improved cost-effectiveness.
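
    The heart of combining model-based and sampling-based inference is a Bayes update: with a model-predicted prior probability of presence psi and per-pass detection probability p, the posterior after n survey passes with no detection is psi(1-p)^n / (psi(1-p)^n + 1 - psi). A sketch with illustrative numbers:

    ```python
    def posterior_presence(prior, p_detect, n_passes):
        """P(species present | not detected in n survey passes)."""
        miss = (1.0 - p_detect) ** n_passes
        return prior * miss / (prior * miss + (1.0 - prior))

    # Model predicts 60% chance of presence; three passes at 40% detection each
    print(posterior_presence(prior=0.6, p_detect=0.4, n_passes=3))  # ~0.245
    ```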

  2. Population estimates and monitoring guidelines for endangered Laysan Teal, Anas Laysanensis, at Midway Atoll: Pilot study results 2008-2010.

    USGS Publications Warehouse

    Reynolds, Michelle H.; Brinck, Kevin W.; Laniawe, Leona

    2011-01-01

    To improve the Laysan Teal population estimates, we recommend changes to the monitoring protocol. Additional years of data are needed to quantify inter-annual seasonal detection probabilities, which may allow the use of standardized direct counts as an unbiased index of population size. Survey protocols should be enhanced through frequent resights, regular survey intervals, and determining reliable standards to detect catastrophic declines and annual changes in adult abundance. In late 2009 to early 2010, 68% of the population was marked with unique color band combinations. This allowed for potentially accurate adult population estimates and survival estimates without the need to mark new birds in 2010, 2011, and possibly 2012. However, efforts should be made to replace worn or illegible bands so birds can be identified in future surveys. It would be valuable to develop more sophisticated population size and survival models using Program MARK, a state-of-the-art software package which uses likelihood models to analyze mark-recapture data. This would allow for more reliable adult population and survival estimates to compare with the “source” Laysan Teal population on Laysan Island. These models will require additional years of resight data (> 1 year) and, in some cases, an intensive annual effort of marking and recapture. Because data indicate standardized all-wetland counts are a poor index of abundance, monitoring efforts could be improved by expanding resight surveys to include all wetlands, discontinuing the all-wetland counts, and reallocating some of the wetland count effort to collect additional opportunistic resights. Approximately two years of additional bimonthly surveys are needed to validate the direct count as an appropriate index of population abundance. Additional years of individual resight data will allow estimates of adult population size, as specified in recovery criteria, and allow tracking of species population dynamics at Midway Atoll.
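
    Program MARK fits full likelihood models, but the core mark-resight logic can be illustrated with the much simpler Chapman bias-corrected Lincoln-Petersen estimator. The sketch and its counts are invented for illustration; this is not the monitoring protocol's analysis.

```python
# Chapman's bias-corrected Lincoln-Petersen estimator: one marking
# event, one resight survey. A toy stand-in for the likelihood-based
# mark-resight models (e.g., Program MARK) recommended above.

def chapman_estimate(n_marked: int, n_seen: int, n_seen_marked: int) -> float:
    """Population size estimate from a single resight survey."""
    return (n_marked + 1) * (n_seen + 1) / (n_seen_marked + 1) - 1

if __name__ == "__main__":
    # hypothetical counts: 340 banded birds, 250 seen, 170 of those banded
    print(round(chapman_estimate(340, 250, 170), 1))
```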

  3. Fuel-cycle emissions for conventional and alternative fuel vehicles : an assessment of air toxics

    DOT National Transportation Integrated Search

    2000-08-01

    This report provides information on recent efforts to use the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) fuel-cycle model to estimate air toxics emissions. GREET, developed at Argonne National Laboratory, currentl...

  4. Advances in marker-assisted breeding of sugarcane

    USDA-ARS?s Scientific Manuscript database

    Despite the challenges posed by sugarcane, geneticists and breeders have actively sought to use DNA marker technology to enhance breeding efforts. Markers have been used to explore taxonomy, estimate genetic diversity, and to develop unique molecular fingerprints. Numerous studies have been undertak...

  5. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
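
    WEST's actual calculations run over GIS layers and Hazus-MH building stock data, but the first-order arithmetic a tool like this automates can be pictured as below; the structure and every number in the sketch are hypothetical.

```python
# First-order demolition-debris volume for one building class.
# All parameters are illustrative placeholders, not WEST's data.

def debris_volume_m3(footprint_m2: float, storeys: float,
                     demolition_fraction: float,
                     debris_m3_per_m2_floor: float = 0.5) -> float:
    """Rough debris volume: floor area times demolished share times yield."""
    floor_area = footprint_m2 * storeys
    return floor_area * demolition_fraction * debris_m3_per_m2_floor

if __name__ == "__main__":
    # 500 hypothetical two-storey buildings, 30% requiring demolition
    total = 500 * debris_volume_m3(800.0, 2.0, 0.30)
    print(f"{total:,.0f} m^3 of debris")
```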

  6. Quality-Based Analysis Capability for National Youth Surveys: Development, Application, and Implications for Policy.

    ERIC Educational Resources Information Center

    Orvis, Bruce R.; Gahart, Martin T.

    As part of the military recruiting effort, the Department of Defense sponsors surveys of the national youth population to help design recruiting and advertising strategies. This report develops and applies a method of using the information contained in national youth surveys to estimate the probability that respondents taking the Armed Forces…

  7. Development and characterization of microsatellite markers in the Point Arena mountain beaver Aplodontia rufa nigra

    Treesearch

    Kristine L. Pilgrim; William J. Zielinski; Mary J. Mazurek; Frederick V. Schlexer; Michael K. Schwartz

    2006-01-01

    The Point Arena mountain beaver (Aplodontia rufa nigra) is an endangered subspecies. Efforts to recover this sub-species will be aided by advances in molecular genetics, specifically the ability to estimate population size using noninvasive genetic sampling. Here we report on the development of nine polymorphic loci for the Point Arena mountain...

  8. Investigating Alternatives to the Fish Early Life-Stage Test: A Strategy for Discovering and Annotating Adverse Outcome Pathways for Early Fish Development

    EPA Science Inventory

    The fish early life-stage (FELS) test (OECD Test Guideline 210) is the primary test used internationally to estimate chronic fish toxicity in support of ecological risk assessments and chemical management programs. As part of an on-going effort to develop efficient and cost-effec...

  9. The costs of evaluating species densities and composition of snakes to assess development impacts in amazonia.

    PubMed

    Fraga, Rafael de; Stow, Adam J; Magnusson, William E; Lima, Albertina P

    2014-01-01

    Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs through the implementation of a system (RAPELD) to quantify biological diversity using spatially standardized sampling units. Consistency in sampling design allows detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost-effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot, and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as US$120, but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of the lost information to target issues, and reduction may not be the preferred option if there is potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities.

  10. The Costs of Evaluating Species Densities and Composition of Snakes to Assess Development Impacts in Amazonia

    PubMed Central

    de Fraga, Rafael; Stow, Adam J.; Magnusson, William E.; Lima, Albertina P.

    2014-01-01

    Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs through the implementation of a system (RAPELD) to quantify biological diversity using spatially standardized sampling units. Consistency in sampling design allows detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost-effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot, and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as US$120, but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of the lost information to target issues, and reduction may not be the preferred option if there is potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities. PMID:25147930

  11. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

    The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates, and of their sensitivity, for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST) are presented. Interim price estimation guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life), and utilities, and to production parameters such as slicing rate, slices per centimeter, and process yield, using a computer program specifically developed to do sensitivity analysis with IPEG. The results aid in identifying the important cost parameters and assist in deciding the direction of technology development efforts.
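
    The sensitivity analysis can be sketched as a one-at-a-time perturbation loop: bump each input by 10% and rank its effect on the computed price. The toy price model and all baseline values below are hypothetical, not the study's IPEG figures.

```python
# One-at-a-time sensitivity analysis over the inputs of a toy add-on
# price model. Model form and baseline numbers are assumed placeholders.

def add_on_price(params: dict) -> float:
    """Toy price model: annualized costs divided by annual output."""
    annual_cost = (params["equipment"] + params["labor"]
                   + params["materials"] + params["utilities"])
    output = params["slicing_rate"] * params["yield"]
    return annual_cost / output

baseline = {"equipment": 120.0, "labor": 80.0, "materials": 60.0,
            "utilities": 20.0, "slicing_rate": 40.0, "yield": 0.9}

base_price = add_on_price(baseline)
for name in baseline:
    bumped = dict(baseline, **{name: baseline[name] * 1.10})
    delta = (add_on_price(bumped) - base_price) / base_price
    print(f"{name:>12}: {100 * delta:+.1f}% price change for a +10% input")
```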

  12. Sampling effort needed to estimate condition and species richness in the Ohio River, USA

    EPA Science Inventory

    The level of sampling effort required to characterize fish assemblage condition in a river for the purposes of bioassessment may be estimated via different approaches. However, the goal with any approach is to determine the minimum level of effort necessary to reach some specific...

  13. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. Additionally, PNNL supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model, built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.
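
    The lighting/HVAC interactive-effects idea can be pictured with a toy accounting rule: saved lighting energy no longer enters the space as heat, which trims cooling energy but adds a heating penalty. The factors below are assumed placeholders, not BEAMS algorithm values.

```python
# Toy accounting for the HVAC interactive effects of a lighting project.
# Interaction factors are assumed for illustration only.

def net_savings_kwh(lighting_kwh_saved: float,
                    cooling_credit: float = 0.15,
                    heating_penalty: float = 0.10) -> float:
    """Net site energy savings including HVAC interactive effects."""
    return lighting_kwh_saved * (1.0 + cooling_credit - heating_penalty)

if __name__ == "__main__":
    print(f"{net_savings_kwh(100_000):,.0f} kWh net annual savings")
```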

  14. AERIS : State-of-the-Practice Scan of Environmental Models

    DOT National Transportation Integrated Search

    2011-06-24

    This report has been developed under the Track 1 effort of Phase 1 of the AERIS program and presents the findings of the state-of-the-practice scan of environmental models to estimate environmental impacts (emissions, fuel consumption, etc.) due to c...

  15. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    PubMed Central

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys of Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000) were created by random subsampling from each annual survey. The same process of random subsampling was used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30, and 50 traps per day) and four levels of the number of sampling days (2, 4, 6, and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters, or 50 traps used on four consecutive sampling days, was required to obtain precise survival estimates for males and females separately. A reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery. PMID:26990561
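
    The subsampling idea generalizes beyond CJS models. The toy simulation below shows how the precision of a simple binomial survival estimate responds to the number of individuals retained; it is a schematic stand-in with invented values, not the paper's analysis.

```python
# Precision of a simple survival estimate versus subsample size:
# a binomial stand-in for the CJS subsampling experiment above.
import random

random.seed(1)
TRUE_SURVIVAL = 0.75
population = [random.random() < TRUE_SURVIVAL for _ in range(5000)]

for n in (200, 500, 1000):
    estimates = [sum(random.sample(population, n)) / n for _ in range(2000)]
    mean = sum(estimates) / len(estimates)
    sd = (sum((e - mean) ** 2 for e in estimates) / (len(estimates) - 1)) ** 0.5
    print(f"n={n:4d}: mean={mean:.3f}  sd={sd:.4f}")  # sd shrinks roughly as 1/sqrt(n)
```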

  16. Cross-bispectrum computation and variance estimation

    NASA Technical Reports Server (NTRS)

    Lii, K. S.; Helland, K. N.

    1981-01-01

    A method for the estimation of cross-bispectra of discrete real time series is developed. The asymptotic variance properties of the bispectrum are reviewed, and a method for the direct estimation of bispectral variance is given. The symmetry properties are described which minimize the computations necessary to obtain a complete estimate of the cross-bispectrum in the right-half-plane. A procedure is given for computing the cross-bispectrum by subdividing the domain into rectangular averaging regions which help reduce the variance of the estimates and allow easy application of the symmetry relationships to minimize the computational effort. As an example of the procedure, the cross-bispectrum of a numerically generated, exponentially distributed time series is computed and compared with theory.
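
    A bare-bones version of the direct, segment-averaged estimator: segment the series, FFT each segment, and average X(f1)Y(f2)Z*(f1+f2); averaging over segments is what reduces the estimator variance. This sketch ignores windowing, the symmetry reductions, and the right-half-plane bookkeeping the paper describes.

```python
# Direct segment-averaged cross-bispectrum estimate:
# B(f1, f2) ~ mean over segments of X(f1) * Y(f2) * conj(Z(f1 + f2)).
import numpy as np

def cross_bispectrum(x, y, z, nseg: int = 32) -> np.ndarray:
    seglen = len(x) // nseg
    acc = np.zeros((seglen, seglen), dtype=complex)
    f1 = np.arange(seglen)[:, None]
    f2 = np.arange(seglen)[None, :]
    for i in range(nseg):
        s = slice(i * seglen, (i + 1) * seglen)
        X, Y, Z = np.fft.fft(x[s]), np.fft.fft(y[s]), np.fft.fft(z[s])
        acc += X[:, None] * Y[None, :] * np.conj(Z[(f1 + f2) % seglen])
    return acc / nseg

rng = np.random.default_rng(0)
v = rng.exponential(size=8192) - 1.0   # skewed, zero-mean test series
print(np.abs(cross_bispectrum(v, v, v)[1, 2]))
```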

  17. Inferences about population dynamics from count data using multi-state models: A comparison to capture-recapture approaches

    USGS Publications Warehouse

    Grant, Evan H. Campbell; Zipkin, Elise; Sillett, T. Scott; Chandler, Richard; Royle, J. Andrew

    2014-01-01

    Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open-population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages. The most accurate and precise estimates will necessarily require some amount of intensive data collection effort (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales.

  18. EPA Handbook on the Benefits, Costs, and Impacts of Land Cleanup and Reuse (2011)

    EPA Pesticide Factsheets

    This Handbook describes EPA's land cleanup and reuse programs and outlines some of the unique aspects that have complicated efforts to develop suitable methods for estimating benefits. It clarifies the differences between types of economic analyses—specif

  19. Oxidative Responses to Extracted Cookstove Emissions in Lung Epithelial Cells

    EPA Science Inventory

    Exposure to cookstove emissions (CE) has been linked to significant increases in morbidity and mortality, with current estimates attributing CE exposure to over 4 million deaths annually. The development of several new cookstove (CS) designs has led efforts to reduce CE with rela...

  20. Development of an areawide estimate of truck freight value in the urban mobility report.

    DOT National Transportation Integrated Search

    2012-08-01

    Significant efforts have resulted in improved knowledge about the effects of congestion on the motoring public. The Urban Mobility Report (UMR) has been produced for over 20 years detailing the effects of congestion in the United States (1). Despit...

  1. The characterization of socio-political instability, development and sustainability with Fisher information

    EPA Science Inventory

    In an effort to evaluate socio-political instability, we studied the relationship between dynamic order, socio-political upheavals and sustainability in nation states. Estimating the degree of dynamic order inherent in the socio-political regime of various countries throughout th...

  2. Development and Experimental Application of International Affairs Indicators. Volume A

    DTIC Science & Technology

    1974-06-01

    DEVELOPMENT AND EXPERIMENTAL APPLICATION OF INTERNATIONAL AFFAIRS INDICATORS, Volume A, Final Report, June 1974. Sponsored by: Defense Advanced...intelligence communities were designed, techniques for estimating the future were developed and tested, and the techniques and indicators were applied to the...past year's effort is that the intelligence community has become increasingly aware of the potential usefulness of quantitative indicators. The

  3. Redox flow cell development and demonstration project, calendar year 1976

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The major focus of the effort was the key technology issues that directly influence the fundamental feasibility of the overall redox concept. These issues were the development of a suitable semipermeable separator membrane for the system, the screening and study of candidate redox couples to achieve optimum cell performance, and the carrying out of systems analysis and modeling to develop system performance goals and cost estimates.

  4. The impact of overstory density on reproduction establishment in the Missouri Ozarks: models for simulating regeneration stochastically

    Treesearch

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; Benjamin O. Knapp; John M. Kabrick

    2017-01-01

    Predicting the effects of silvicultural choices on regeneration has been difficult with the tools available to foresters. In an effort to improve this, we developed a collection of reproduction establishment models based on stand development hypotheses and parameterized with empirical data for several species in the Missouri Ozarks. These models estimate third-year...

  5. Early Training Estimation System

    DTIC Science & Technology

    1980-08-01

    first year efforts in CTES development were Cecil Wakelin, Gavin Livingstone, Ray Walsh, Peter Weddle, David Herlihy, Laurel Brown, Drs. Paul Ronco...and Society, 1980, pp. 1067-1974. David, J., Price, J. Successful communication in full scale engineering development statements of work. Air Force...1980, U.S. Army Engineering Laboratory. Shrier, S. Algorithms for system design. Proceedings of the International Conference on Cybernetics and

  6. Lean burn combustor technology at GE Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Dodds, Willard J.

    1992-01-01

    This presentation summarizes progress to date at GE Aircraft Engines in demonstration of a lean combustion system for the High Speed Civil Transport (HSCT). These efforts were supported primarily by NASA contracts, with the exception of initial size and weight estimates and development of advanced diagnostics which were conducted under GE Independent Research and Development projects. Key accomplishments to date are summarized below.

  7. Crop Characteristics Research: Growth and Reflectance Analysis

    NASA Technical Reports Server (NTRS)

    Badhwar, G. D. (Principal Investigator)

    1985-01-01

    Much of the early research in remote sensing focused on developing spectral signatures of cover types. It was found, however, that a signature from an unknown cover class could not always be matched to a catalog value of a known cover class. This approach was abandoned, and supervised classification schemes followed. These were not efficient and required extensive training. It became obvious that data acquired at a single time could not separate cover types. A large portion of the proposed research therefore concentrated on modeling the temporal behavior of agricultural crops and on removing the need for any training data in remote sensing surveys, the key to which is the solution of the so-called 'signature extension' problem. A clear need to develop spectral estimators of crop ontogenic stages and yield has existed even though various correlations have been developed, and considerable effort was devoted to developing techniques to estimate these variables. The need to accurately evaluate existing canopy reflectance models, improve those models, use them to understand crop signatures, and estimate leaf area index was the third objective of the proposed work. A synopsis of this research effort is discussed.

  8. Understanding cost growth during operations of planetary missions: An explanation of changes

    NASA Astrophysics Data System (ADS)

    McNeill, J. F.; Chapman, E. L.; Sklar, M. E.

    In the development of project cost estimates for interplanetary missions, considerable focus is generally given to cost estimates for the development of ground, flight, and launch systems, i.e., Phases B, C, and D. Depending on the project team, the effort expended on cost estimates for operations (Phase E) may be less rigorous than that devoted to estimates for ground and flight systems development. Furthermore, the project team may be challenged to develop a solid estimate of operations cost in the early stages of mission development, e.g., at the Concept Study Report or Systems Requirement Review (CSR/SRR) or the Preliminary Design Review (PDR), as mission-specific peculiarities that impact cost may not yet be well understood. In addition, the methodology generally used to develop Phase E cost is engineering build-up, also known as “grass roots”. Phase E can include cost and schedule risks that are not anticipated at the time of the major milestone reviews prior to launch. If not incorporated into the engineering build-up cost method for Phase E, this can translate into an underestimation of the complexity of operations and into overall cost estimates that are immature or, at worst, insufficient. As a result, projects may find themselves with thin reserves during cruise and on-orbit operations, or with project overruns prior to the end of mission. This paper examines a set of interplanetary missions in an effort to better understand the reasons for cost and staffing growth in Phase E. The method used in the study is discussed, and the major findings are summarized as the Phase E Explanation of Change (EoC). Research for the study entailed the review of project materials, including Estimates at Completion (EAC) for Phase E and staffing profiles, major project milestone reviews, e.g., CSR, PDR, Critical Design Review (CDR), interviews with select project and mission management, and review of Phase E replan materials. From this work, a detailed picture is constructed of why cost grew during the operations phase, down to the level of specific events in the life of the missions. As a next step, the Phase E EoC results were synthesized to produce leading indicators, i.e., identifiable signs of cost and staffing growth that may be present as early as PDR or CDR. Both a qualitative and a quantitative approach were used to determine the leading indicators. These leading indicators are reviewed and a practical method for their use is discussed.

  9. Development on electromagnetic impedance function modeling and its estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2015-09-01

    Today, electromagnetic methods such as magnetotellurics (MT) and controlled-source audio MT (CSAMT) are used in a broad variety of applications. Their usefulness in poor seismic areas and their negligible environmental impact are integral parts of effective exploration at minimum cost. As exploration was forced into more difficult areas, the importance of MT and CSAMT, in conjunction with other techniques, has tended to grow continuously. However, important and difficult problems remain to be solved concerning our ability to collect, process, and interpret MT and CSAMT data in complex 3D structural environments. This talk aims at reviewing and discussing recent developments in MT and CSAMT impedance function modeling, as well as some improvements in estimation procedures for the corresponding impedance functions. In MT impedance modeling, research efforts focus on developing numerical methods for computing the impedance functions of three-dimensional (3-D) earth resistivity models. For that purpose, 3-D finite-element numerical modeling of the impedances was developed based on the edge element method. In the CSAMT case, the efforts focused on accommodating the non-plane-wave problem in the corresponding impedance functions. Concerning estimation of MT and CSAMT impedance functions, research focused on improving the quality of the estimates. To that end, a non-linear regression approach based on robust M-estimators and the Hilbert transform, operating on the causal transfer functions, was used to deal with outliers (abnormal data) that are frequently superimposed on the normal ambient MT and CSAMT noise fields. As validated, the proposed MT impedance modeling method gives acceptable results for standard three-dimensional resistivity models. The full-solution modeling that accommodates the non-plane-wave effect for CSAMT impedances is applied to all measurement zones, including the near-field, transition, and far-field zones, so the plane-wave correction is no longer needed for the impedances. In the resulting robust impedance estimates, outlier contamination is removed and self-consistency between the real and imaginary parts of the impedance estimates is guaranteed. Using synthetic and real MT data, it is shown that the proposed robust estimation methods always yield impedance estimates that are better than conventional least-squares (LS) estimation, even under severe noise contamination. A recent development on constrained robust CSAMT impedance estimation is also discussed. Using synthetic CSAMT data, it is demonstrated that the proposed methods can produce usable CSAMT transfer functions for all measurement zones.
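
    The robust-estimation ingredient can be shown in miniature as iteratively reweighted least squares with Huber weights, which downweights the outliers that distort an ordinary least-squares fit. This real-valued sketch with invented data is illustrative only; the actual impedance estimates involve complex spectra and the Hilbert-transform constraint described above.

```python
# Huber M-estimation via iteratively reweighted least squares (IRLS).
import numpy as np

def huber_irls(A: np.ndarray, b: np.ndarray, c: float = 1.345,
               iters: int = 20) -> np.ndarray:
    """Robustly solve A x ~ b; c is the usual Huber tuning constant."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = b - A @ x
        scale = np.median(np.abs(r)) / 0.6745 + 1e-12         # robust scale (MAD)
        w = np.minimum(1.0, c / (np.abs(r / scale) + 1e-12))  # Huber weights
        sw = np.sqrt(w)
        x = np.linalg.lstsq(A * sw[:, None], sw * b, rcond=None)[0]
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(200, 2))
b = A @ np.array([1.5, -0.7]) + 0.1 * rng.normal(size=200)
b[:10] += 8.0                       # gross outliers
print(huber_irls(A, b))             # recovers approximately [1.5, -0.7]
```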

  10. Development on electromagnetic impedance function modeling and its estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutarno, D., E-mail: Sutarno@fi.itb.ac.id

    2015-09-30

    Today, electromagnetic methods such as magnetotellurics (MT) and controlled-source audio MT (CSAMT) are used in a broad variety of applications. Their usefulness in poor seismic areas and their negligible environmental impact are integral parts of effective exploration at minimum cost. As exploration was forced into more difficult areas, the importance of MT and CSAMT, in conjunction with other techniques, has tended to grow continuously. However, important and difficult problems remain to be solved concerning our ability to collect, process, and interpret MT and CSAMT data in complex 3D structural environments. This talk aims at reviewing and discussing recent developments in MT and CSAMT impedance function modeling, as well as some improvements in estimation procedures for the corresponding impedance functions. In MT impedance modeling, research efforts focus on developing numerical methods for computing the impedance functions of three-dimensional (3-D) earth resistivity models. For that purpose, 3-D finite-element numerical modeling of the impedances was developed based on the edge element method. In the CSAMT case, the efforts focused on accommodating the non-plane-wave problem in the corresponding impedance functions. Concerning estimation of MT and CSAMT impedance functions, research focused on improving the quality of the estimates. To that end, a non-linear regression approach based on robust M-estimators and the Hilbert transform, operating on the causal transfer functions, was used to deal with outliers (abnormal data) that are frequently superimposed on the normal ambient MT and CSAMT noise fields. As validated, the proposed MT impedance modeling method gives acceptable results for standard three-dimensional resistivity models. The full-solution modeling that accommodates the non-plane-wave effect for CSAMT impedances is applied to all measurement zones, including the near-field, transition, and far-field zones, so the plane-wave correction is no longer needed for the impedances. In the resulting robust impedance estimates, outlier contamination is removed and self-consistency between the real and imaginary parts of the impedance estimates is guaranteed. Using synthetic and real MT data, it is shown that the proposed robust estimation methods always yield impedance estimates that are better than conventional least-squares (LS) estimation, even under severe noise contamination. A recent development on constrained robust CSAMT impedance estimation is also discussed. Using synthetic CSAMT data, it is demonstrated that the proposed methods can produce usable CSAMT transfer functions for all measurement zones.

  11. Development of a banding database for North Pacific albatross: Implications for future data collection

    USGS Publications Warehouse

    Doherty, P.F.; Kendall, W.L.; Sillett, S.; Gustafson, M.; Flint, B.; Naughton, M.; Robbins, C.S.; Pyle, P.; Macintyre, Ian G.

    2006-01-01

    The effects of fishery practices on black-footed (Phoebastria nigripes) and Laysan albatross (Phoebastria immutabilis) continue to be a source of contention and uncertainty. Some of this uncertainty is a result of a lack of estimates of albatross demographic parameters such as survival. To begin to address these informational needs, a database of albatross banding and encounter records was constructed. Due to uncertainty concerning data collection and validity of assumptions required for mark-recapture analyses, these data should be used with caution. Although demographic parameter estimates are of interest to many, band loss rates, temporary emigration rates, and discontinuous banding effort can confound these estimates. We suggest a number of improvements in data collection that can help ameliorate problems, including the use of double banding and collecting data using a 'robust' design. Additionally, sustained banding and encounter efforts are needed to maximize the value of these data. With these modifications, the usefulness of the banding data could be improved markedly.

  12. Terrestrial gravity data analysis for interim gravity model improvement

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This is the first status report for the Interim Gravity Model research effort that was started on June 30, 1986. The basic theme of this study is to develop appropriate models and adjustment procedures for estimating potential coefficients from terrestrial gravity data. The plan is to use the latest gravity data sets to produce coefficient estimates as well as to provide normal equations to NASA for use in the TOPEX/POSEIDON gravity field modeling program.

  13. Developing recreational harvest regulations for an unexploited lake trout population

    USGS Publications Warehouse

    Lenker, Melissa A; Weidel, Brian C.; Jensen, Olaf P.; Solomon, Christopher T.

    2016-01-01

    Developing fishing regulations for previously unexploited populations presents numerous challenges, many of which stem from a scarcity of baseline information about abundance, population productivity, and expected angling pressure. We used simulation models to test the effect of six management strategies (catch and release; trophy, minimum, and maximum length limits; and protected and exploited slot length limits) on an unexploited population of Lake Trout Salvelinus namaycush in Follensby Pond, a 393-ha lake located in New York State’s Adirondack Park. We combined field and literature data and mark–recapture abundance estimates to parameterize an age-structured population model and used the model to assess the effects of each management strategy on abundance, catch per unit effort (CPUE), and harvest over a range of angler effort (0–2,000 angler-days/year). Lake Trout density (3.5 fish/ha for fish ≥ age 13, the estimated age at maturity) was similar to densities observed in other unexploited systems, but growth rate was relatively slow. Maximum harvest occurred at levels of effort ≤ 1,000 angler-days/year in all the scenarios considered. Regulations that permitted harvest of large postmaturation fish, such as New York’s standard Lake Trout minimum size limit or a trophy size limit, resulted in low harvest and high angler CPUE. Regulations that permitted harvest of small and sometimes immature fish, such as a protected slot or maximum size limit, allowed high harvest but resulted in low angler CPUE and produced rapid declines in harvest with increases in effort beyond the effort consistent with maximum yield. Management agencies can use these results to match regulations to management goals and to assess the risks of different management options for unexploited Lake Trout populations and other fish species with similar life history traits.

  14. Environmental Public Health Survelliance for Exposure to Respiratory Health Hazards: A Joint NASA/CDC Project to Use Remote Sensing Data for Estimating Airborne Particulate Matter Over the Atlanta, Georgia Metropolitan Area

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Rickman, Douglas; Mohammad, Al-Hamdan; Crosson, William; Estes, Maurice, Jr.; Limaye, Ashutosh; Qualters, Judith

    2008-01-01

    Describes the public health surveillance efforts of NASA, in a joint effort with the Centers for Disease Control and Prevention (CDC). NASA/MSFC and the CDC are partners in linking environmental and health data to enhance public health surveillance. The use of NASA technology creates value-added geospatial products from existing environmental data sources to facilitate public health linkages. The venture sought to provide remote sensing data for the 5-county metro Atlanta area and to integrate this environmental data with public health data into a local network, in an effort to prevent and control environmentally related health effects. The effort used environmental data (Environmental Protection Agency [EPA] Air Quality System [AQS] ground measurements and MODIS Aerosol Optical Depth [AOD]) to estimate airborne particulate matter over Atlanta, and linked this data with health data related to asthma. The study proved the feasibility of linking environmental data (MODIS particulate matter estimates and AQS) with health data (asthma). Algorithms were developed for QC, bias removal, and merging MODIS and AQS particulate matter data, as well as for other applications. Additionally, a Business Associate Agreement was negotiated with a health care provider to enable sharing of Protected Health Information.

  15. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NASA Astrophysics Data System (ADS)

    Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
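
    The paper's statistical ingredient, a linear regression of effort on functional size, can be sketched in a few lines. The historical (size, effort) pairs below are fabricated purely for illustration and are not data from the case study.

```python
# Early effort estimate from an ordinary least-squares fit of effort
# against functional size, using made-up historical project data.
import numpy as np

size_fp = np.array([120, 250, 310, 480, 610, 800], dtype=float)     # function points
effort_ph = np.array([900, 2100, 2500, 4200, 5100, 7000], dtype=float)  # person-hours

slope, intercept = np.polyfit(size_fp, effort_ph, 1)
new_project = 400.0
print(f"predicted effort for {new_project:.0f} function points: "
      f"{slope * new_project + intercept:.0f} person-hours")
```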

  16. Inferring invasive species abundance using removal data from management actions.

    PubMed

    Davis, Amy J; Hooten, Mevin B; Miller, Ryan S; Farnsworth, Matthew L; Lewis, Jesse; Moxcey, Michael; Pepin, Kim M

    2016-10-01

    Evaluation of the progress of management programs for invasive species is crucial for demonstrating impacts to stakeholders and for strategic planning of resource allocation. Estimates of abundance before and after management activities can serve as a useful metric of population management programs. However, many methods of estimating population size are too labor intensive and costly to implement, posing restrictive levels of burden on operational programs. Removal models are a reliable method for estimating abundance before and after management using data from the removal activities exclusively, thus requiring no work in addition to management. We developed a Bayesian hierarchical model to estimate abundance from removal data accounting for varying levels of effort, and used simulations to assess the conditions under which reliable population estimates are obtained. We applied this model to estimate site-specific abundance of an invasive species, feral swine (Sus scrofa), using removal data from aerial gunning in 59 site/time-frame combinations (480-19,600 acres) throughout Oklahoma and Texas, USA. Simulations showed that abundance estimates were generally accurate when effective removal rates (removal rate accounting for total effort) were above 0.40. However, when abundances were small (<50), the effective removal rate needed to estimate abundance accurately was considerably higher (0.70). Based on our post-validation method, 78% of our site/time-frame estimates were accurate. To use this modeling framework it is important to have multiple removals (more than three) within a time frame during which demographic changes are minimized (i.e., a closed population; ≤3 months for feral swine). Our results show that the probability of accurately estimating abundance from this model improves with increased sampling effort (8+ flight hours across the 3-month window is best) and increased removal rate. Based on the inverse relationship between inaccurate abundances and inaccurate removal rates, we suggest auxiliary information that could be collected and included in the model as covariates (e.g., habitat effects, differences between pilots) to improve the accuracy of removal rates and hence abundance estimates. © 2016 by the Ecological Society of America.
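
    The simplest member of the removal-model family is the two-pass estimator, which uses only the counts removed on two successive passes of equal effort; the hierarchical Bayesian model in the paper generalizes this to many sites, unequal effort, and full uncertainty. The counts below are made up.

```python
# Two-pass removal estimator: with constant capture probability p,
# expected catches are N*p then N*(1-p)*p, giving N = c1^2 / (c1 - c2).

def two_pass_removal(c1: int, c2: int) -> tuple:
    """Return (abundance estimate, per-pass capture probability)."""
    if c2 >= c1:
        raise ValueError("second-pass catch must be smaller than the first")
    n_hat = c1 * c1 / (c1 - c2)
    p_hat = 1.0 - c2 / c1
    return n_hat, p_hat

if __name__ == "__main__":
    n, p = two_pass_removal(c1=60, c2=24)   # hypothetical removal counts
    print(f"N = {n:.0f}, capture probability = {p:.2f}")
```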

  17. MSFC Sortie Laboratory Environmental Control System (ECS) phase B design study results

    NASA Technical Reports Server (NTRS)

    Ignatonis, A. J.; Mitchell, K. L.

    1974-01-01

    Phase B effort of the Sortie Lab program has concluded. Results of that effort are presented which pertain to the definitions of the environmental control system (ECS). Numerous design studies were performed in Phase B to investigate system feasibility, complexity, weight, and cost. The results and methods employed for these design studies are included. An autonomous Sortie Lab ECS was developed which utilizes a deployed space radiator. Total system weight was projected to be 1814.4 kg including the radiator and fluids. ECS power requirements were estimated at 950 watts.

  18. Control of robotic assistance using poststroke residual voluntary effort.

    PubMed

    Makowski, Nathaniel S; Knutson, Jayme S; Chae, John; Crago, Patrick E

    2015-03-01

    Poststroke hemiparesis limits the ability to reach, in part due to involuntary muscle co-activation (synergies). Robotic approaches are being developed for both therapeutic benefit and continuous assistance during activities of daily living. Robotic assistance may enable participants to exert less effort, thereby reducing expression of the abnormal co-activation patterns, which could allow participants to reach further. This study evaluated how well participants could perform a reaching task with robotic assistance that was either provided independent of effort in the vertical direction or in the sagittal plane in proportion to voluntary effort estimated from electromyograms (EMG) on the affected side. Participants who could not reach targets without assistance were enabled to reach further with assistance. Constant anti-gravity force assistance that was independent of voluntary effort did not reduce the quality of reach and enabled participants to exert less effort while maintaining different target locations. Force assistance that was proportional to voluntary effort on the affected side enabled participants to exert less effort and could be controlled to successfully reach targets, but participants had increased difficulty maintaining a stable position. These results suggest that residual effort on the affected side can produce an effective command signal for poststroke assistive devices.
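
    The effort-proportional mode can be pictured as a gain applied to a rectified, smoothed EMG envelope. The filter constant and gain below are hypothetical illustrations, not the study's controller parameters.

```python
# Schematic effort-proportional assistance: rectify the EMG, smooth it
# with an exponential filter, and scale by an assistance gain.
import numpy as np

def emg_envelope(emg: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    """Rectified, exponentially smoothed EMG amplitude."""
    rect = np.abs(emg)
    env = np.empty_like(rect)
    env[0] = rect[0]
    for i in range(1, len(rect)):
        env[i] = alpha * rect[i] + (1.0 - alpha) * env[i - 1]
    return env

def assist_force(emg: np.ndarray, gain: float = 12.0) -> np.ndarray:
    """Assistive force command proportional to estimated voluntary effort."""
    return gain * emg_envelope(emg)

rng = np.random.default_rng(3)
print(assist_force(rng.normal(size=500))[-1])
```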

  19. Illiteracy among Adults with Disabilities in the Developing World: A Review of the Literature and a Call for Action

    ERIC Educational Resources Information Center

    Groce, Nora Ellen; Bakhshi, Parul

    2011-01-01

    In the early 1990s, UNESCO estimated that perhaps 97% of the world's 650 million disabled persons were unable to read or write, leading to significant efforts throughout the developing world to ensure that all children with disabilities attended school through "inclusive education" programmes. But what of the vast majority of persons with…

  20. Measuring the Impact of an Alternative Approach to School Bullying

    ERIC Educational Resources Information Center

    Domino, Meg

    2013-01-01

    Background: National estimates of middle school bullying approximate 40%, making it the leading form of school violence. Meta-analyses of conventional anti-bullying efforts have shown mixed results, averaging small reductions in bullying behavior. Social-Emotional Learning and Positive Youth Development provide a theory-driven alternative for…

  1. Methodology for determining economic impacts of raised medians : data collection for additional case studies

    DOT National Transportation Integrated Search

    1998-10-01

    The objective of this four-year research effort is to develop and test a methodology to estimate the economic impact of median design. This report summarizes the work performed in the second year. The second year of this study included collecting data...

  2. HARMONIZATION AND COMMUNICATION OF PBPK MODELS USING THE EXPOSURE RELATED DOSE ESTIMATION MODEL (ERDEM) SYSTEM: TRICHLOROETHYLENE

    EPA Science Inventory

    In support of the trichloroethylene (TCE) risk assessment for the Office of Air and Radiation, Office of Solid Waste and Emergency Response, and Office of Water, NERL and NCEA are developing an updated physiologically-based pharmacokinetic (PBPK) model. The PBPK modeling effort ...

  3. Adapting geostatistics to analyze spatial and temporal trends in weed populations

    USDA-ARS?s Scientific Manuscript database

    Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...

  4. Extracted Cookstove Emissions Differentially Alter Pro-inflammatory and Adaptive Gene Expression in Lung Epithelial Cells

    EPA Science Inventory

    Current estimates attribute exposure to cookstove emissions (CE) to over 4 million deaths annually. While the development of several new cookstove (CS) designs has led efforts to reduce CE with relative success, the data supporting potential health benefits from the use of new CS...

  5. Evaluation of a Moderate Resolution, Satellite-Based Impervious Surface Map Using an Independent, High-Resolution Validation Dataset

    EPA Science Inventory

    Given the relatively high cost of mapping impervious surfaces at regional scales, substantial effort is being expended in the development of moderate-resolution, satellite-based methods for estimating impervious surface area (ISA). To rigorously assess the accuracy of these data ...

  6. REVIEW AND EVALUATION OF CURRENT METHODS AND USER NEEDS FOR OTHER STATIONARY COMBUSTION SOURCES

    EPA Science Inventory

    The report gives results of Phase 1 of an effort to develop improved methodologies for estimating area source emissions of air pollutants from stationary combustion sources. The report (1) evaluates Area and Mobile Source (AMS) subsystem methodologies; (2) compares AMS results w...

  7. A methodology for determining economic impacts of raised medians : data analysis on additional case studies

    DOT National Transportation Integrated Search

    1999-10-01

    The objective of this four-year research effort is to develop and test a methodology to estimate the economic impacts of median design. This report summarizes the activities performed in the third year of this project. The primary task in the third y...

  8. Statistical Indicators of Scientific and Technical Communication (1960-1980), 1977 Edition.

    ERIC Educational Resources Information Center

    King, D. W.; And Others

    This report describes results of the second year of research in developing statistical indicators of scientific and technical communication. This effort placed special emphasis on the periodical literature including new estimates of the number of journals and other periodicals, characteristics of journals, journal prices, number of subscribers,…

  9. EXPERIMENTAL METHODOLOGIES AND PRELIMINARY TRANSFER FACTOR DATA FOR ESTIMATION OF DERMAL EXPOSURES TO PARTICLES

    EPA Science Inventory

    Developmental efforts and experimental data are described that focused on quantifying the transfer of particles on a mass basis from indoor surfaces to human skin. Methods were developed that utilized a common fluorescein-tagged Arizona Test Dust (ATD) as a possible surrogate ...

  10. Another Edsel: The Collective Misperception of the Demand for the Certification of MA Sociologists.

    ERIC Educational Resources Information Center

    Schutt, Russell K.; Costner, Herbert L.

    1992-01-01

    Describes the failed effort to establish certification procedures for an Applied Social Research Specialist, a sociologist who would work in research-related positions. Compares the program to the Ford Motor Company's development of the Edsel. Discusses reasons why the application estimates were wrong. (CFR)

  11. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products, despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction, developed with our collaborators, in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area not only by improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available case-history database we are compiling, along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary-hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.
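
    The shape of such a statistical secondary-hazard model is a logistic regression mapping ground motion and site predictors to a probability of landsliding. The coefficients below are hypothetical placeholders, not the published USGS model values.

```python
# Logistic susceptibility model sketch: probability of landsliding from
# peak ground acceleration, slope, and a wetness proxy. All coefficients
# are assumed for illustration only.
import math

def landslide_probability(pga_g: float, slope_deg: float,
                          wetness_index: float) -> float:
    b0, b_pga, b_slope, b_wet = -6.0, 4.0, 0.08, 0.5   # assumed coefficients
    z = b0 + b_pga * pga_g + b_slope * slope_deg + b_wet * wetness_index
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    print(round(landslide_probability(pga_g=0.4, slope_deg=30, wetness_index=4), 3))
```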

  12. Project Report on Development of a Safeguards Approach for Pyroprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert Bean

    The Idaho National Laboratory has undertaken an effort to develop a standard safeguards approach for international commercial pyroprocessing facilities. This report details progress for the fiscal year 2010 effort. A component-by-component diversion pathway analysis has been performed and has led to insight on the mitigation needs and equipment development needed for a valid safeguards approach. The effort to develop an in-hot-cell detection capability led to the digital cloud chamber and, more importantly, the potentially significant scientific breakthrough of the inverse spectroscopy algorithm, including the ability to identify the energy and spatial location of gamma-ray-emitting sources with a single, non-complex, stationary radiation detector system. Curium measurements were performed on historical and current samples at the FCF to attempt to determine the utility of using gross neutron counting for accountancy measurements. A solid cost estimate of equipment installation at FCF has been developed to guide proposals and cost allocations to use FCF as a test bed for safeguards measurement demonstrations. A combined MATLAB and MCNPX model has been developed to perform detector placement calculations around the electrorefiner. Early harvesting has occurred wherein the project team has been requested to provide pyroprocessing technology and safeguards short courses.

  13. Real-time multisensor data fusion for target detection, classification, tracking, counting, and range estimates

    NASA Astrophysics Data System (ADS)

    Tsui, Eddy K.; Thomas, Russell L.

    2004-09-01

    As part of the Commanding General of Army Materiel Command's Research, Development & Engineering Command (RDECOM), the U.S. Army Research Development and Engineering Center (ARDEC), Picatinny, funded a joint development effort with McQ Associates, Inc. to develop an Advanced Minefield Sensor (AMS) as a technology evaluation prototype for the Anti-Personnel Landmine Alternatives (APLA) Track III program. This effort laid the fundamental groundwork for smart sensors for detection and classification of targets, identification of combatant or noncombatant, target location and tracking at and between sensors, fusion of information across targets and sensors, and automatic situation awareness for the first responder. The efforts culminated in a performance-oriented architecture meeting the requirements of size, weight, and power (SWAP). The integrated digital signal processor (DSP) paradigm is capable of computing signals from sensor modalities to extract needed information within either a 360° or a fixed field of view with an acceptable false alarm rate. This paper discusses the challenges in the development of such a sensor, focusing on achieving reasonable operating ranges, low power, small size, and low cost, and on applications and extensions of this technology.

  14. Aerial survey methodology for bison population estimation in Yellowstone National Park

    USGS Publications Warehouse

    Hess, Steven C.

    2002-01-01

    I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata were tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling', to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias in counts of large bison groups and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and more occupied unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August. Low variability between surveys and high detection probability provide population estimates with an overall coefficient of variation of approximately 8% and have high power for detecting trends in population change. I demonstrated how population estimates from winter and summer can be integrated into a comprehensive monitoring program to estimate annual growth rates, overall winter mortality, and an index of calf production, requiring about 30 hours of flight per year.
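
    A short illustration of the double-sampling correction described above: each aerial group count is divided by its estimated detection probability, which declines with group size. All counts and probabilities below are fabricated for illustration, not values from the study.

        # Hypothetical aerial counts and detection probabilities (not study values).
        group_counts = [120, 45, 8, 3]            # bison counted in each detected group
        detection_p  = [0.95, 0.90, 0.70, 0.50]   # estimated P(detect), lower for small groups

        # Horvitz-Thompson-style correction: divide each count by its detection probability.
        n_hat = sum(c / p for c, p in zip(group_counts, detection_p))
        print(f"Detection-corrected estimate: {n_hat:.0f} bison")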

  15. Uncertainty in sample estimates and the implicit loss function for soil information.

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    One significant challenge in the communication of uncertain information is how to enable the sponsors of sampling exercises to make a rational choice of sample size. One way to do this is to compute the value of additional information given the loss function for errors. The loss function expresses the costs that result from decisions made using erroneous information. In certain circumstances, such as remediation of contaminated land prior to development, loss functions can be computed and used to guide rational decision making on the amount of resource to spend on sampling to collect soil information. In many circumstances the loss function cannot be obtained prior to decision making. This may be the case when multiple decisions may be based on the soil information and the costs of errors are hard to predict. The implicit loss function is proposed as a tool to aid decision making in these circumstances. Conditional on a logistical model which expresses costs of soil sampling as a function of effort, and statistical information from which the error of estimates can be modelled as a function of effort, the implicit loss function is the loss function which makes a particular decision on effort rational. In this presentation the implicit loss function is defined and computed for a number of arbitrary decisions on sampling effort for a hypothetical soil monitoring problem. This is based on a logistical model of sampling cost parameterized from a recent geochemical survey of soil in Donegal, Ireland and on statistical parameters estimated with the aid of a process model for change in soil organic carbon. It is shown how the implicit loss function might provide a basis for reflection on a particular choice of sample size by comparing it with the values attributed to soil properties and functions. Scope for further research to develop and apply the implicit loss function to help decision making by policy makers and regulators is then discussed.
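
    A worked sketch of the implicit-loss idea, under stated assumptions: sampling cost rises linearly with sample size, c(n) = c0 + c1*n, and the error variance of the estimate falls as s2/n. For a quadratic loss lambda*variance, the expected total cost c(n) + lambda*s2/n is minimized at n* = sqrt(lambda*s2/c1); inverting gives the implicit loss coefficient that makes a particular chosen n rational. All numbers are hypothetical.

        C1 = 50.0   # marginal cost per additional sample (currency units), hypothetical
        S2 = 4.0    # variance parameter: var(estimate) = S2 / n, hypothetical

        def implicit_lambda(n_chosen):
            """Loss coefficient under which n_chosen is the rational sample size."""
            # From d/dn [C1*n + lam*S2/n] = 0  =>  lam = C1 * n**2 / S2
            return C1 * n_chosen**2 / S2

        for n in (25, 100, 400):
            print(f"n = {n:>3}  implied loss coefficient = {implicit_lambda(n):,.0f}")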

  16. Using the international classification of functioning, disability and health to expand understanding of paralysis in the United States through improved surveillance.

    PubMed

    Fox, Michael H; Krahn, Gloria L; Sinclair, Lisa B; Cahill, Anthony

    2015-07-01

    Surveillance of paralysis prevalence has been conceptually and methodologically challenging. Numerous methods have been used to approximate population-level paralysis prevalence, leading to widely divergent estimates. This article describes three phases in the use of the International Classification of Functioning, Disability and Health (ICF) as a framework and planning tool for defining paralysis and developing public health surveillance of this condition. The description of the surveillance methodology covers four steps: an assessment of prior data collection efforts, including a review of existing surveys, registries and other data collection efforts designed to capture both the case definitions in use and the prevalence of paralysis; use of a consensus conference of experts to develop a case definition of paralysis based on the ICF rather than on medical diagnostic criteria; use of the ICF framework for domains of interest to develop, cognitively test, validate and administer a brief self-report questionnaire for telephone administration to a population; and development and administration of a Paralysis Prevalence and Health Disparities Survey that used content mapping to back-code items from existing national surveys to operationalize key domains. ICF coding led to a national population-based survey of paralysis that produced accurate estimates of prevalence and identified factors related to the health of people in the U.S. living with paralysis. The ICF can be a useful tool for developing valid and reliable surveillance strategies targeting subgroups of individuals with functional disabilities, such as people with paralysis. Published by Elsevier Inc.

  17. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
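
    A toy Monte Carlo in the spirit of the simulations described above, comparing simple random sampling (SRS) and systematic sampling (SYS) of instantaneous angler counts that are expanded to daily effort. The diurnal effort curve is invented for illustration; with these numbers the SYS design produces roughly half the MSE of SRS, consistent with the abstract's finding.

        import random, statistics

        random.seed(1)
        anglers = [2, 5, 9, 14, 18, 20, 19, 16, 12, 9, 6, 4, 3, 2]  # anglers afield per hour
        hours = list(range(len(anglers)))
        true_effort = sum(anglers)                                   # angler-hours per day

        def estimate(sample_hours):
            # Expand the mean instantaneous count to the full fishing day.
            return statistics.mean(anglers[h] for h in sample_hours) * len(hours)

        def mse(estimates):
            return statistics.mean((e - true_effort) ** 2 for e in estimates)

        srs = [estimate(random.sample(hours, 4)) for _ in range(5000)]
        sys_ = [estimate(range(start, len(hours), 4)) for start in range(4)]
        print("SRS MSE:", round(mse(srs), 1), " SYS MSE:", round(mse(sys_), 1))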

  18. Software Certification and Software Certificate Management Systems

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    Incremental certification and re-certification of code as it is developed and modified is a prerequisite for applying modern, evolutionary development processes, which are especially relevant for NASA. For example, the Columbia Accident Investigation Board (CAIB) report concluded there is "the need for improved and uniform statistical sampling, audit, and certification processes". Also, re-certification time has been a limiting factor in making changes to Space Shuttle code close to launch time. This is likely to be an even bigger problem with the rapid turnaround required in developing NASA's replacement for the Space Shuttle, the Crew Exploration Vehicle (CEV). Hence, intelligent development processes are needed which place certification at the center of development. If certification tools provide useful information, such as estimated time and effort, they are more likely to be adopted. The ultimate impact of such a tool will be reduced effort and increased reliability.

  19. The Brazilian version of the effort-reward imbalance questionnaire to assess job stress.

    PubMed

    Chor, Dóra; Werneck, Guilherme Loureiro; Faerstein, Eduardo; Alves, Márcia Guimarães de Mello; Rotenberg, Lúcia

    2008-01-01

    The effort-reward imbalance (ERI) model has been used to assess the health impact of job stress. We aimed at describing the cross-cultural adaptation of the ERI questionnaire into Portuguese and some psychometric properties, in particular internal consistency, test-retest reliability, and factorial structure. We developed a Brazilian version of the ERI using a back-translation method and tested its reliability. The test-retest reliability study was conducted with 111 health workers and University staff. The current analyses are based on 89 participants, after exclusion of those with missing data. Reproducibility (intraclass correlation coefficients) for the "effort", "reward", and "overcommitment" dimensions of the scale was estimated at 0.76, 0.86, and 0.78, respectively. Internal consistency (Cronbach's alpha) estimates for these same dimensions were 0.68, 0.78, and 0.78, respectively. The exploratory factorial structure was fairly consistent with the model's theoretical components. We conclude that the results of this study represent the first evidence in favor of the application of the Brazilian Portuguese version of the ERI scale in health research in populations with similar socioeconomic characteristics.
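
    For reference, the internal-consistency statistic reported above can be computed as follows; the response matrix is fabricated for illustration (rows = respondents, columns = items of one ERI dimension).

        import numpy as np

        def cronbach_alpha(items):
            """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        x = [[3, 4, 3], [2, 2, 3], [4, 5, 4], [1, 2, 1], [3, 3, 4]]
        print(round(cronbach_alpha(x), 2))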

  20. Evaluating Satellite-based Rainfall Estimates for Basin-scale Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Yilmaz, K. K.; Hogue, T. S.; Hsu, K.; Gupta, H. V.; Mahani, S. E.; Sorooshian, S.

    2003-12-01

    The reliability of any hydrologic simulation and basin outflow prediction effort depends primarily on the rainfall estimates. The problem of estimating rainfall becomes more obvious in basins with scarce or no rain gauges. We present an evaluation of satellite-based rainfall estimates for basin-scale hydrologic modeling with particular interest in ungauged basins. The initial phase of this study focuses on comparison of mean areal rainfall estimates from a ground-based rain gauge network, NEXRAD radar Stage-III, and satellite-based PERSIANN (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks) and their influence on hydrologic model simulations over several basins in the U.S. Six-hourly accumulations of the above competing mean areal rainfall estimates are used as input to the Sacramento Soil Moisture Accounting Model. Preliminary experiments for the Leaf River Basin in Mississippi, for the period of March 2000 - June 2002, reveal that seasonality plays an important role in the comparison. Satellite-based rainfall is overestimated during the summer and underestimated during the winter with respect to the competing rainfall estimates. The consequence for the hydrologic model is that simulated discharge underestimates the major observed peak discharges during early spring for the basin under study. Future research will entail developing correction procedures, which depend on different factors such as seasonality, geographic location and basin size, for satellite-based rainfall estimates over basins with dense rain gauge networks and/or radar coverage. Extension of these correction procedures to satellite-based rainfall estimates over ungauged basins with similar characteristics has the potential for reducing the input uncertainty in ungauged basin modeling efforts.

  1. Using population genetic tools to develop a control strategy for feral cats (Felis catus) in Hawai'i

    USGS Publications Warehouse

    Hansen, H.; Hess, S.C.; Cole, D.; Banko, P.C.

    2007-01-01

    Population genetics can provide information about the demographics and dynamics of invasive species that is beneficial for developing effective control strategies. We studied the population genetics of feral cats on Hawai'i Island by microsatellite analysis to evaluate genetic diversity and population structure, assess gene flow and connectivity among three populations, identify potential source populations, characterise population dynamics, and evaluate sex-biased dispersal. High genetic diversity, low structure, and high number of migrants per generation supported high gene flow that was not limited spatially. Migration rates revealed that most migration occurred out of West Mauna Kea. Effective population size estimates indicated increasing cat populations despite control efforts. Despite high gene flow, relatedness estimates declined significantly with increased geographic distance and Bayesian assignment tests revealed the presence of three population clusters. Genetic structure and relatedness estimates indicated male-biased dispersal, primarily from Mauna Kea, suggesting that this population should be targeted for control. However, recolonisation seems likely, given the great dispersal ability that may not be inhibited by barriers such as lava flows. Genetic monitoring will be necessary to assess the effectiveness of future control efforts. Management of other invasive species may benefit by employing these population genetic tools. © CSIRO 2007.

  2. Development of Environmental Load Estimation Model for Road Drainage Systems in the Early Design Phase

    NASA Astrophysics Data System (ADS)

    Park, Jin-Young; Lee, Dong-Eun; Kim, Byung-Soo

    2017-10-01

    Due to increasing concern about climate change, efforts to reduce environmental load are continuously being made in the construction industry, and LCA (life cycle assessment) is presented as an effective method to assess environmental load. Because LCA requires information on construction quantities for environmental load estimation, however, it is not utilized in environmental reviews in the early design phase, where such information is difficult to obtain. In this study, a computation system for construction quantities, based on standard cross sections of road drainage facilities, was developed so that the quantities required for LCA can be computed using only information available in the early design phase; an environmental load estimation model built on this system was then developed and verified. With a mean absolute error rate of 13.39%, the results show that the model is effective for use in the early design phase.

  3. Department of the Navy Supporting Data for Fiscal Year 1984 Budget Estimates Descriptive Summaries Submitted to Congress January 1983. Research, Development, Test and Evaluation, Navy. Book 1. Technology Base, Advanced Technology Development, Strategic Programs.

    DTIC Science & Technology

    1983-01-01

    ...access (2) Assess maturity of on-going efforts and integrate appropriate development into an effective globally distributed command support...numerical techniques for nonlinear media-structure shock interaction including effects of elastic-plastic deformation have been developed and used to...shuttle flight; develop camera payload for SPARTAN (free flyer) flight from shuttle. Develop detailed interpretive system capability for global ultraviolet

  4. Force Structure. Joint Seabasing Would Benefit from a Comprehensive Management Approach and Rigorous Experimentation before Services Spend Billions on New Capabilities

    DTIC Science & Technology

    2007-01-01

    results could be compromised. While service development efforts tied to seabasing are approaching milestones for investment decisions, it is...estimates for joint seabasing options are developed and made transparent to DOD and Congress, decision makers will not be able to evaluate the cost...Integrate Service Initiatives; DOD Has Not Developed a Joint Experimentation Campaign Plan to Inform Decisions About Joint Seabasing; Timeframe for

  5. Preliminary risk benefit assessment for nuclear waste disposal in space

    NASA Technical Reports Server (NTRS)

    Rice, E. E.; Denning, R. S.; Friedlander, A. L.; Priest, C. C.

    1982-01-01

    This paper describes the recent work of the authors on the evaluation of health risk benefits of space disposal of nuclear waste. The paper describes a risk model approach that has been developed to estimate the non-recoverable, cumulative, expected radionuclide release to the earth's biosphere for different options of nuclear waste disposal in space. Risk estimates for the disposal of nuclear waste in a mined geologic repository and the short- and long-term risk estimates for space disposal were developed. The results showed that the preliminary estimates of space disposal risks are low, even with the estimated uncertainty bounds. If calculated release risks for mined geologic repositories remain as low as given by the U.S. DOE, and U.S. EPA requirements continue to be met, then no additional space disposal study effort in the U.S. is warranted at this time. If risks perceived by the public are significant in the acceptance of mined geologic repositories, then consideration of space disposal as a complement to the mined geologic repository is warranted.

  6. Descriptive Summaries of the Research Development Test & Evaluation Army Appropriation FY 1983. Supporting Data FY 1983, Budget Estimate Submitted to Congress February 1982. Volume II.

    DTIC Science & Technology

    1982-02-01

    Defense (BMD) Advanced Technology Program is a broadly based research and development effort designed to exploit new and emerging technologies...II-363 DK13 NON-COMMUNICATIONS ELECTRONIC COUNTERMEASURES SYSTEMS...II-368 DK14 EXPENDABLE JAMMERS...NAVSTAR GLOBAL POSITIONING SYSTEMS (GPS) USER EQUIPMENT...III-366 DEFENSEWIDE MISSION SUPPORT 6.37.38.A NON-SYSTEMS

  7. An open source framework for tracking and state estimation ('Stone Soup')

    NASA Astrophysics Data System (ADS)

    Thomas, Paul A.; Barr, Jordi; Balaji, Bhashyam; White, Kruger

    2017-05-01

    The ability to detect and unambiguously follow all moving entities in a state-space is important in multiple domains both in defence (e.g. air surveillance, maritime situational awareness, ground moving target indication) and the civil sphere (e.g. astronomy, biology, epidemiology, dispersion modelling). However, tracking and state estimation researchers and practitioners have difficulties recreating state-of-the-art algorithms in order to benchmark their own work. Furthermore, system developers need to assess which algorithms meet operational requirements objectively and exhaustively rather than intuitively or driven by personal favourites. We have therefore commenced the development of a collaborative initiative to create an open source framework for production, demonstration and evaluation of Tracking and State Estimation algorithms. The initiative will develop a (MIT-licensed) software platform for researchers and practitioners to test, verify and benchmark a variety of multi-sensor and multi-object state estimation algorithms. The initiative is supported by four defence laboratories, who will contribute to the development effort for the framework. The tracking and state estimation community will derive significant benefits from this work, including: access to repositories of verified and validated tracking and state estimation algorithms, a framework for the evaluation of multiple algorithms, standardisation of interfaces and access to challenging data sets.

  8. A hidden-process model for estimating prespawn mortality using carcass survey data

    USGS Publications Warehouse

    DeWeber, J. Tyrell; Peterson, James T.; Sharpe, Cameron; Kent, Michael L.; Colvin, Michael E.; Schreck, Carl B.

    2017-01-01

    After returning to spawning areas, adult Pacific salmon Oncorhynchus spp. often die without spawning successfully, which is commonly referred to as prespawn mortality. Prespawn mortality reduces reproductive success and can thereby hamper conservation, restoration, and reintroduction efforts. The primary source of information used to estimate prespawn mortality is collected through carcass surveys, but estimation can be difficult with these data due to imperfect detection and carcasses with unknown spawning status. To facilitate unbiased estimation of prespawn mortality and associated uncertainty, we developed a hidden-process mark–recovery model to estimate prespawn mortality rates from carcass survey data while accounting for imperfect detection and unknown spawning success. We then used the model to estimate prespawn mortality and identify potential associated factors for 3,352 adult spring Chinook Salmon O. tshawytscha that were transported above Foster Dam on the South Santiam River (Willamette River basin, Oregon) from 2009 to 2013. Estimated prespawn mortality was relatively low (≤13%) in most years (interannual mean = 28%) but was especially high (74%) in 2013. Variation in prespawn mortality estimates among outplanted groups of fish within each year was also very high, and some of this variation was explained by a trend toward lower prespawn mortality among fish that were outplanted later in the year. Numerous efforts are being made to monitor and, when possible, minimize prespawn mortality in salmon populations; this model can be used to provide unbiased estimates of spawning success that account for unknown fate and imperfect detection, which are common to carcass survey data.

  9. Development of an automated procedure for estimation of the spatial variation of runoff in large river basins

    USDA-ARS?s Scientific Manuscript database

    The use of distributed parameter models to address water resource management problems has increased in recent years. Calibration is necessary to reduce the uncertainties associated with model input parameters. Manual calibration of a distributed parameter model is a very time-consuming effort. There...

  10. The impact of organizational structure on flight software cost risk

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Lum, Karen; Monson, Erik

    2004-01-01

    This paper summarizes the final results of the follow-up study, which updated the estimated software effort growth for the projects that were still under development and added an evaluation of organizational roles versus observed cost risk for the missions included in the original study, expanding the data set to thirteen missions.

  11. Long-term agroecosystem research in the Central Mississippi River Basin: hyperspectral remote sensing of reservoir water quality

    USDA-ARS?s Scientific Manuscript database

    In-situ methods for estimating water quality parameters would facilitate efforts in spatial and temporal monitoring, and optical reflectance sensing has shown potential in this regard, particularly for chlorophyll, suspended sediment and turbidity. The objective of this research was to develop and e...

  12. 76 FR 19740 - Notice of Public Information Collections Being Reviewed by the U.S. Agency for International...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-08

    ...and investigations, committee meetings, agency decisions and rulings, delegations of authority... Requested SUMMARY: U.S. Agency for International Development (USAID) is making efforts to reduce the... whether the information shall have practical utility; (b) the accuracy of the burden estimates; (c) ways...

  13. 77 FR 75105 - Notice of Public Information Collections Being Reviewed by the U.S. Agency for International...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ...and investigations, committee meetings, agency decisions and rulings, delegations of authority... Requested SUMMARY: U.S. Agency for International Development (USAID) is making efforts to reduce the... whether the information shall have practical utility; (b) the accuracy of the burden estimates; (c) ways...

  14. Design and implementation of a Windows NT network to support CNC activities

    NASA Technical Reports Server (NTRS)

    Shearrow, C. A.

    1996-01-01

    The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, computers to be used, software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.

  15. Inferences about population dynamics from count data using multistate models: a comparison to capture–recapture approaches

    PubMed Central

    Zipkin, Elise F; Sillett, T Scott; Grant, Evan H Campbell; Chandler, Richard B; Royle, J Andrew

    2014-01-01

    Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages. The most accurate and precise estimates will necessarily require some amount of intensive data collection efforts (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales. PMID:24634726
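
    The core of the N-mixture approach above is a likelihood that marginalizes the latent abundance out of the observed counts; a minimal single-site, single-visit sketch (with illustrative parameter values) follows. The model in the paper additionally handles multiple states, open-population dynamics, and individuals that cannot be assigned to a state.

        from math import comb, exp

        def nmixture_likelihood(y, lam, p, n_max=100):
            """P(count = y) with N ~ Poisson(lam) and y | N ~ Binomial(N, p)."""
            total = 0.0
            pois = exp(-lam)                  # P(N = 0)
            for N in range(n_max + 1):
                if N >= y:
                    total += pois * comb(N, y) * p**y * (1 - p)**(N - y)
                pois *= lam / (N + 1)         # recurrence: P(N+1) = P(N) * lam / (N+1)
            return total

        print(nmixture_likelihood(y=12, lam=20.0, p=0.6))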

  16. Data reconstruction can improve abundance index estimation: An example using Taiwanese longline data for Pacific bluefin tuna

    PubMed Central

    Fukuda, Hiromu; Maunder, Mark N.

    2017-01-01

    Catch-per-unit-effort (CPUE) is often the main piece of information used in fisheries stock assessment; however, the catch and effort data that are traditionally compiled from commercial logbooks can be incomplete or unreliable due to many reasons. Pacific bluefin tuna (PBF) is a seasonal target species in the Taiwanese longline fishery. Since 2010, detailed catch information for each PBF has been made available through a catch documentation scheme. However, previously, only market landing data with a low coverage of logbooks were available. Therefore, several nontraditional procedures were performed to reconstruct catch and effort data from many alternative data sources not directly obtained from fishers for 2001–2015: (1) Estimating the catch number from the landing weight for 2001–2003, for which the catch number information was incomplete, based on Monte Carlo simulation; (2) deriving fishing days for 2007–2009 from voyage data recorder data, based on a newly developed algorithm; and (3) deriving fishing days for 2001–2006 from vessel trip information, based on linear relationships between fishing and at-sea days. Subsequently, generalized linear mixed models were developed with the delta-lognormal assumption for standardizing the CPUE calculated from the reconstructed data, and three-stage model evaluation was performed using (1) Akaike and Bayesian information criteria to determine the most favorable variable composition of standardization models, (2) overall R2 via cross-validation to compare fitting performance between area-separated and area-combined standardizations, and (3) system-based testing to explore the consistency of the standardized CPUEs with auxiliary data in the PBF stock assessment model. The last stage of evaluation revealed high consistency among the data, thus demonstrating improvements in data reconstruction for estimating the abundance index, and consequently the stock assessment. PMID:28968434

  17. Application of CEDA and ASPIC computer packages to the hairtail (Trichiurus japonicus) fishery in the East China Sea

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Liu, Qun

    2013-01-01

    Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R 2 values than ASPIC.
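
    Both packages fit variants of the non-equilibrium Schaefer surplus-production dynamics, which can be written in a few lines; the parameter values below (including the initial proportion to which CEDA proved sensitive) are illustrative only.

        # Schaefer dynamics: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]; CPUE = q*B[t].
        r, K, q = 0.4, 1.0e6, 2.0e-4   # intrinsic growth, carrying capacity, catchability
        B = 0.8 * K                    # initial biomass = initial proportion * K
        catches = [60_000, 80_000, 90_000, 70_000, 50_000]

        for t, c in enumerate(catches):
            print(f"year {t}: biomass = {B:,.0f}, predicted CPUE = {q * B:.2f}")
            B = B + r * B * (1 - B / K) - c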

  18. Developing and Testing a Model to Predict Outcomes of Organizational Change

    PubMed Central

    Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold

    2003-01-01

    Objective: To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources: Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods: A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection: For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results: Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions: A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
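
    The engine of such a model is simple odds arithmetic: posterior odds of success equal the prior odds multiplied by the expert-elicited likelihood ratio of each factor observed in the project. The factor names and numbers below are invented for illustration, not the panel's elicited values.

        prior_odds = 1.0                        # success and failure equally likely a priori
        likelihood_ratios = {
            "strong leadership support": 2.5,   # presence is evidence for success
            "staff time protected": 1.8,
            "no pilot test performed": 0.4,     # presence is evidence against success
        }

        posterior_odds = prior_odds
        for factor, lr in likelihood_ratios.items():
            posterior_odds *= lr

        p_success = posterior_odds / (1 + posterior_odds)
        print(f"P(success) = {p_success:.2f}")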

  19. Investigation of Models and Estimation Techniques for GPS Attitude Determination

    NASA Technical Reports Server (NTRS)

    Garrick, J.

    1996-01-01

    Much work has been done in the Flight Dynamics Analysis Branch (FDAB) in developing algorithms to address the new and growing field of attitude determination using the Global Positioning System (GPS) constellation of satellites. Flight Dynamics has the responsibility to investigate any new technology and incorporate the innovations in the attitude ground support systems developed to support future missions. The work presented here is an investigative analysis that will produce the needed adaptation to allow the Flight Dynamics Support System (FDSS) to incorporate GPS phase measurements and produce observation measurements compatible with the FDSS. A simulator was developed to produce the necessary measurement data to test the models developed for the different estimation techniques used by FDAB. This paper gives an overview of the current modeling capabilities of the simulator and the algorithms for the adaptation of GPS measurement data, along with results from each of the estimation techniques. Future analysis efforts to evaluate the simulator and models against in-flight GPS measurement data are also outlined.

  20. Mine planning and emission control strategies using geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, F.; Kim, Y.C.

    1983-03-01

    This paper reviews the past four years' research efforts performed jointly by the University of Arizona and the Homer City Owners in which geostatistics were applied to solve various problems associated with coal characterization, mine planning, and development of emission control strategies. Because geostatistics is the only technique which can quantify the degree of confidence associated with a given estimate (or prediction), it played an important role throughout the research efforts. Through geostatistics, it was learned that there is an urgent need for closely spaced sample information, if short-term coal quality predictions are to be made for mine planning purposes.

  1. A climate trend analysis of Kenya-August 2010

    USGS Publications Warehouse

    Funk, Christopher C.

    2010-01-01

    Introduction: This brief report draws from a multi-year effort by the United States Agency for International Development's Famine Early Warning System Network (FEWS NET) to monitor and map rainfall and temperature trends over the last 50 years (1960-2009) in Kenya. Observations from seventy rainfall gauges and seventeen air temperature stations were analyzed for the long rains period, corresponding to March through June (MAMJ). The data were quality controlled, converted into 1960-2009 trend estimates, and interpolated using a rigorous geo-statistical technique (kriging). Kriging produces standard error estimates, and these can be used to assess the relative spatial accuracy of the identified trends. Dividing the trends by the associated errors allows us to identify the relative certainty of our estimates (Funk and others, 2005; Verdin and others, 2005; Brown and Funk, 2008; Funk and Verdin, 2009). Assuming that the same observed trends persist, regardless of whether or not these changes are due to anthropogenic or natural cyclical causes, these results can be extended to 2025, providing critical, and heretofore missing, information about the types and locations of adaptation efforts that may be required to improve food security.

  2. Stress Recovery and Error Estimation for 3-D Shell Structures

    NASA Technical Reports Server (NTRS)

    Riggs, H. R.

    2000-01-01

    The C1-continuous stress fields obtained from finite element analyses are in general lower-order accurate than are the corresponding displacement fields. Much effort has focussed on increasing their accuracy and/or their continuity, both for improved stress prediction and especially error estimation. A previous project developed a penalized, discrete least squares variational procedure that increases the accuracy and continuity of the stress field. The variational problem is solved by a post-processing, 'finite-element-type' analysis to recover a smooth, more accurate, C1-continuous stress field given the 'raw' finite element stresses. This analysis has been named the SEA/PDLS. The recovered stress field can be used in a posteriori error estimators, such as the Zienkiewicz-Zhu error estimator or equilibrium error estimators. The procedure was well-developed for the two-dimensional (plane) case involving low-order finite elements. It has been demonstrated that, if optimal finite element stresses are used for the post-processing, the recovered stress field is globally superconvergent. Extension of this work to three-dimensional solids is straightforward. Attachment: Stress recovery and error estimation for shell structure (abstract only). A 4-node, shear-deformable flat shell element developed via explicit Kirchhoff constraints (abstract only). A novel four-node quadrilateral smoothing element for stress enhancement and error estimation (abstract only).

  3. A critical look at national monitoring programs for birds and other wildlife species

    USGS Publications Warehouse

    Sauer, J.R.; O'Shea, T.J.; Bogon, M.A.

    2003-01-01

    Concerns about declines in numerous taxa have created a great deal of interest in survey development. Because birds have traditionally been monitored by a variety of methods, bird surveys form natural models for development of surveys for other taxa. Here I suggest that most bird surveys are not appropriate models for survey design. Most lack important design components associated with estimation of population parameters at sample sites or with sampling over space, leading to estimates that may be biased. I discuss the limitations of national bird monitoring programs designed to monitor population size. Although these surveys are often analyzed, careful consideration must be given to factors that may bias estimates but that cannot be evaluated within the survey. Bird surveys with appropriate designs have generally been developed as part of management programs that have specific information needs. Experiences gained from bird surveys provide important information for development of surveys for other taxa, and statistical developments in estimation of population sizes from counts provide new approaches to overcoming the limitations evident in many bird surveys. Design of surveys is a collaborative effort, requiring input from biologists, statisticians, and the managers who will use the information from the surveys.

  4. LWST Phase I Project Conceptual Design Study: Evaluation of Design and Construction Approaches for Economical Hybrid Steel/Concrete Wind Turbine Towers; June 28, 2002 -- July 31, 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaNier, M. W.

    The United States Department of Energy (DOE) Wind Energy Research Program has begun a new effort to partner with U.S. industry to develop wind technology that will allow wind systems to compete in regions of low wind speed. The Class 4 and 5 sites targeted by this effort have annual average wind speeds of 5.8 m/s (13 mph), measured at 10 m (33 ft) height. Such sites are abundant in the United States and would increase the land area available for wind energy production twenty-fold. The new program is targeting a levelized cost of energy of 3 cents/kWh at these sites by 2010. A three-element approach has been initiated. These efforts are concept design, component development, and system development. This work builds on previous activities under the WindPACT program and the Next Generation Turbine program. If successful, DOE estimates that this new technology could result in 35 to 45 gigawatts of additional wind capacity being installed by 2020.

  5. Control of corruption, democratic accountability, and effectiveness of HIV/AIDS official development assistance.

    PubMed

    Lee, Hwa-Young; Yang, Bong-Ming; Kang, Minah

    2016-01-01

    Despite continued global efforts, HIV/AIDS outcomes in developing countries have not made much progress. Poor governance in recipient countries is often seen as one of the reasons aid efforts fail to achieve stated objectives and desired outcomes. This study examines the impact of two important dimensions of governance - control of corruption and democratic accountability - on the effectiveness of HIV/AIDS official development assistance. An empirical analysis using dynamic panel Generalized Method of Moments estimation was conducted on 2001-2010 datasets. Control of corruption and democratic accountability each showed an independent effect on the incidence of HIV/AIDS and an interaction with the amount of HIV/AIDS aid, while neither governance variable had a significant effect on HIV/AIDS prevalence. Specifically, in countries with an accountability level below -2.269, aid has a detrimental effect on the incidence of HIV/AIDS. The study findings suggest that aid programs need to be preceded, or at least accompanied, by serious efforts to improve governance in recipient countries and that democratic accountability ought to receive more critical attention.

  6. Adapting to Climate Change in the Great Lakes Region: The Wisconsin Initiative on Climate Change Impacts

    NASA Astrophysics Data System (ADS)

    Vimont, D.; Liebl, D.

    2012-12-01

    The mission of the Wisconsin Initiative on Climate Change Impacts (WICCI; http://www.wicci.wisc.edu) is to assess the impacts of climate change on Wisconsin's natural, human, and built environments; and to assist in developing, recommending, and implementing climate adaptation strategies in Wisconsin. WICCI originated in 2007 as a partnership between the University of Wisconsin Nelson Institute and the Wisconsin Department of Natural Resources, and has since grown to include numerous other state, public, and private institutions. In 2011, WICCI released its First Assessment Report, which documents the efforts of over 200 individuals around the state in assessing vulnerability and estimating the risk that regional climate change poses to Wisconsin. The success of WICCI as an organization can be traced to its existence as a partnership between academic and state institutions, and as a boundary organization that catalyzes cross-disciplinary efforts between science and policy. WICCI's organizational structure and its past success at assessing climate impacts in Wisconsin will be briefly discussed. As WICCI moves into its second phase, it is increasing its emphasis on the second part of its mission: development, and implementation of adaptation strategies. Towards these goals WICCI has expanded its organizational structure to include a Communications and Outreach Committee that further ensures a necessary two-way communication of information between stakeholders / decision makers, and scientific efforts. WICCI is also increasing its focus on place-based efforts that include climate change information as one part of an integrated effort at sustainable development. The talk will include a discussion of current outreach and education efforts, as well as future directions for WICCI efforts.

  7. Validation of abundance estimates from mark–recapture and removal techniques for rainbow trout captured by electrofishing in small streams

    USGS Publications Warehouse

    Rosenberger, Amanda E.; Dunham, Jason B.

    2005-01-01

    Estimation of fish abundance in streams using the removal model or the Lincoln-Petersen mark-recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark-recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark-recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency for different levels of effort. Validation of abundance estimates requires extra attention to routine sampling considerations but can help fisheries biologists avoid pitfalls associated with biased data and facilitate standardized comparisons among studies that employ different sampling methods.
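
    The two estimators being compared, in their simplest closed-population forms (all counts hypothetical):

        # Two-pass removal estimator: N = C1^2 / (C1 - C2). Valid only when
        # capture probability is equal across passes -- the assumption the
        # study found violated for electrofished rainbow trout.
        C1, C2 = 60, 35
        removal_N = C1**2 / (C1 - C2)

        # Lincoln-Petersen mark-recapture, Chapman's bias-corrected form:
        # M marked on occasion 1, C caught on occasion 2, R recaptures.
        M, C, R = 60, 55, 20
        lp_N = (M + 1) * (C + 1) / (R + 1) - 1

        print(f"removal: {removal_N:.0f} fish, mark-recapture: {lp_N:.0f} fish")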

  8. Projection of Maximum Software Maintenance Manning Levels.

    DTIC Science & Technology

    1982-06-01

    maintenance team development and for outyear support resource estimation, and to provide an analysis of applications of the model in areas other...by General Research Corporation of Santa Barbara, Ca., indicated that the Planning and Resource Management Information System (PARRIS) at the Air Force...determined that when the optimal input effort is applied, steps in the development would be achieved at a rate proportional to V(t). Thus the work-rate could

  9. Electromagnetic deep-probing (100-1000 kms) of the Earth's interior from artificial satellites: Constraints on the regional emplacement of crustal resources

    NASA Technical Reports Server (NTRS)

    Hermance, J. F. (Principal Investigator)

    1981-01-01

    Efforts continue in the development of a computer program for examining the coupling of finite-dimensioned source fields with a laterally heterogeneous Earth. An algorithm for calculating a time-varying reference field using ground-based magnetic observatory data is also under development as part of the production of noise-free estimates of global electromagnetic response functions using Magsat data.

  10. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as all other software efforts combined. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.

  11. Diurnal and Reproductive Stage-Dependent Variation of Parental Behaviour in Captive Zebra Finches

    PubMed Central

    Morvai, Boglárka; Nanuru, Sabine; Mul, Douwe; Kusche, Nina; Milne, Gregory; Székely, Tamás; Komdeur, Jan; Miklósi, Ádám

    2016-01-01

    Parental care plays a key role in ontogeny, life-history trade-offs, sexual selection and intra-familial conflict. Studies focusing on understanding causes and consequences of variation in parental effort need to quantify parental behaviour accurately. The applied methods are, however, diverse even for a given species and type of parental effort, and rarely validated for accuracy. Here we focus on variability of parental behaviour from a methodological perspective to investigate the effect of different samplings on various estimates of parental effort. We used nest box cameras in a captive breeding population of zebra finches, Taeniopygia guttata, a widely used model system of sexual selection, intra-familial dynamics and parental care. We investigated diurnal and reproductive stage-dependent variation in parental effort (including incubation, brooding, nest attendance and number of feedings) based on 12h and 3h continuous video-recordings taken at various reproductive stages. We then investigated whether shorter (1h) sampling periods provided comparable estimates of overall parental effort and division of labour to those of longer (3h) sampling periods. Our study confirmed female-biased division of labour during incubation, and showed that the difference between female and male effort diminishes with advancing reproductive stage. We found individually consistent parental behaviours within given days of incubation and nestling provisioning. Furthermore, parental behaviour was consistent over the different stages of incubation, however, only female brooding was consistent over nestling provisioning. Parental effort during incubation did not predict parental effort during nestling provisioning. Our analyses revealed that 1h sampling may be influenced heavily by stochastic and diurnal variation. We suggest using a single longer sampling period (3h) may provide a consistent and accurate estimate for overall parental effort during incubation in zebra finches. Due to the large within-individual variation, we suggest repeated longer sampling over the reproductive stage may be necessary for accurate estimates of parental effort post-hatching. PMID:27973549

  12. Redox flow cell development and demonstration project, calendar year 1977

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Research and development on the redox flow cell conducted from January 1, 1977, to December 31, 1977, are described in this report. The major focus of the effort during 1977 was the key technology issues that directly influence the fundamental feasibility of the overall redox concept. These issues were the development of a suitable ion exchange membrane for the system, the screening and study of candidate redox couples to achieve optimum cell performance, and the carrying out of systems analysis and modeling to develop system performance goals and cost estimates.

  13. Evaluating the impact of lower resolutions of digital elevation model on rainfall-runoff modeling for ungauged catchments.

    PubMed

    Ghumman, Abul Razzaq; Al-Salamah, Ibrahim Saleh; AlSaleem, Saleem Saleh; Haider, Husnain

    2017-02-01

    Geomorphological instantaneous unit hydrograph (GIUH) models usually use geomorphologic parameters of the catchment estimated from a digital elevation model (DEM) for rainfall-runoff modeling of ungauged watersheds with limited data. Higher resolutions (e.g., 5 or 10 m) of DEM play an important role in the accuracy of rainfall-runoff models; however, such resolutions are expensive to obtain and require much greater effort and time for preparation of inputs. In this research, a modeling framework is developed to evaluate the impact of lower resolutions (i.e., 30 and 90 m) of DEM on the accuracy of the Clark GIUH model. Observed rainfall-runoff data of a 202-km² catchment in a semiarid region were used to develop direct runoff hydrographs for nine rainfall events. A geographical information system was used to process both DEMs. Model accuracy and errors were estimated by comparing the model results with the observed data. The study found (i) high model efficiencies, greater than 90%, for both resolutions, and (ii) that the efficiency of the Clark GIUH model does not significantly increase by enhancing the resolution of the DEM from 90 to 30 m. Thus, it is feasible to use lower resolutions (i.e., 90 m) of DEM in the estimation of peak runoff in ungauged catchments with relatively less effort. Through sensitivity analysis (Monte Carlo simulations), the kinematic wave parameter and stream length ratio are found to be the most significant parameters in velocity and peak flow estimations, respectively; thus, they need to be carefully estimated for calculation of direct runoff in ungauged watersheds using the Clark GIUH model.
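
    The "model efficiency" reported above is, for rainfall-runoff models, conventionally the Nash-Sutcliffe efficiency (an assumption here; the paper's exact criterion may differ). A minimal computation with invented flows:

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """NSE = 1 - SSE / variance of observations about their mean."""
            obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
            return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = [5, 12, 30, 55, 40, 22, 10]   # observed direct runoff (m3/s), hypothetical
        sim = [6, 11, 28, 57, 38, 24, 11]   # simulated hydrograph, hypothetical
        print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")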

  14. Ground based mobile isotopic methane measurements in the Front Range, Colorado

    NASA Astrophysics Data System (ADS)

    Vaughn, B. H.; Rella, C.; Petron, G.; Sherwood, O.; Mielke-Maday, I.; Schwietzke, S.

    2014-12-01

    Increased development of unconventional oil and gas resources in North America has given rise to attempts to monitor and quantify fugitive emissions of methane from the industry. Emission estimates of methane from oil and gas basins can vary significantly from one study to another as well as from EPA or State estimates. New efforts are aimed at reconciling bottom-up, or inventory-based, emission estimates of methane with top-down estimates based on atmospheric measurements from aircraft, towers, mobile ground-based vehicles, and atmospheric models. Attributing airborne measurements of regional methane fluxes to specific sources is informed by ground-based measurements of methane. Stable isotopic measurements (δ13C) of methane help distinguish between emissions from the O&G industry, Confined Animal Feed Operations (CAFO), and landfills, but analytical challenges typically limit meaningful isotopic measurements to individual point sampling. We are developing a toolbox to use δ13CH4 measurements to assess the partitioning of methane emissions for regions with multiple methane sources. The method was applied to the Denver-Julesberg Basin. Here we present data from continuous isotopic measurements obtained over a wide geographic area by using MegaCore, a 1500 ft. tube that is constantly filled with sample air while driving, then subsequently analyzed at slower rates using cavity ring down spectroscopy (CRDS). Pressure, flow and calibration are tightly controlled allowing precise attribution of methane enhancements to their point of collection. Comparisons with point measurements are needed to confirm regional values and further constrain flux estimates and models. This effort was made in conjunction with several major field campaigns in the Colorado Front Range in July-August 2014, including FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment), DISCOVER-AQ, and the Air Water Gas NSF Sustainability Research Network at the University of Colorado.

  15. Development and evaluation of an acoustic device to estimate size distribution of channel catfish in commercial ponds

    USDA-ARS?s Scientific Manuscript database

    As one step in the continued effort to apply acoustic methods and techniques for the betterment of catfish aquaculture, an acoustic “catfish sizer” was designed to determine the size distribution of Channel Catfish Ictalurus punctatus in commercial ponds. The catfish sizer employed a custom-built 4...

  16. Modifying Taper-Derived Merchantable Height Estimates to Account for Tree Characteristics

    Treesearch

    James A. Westfall

    2006-01-01

    The U.S. Department of Agriculture Forest Service Northeastern Forest Inventory and Analysis program (NE-FIA) is developing regionwide tree-taper equations. Unlike most previous work on modeling tree form, this effort necessarily includes a wide array of tree species. For some species, branching patterns can produce undesirable tree form that reduces the merchantable...

  17. A tool for rapid post-hurricane urban tree debris estimates using high resolution aerial imagery

    Treesearch

    Zoltan Szantoi; Sparkle L Malone; Francisco Escobedo; Orlando Misas; Scot Smith; Bon Dewitt

    2012-01-01

    Coastal communities in the southeast United States have regularly experienced severe hurricane impacts. To better facilitate recovery efforts in these communities following natural disasters, state and federal agencies must respond quickly with information regarding the extent and severity of hurricane damage and the amount of tree debris volume. A tool was developed...

  18. Software Measurement Guidebook. Version 02.00.02

    DTIC Science & Technology

    1992-12-01

    [Only indexing fragments of this guidebook are available in the record: table-of-contents entries (Compatibility Testing Process, 9-5; Figure 9-3, Development Effort Planning Curve, 9-7; Figure 10-1) and excerpts on collecting and analyzing requirements, design, code, and test data; the Proposal Manager role; build/release ranges, variances, and comparisons; size growth; costs; and test completions.]

  19. Screening organic chemicals in commerce for emissions in the context of environmental and human exposure.

    PubMed

    Breivik, Knut; Arnot, Jon A; Brown, Trevor N; McLachlan, Michael S; Wania, Frank

    2012-08-01

    Quantitative knowledge of organic chemical release into the environment is essential to understand and predict human exposure as well as to develop rational control strategies for any substances of concern. While significant efforts have been invested to characterize and screen organic chemicals for hazardous properties, relatively less effort has been directed toward estimating emissions and hence also risks. Here, a rapid throughput method to estimate emissions of discrete organic chemicals in commerce has been developed, applied and evaluated to support screening studies aimed at ranking and identifying chemicals of potential concern. The method builds upon information in the European Union Technical Guidance Document and utilizes information on quantities in commerce (production and/or import rates), chemical function (use patterns) and physical-chemical properties to estimate emissions to air, soil and water within the OECD for five stages of the chemical life-cycle. The method is applied to 16,029 discrete substances (identified by CAS numbers) from five national and international high production volume lists. As access to consistent input data remains fragmented or even impossible, particular attention is given to estimating, evaluating and discussing uncertainties in the resulting emission scenarios. The uncertainty for individual substances typically spans 3 to 4 orders of magnitude for this initial tier screening method. Information on uncertainties in emissions is useful as any screening or categorization methods which solely rely on threshold values are at risk of leading to a significant number of either false positives or false negatives. A limited evaluation of the screening method's estimates for a sub-set of about 100 substances, compared against independent and more detailed emission scenarios presented in various European Risk Assessment Reports, highlights that up-to-date and accurate information on quantities in commerce as well as a detailed breakdown on chemical function are critically needed for developing more realistic emission scenarios.
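    The structure of such a screening calculation can be sketched compactly. In the Python snippet below, emissions are tonnage multiplied by stage- and compartment-specific emission fractions over five life-cycle stages; all factors and the tonnage are hypothetical placeholders, not values from the Technical Guidance Document or from this study.

      # Screening-style emission estimate: quantity in commerce times
      # stage/compartment emission fractions. All numbers are illustrative.
      production_t_per_y = 5_000.0   # tonnes/year in commerce (hypothetical)

      emission_factors = {           # fraction of throughput emitted
          "production":     {"air": 1e-4, "water": 5e-4, "soil": 0.0},
          "formulation":    {"air": 5e-4, "water": 1e-3, "soil": 0.0},
          "industrial use": {"air": 1e-2, "water": 5e-3, "soil": 0.0},
          "private use":    {"air": 5e-3, "water": 5e-3, "soil": 1e-3},
          "disposal":       {"air": 1e-3, "water": 1e-3, "soil": 5e-3},
      }

      totals = {"air": 0.0, "water": 0.0, "soil": 0.0}
      for stage, factors in emission_factors.items():
          for compartment, fraction in factors.items():
              totals[compartment] += production_t_per_y * fraction

      print(totals)   # tonnes/year emitted to each compartment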

  20. Technology Estimating: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.; Reeves, John D.; Williams-Byrd, Julie A.; Greenberg, Marc; Comstock, Doug; Olds, John R.; Wallace, Jon; DePasquale, Dominic; Schaffer, Mark

    2013-01-01

    NASA is investing in new technologies spanning 14 primary technology roadmap areas as well as aeronautics. Understanding the cost of researching and developing these technologies, and the time it takes to increase their maturity, is important to the support of ongoing and future NASA missions. Overall, technology estimating may help guide technology investment strategies, improve evaluation of technology affordability, and aid decision support. This research summarizes the framework development of a Technology Estimating process in which four technology roadmap areas were selected for study. The framework includes definitions of terms, a discussion of narrowing the focus from 14 NASA Technology Roadmap areas to four, and further refinement to technologies within a TRL range of 2 to 6. Also included is a discussion of the 20 unique technology parameters that were initially identified, evaluated, and subsequently reduced in number for use in characterizing these technologies. The data acquisition effort and the criteria established for data quality are described. Findings include the gaps identified and a description of a spreadsheet-based estimating tool initiated as part of the Technology Estimating process.

  1. Improving waterfowl production estimates: Results of a test in the prairie pothole region

    USGS Publications Warehouse

    Arnold, P.M.; Cowardin, L.M.

    1985-01-01

    The U.S. Fish and Wildlife Service, in an effort to improve and standardize methods for estimating waterfowl production, tested a new technique in the four-county Arrowwood Wetland Management District (WMD) for three years (1982-1984). On 14 randomly selected 10.36-km2 plots, upland and wetland habitat was mapped, classified, and digitized. Waterfowl breeding pairs were counted twice each year and the proportion of wetland basins containing water was determined. Pair numbers and habitat conditions were entered into a computer model developed by Northern Prairie Wildlife Research Center. That model estimates production on small federally owned wildlife tracts, federal wetland easements, and private land. Results indicate that production estimates were most accurate for mallards (Anas platyrhynchos), the species for which the computer model and database were originally designed. Predictions for the pintail (Anas acuta), gadwall (A. strepera), blue-winged teal (A. discors), and northern shoveler (A. clypeata) were believed to be less accurate. Modeling breeding period dynamics of a waterfowl species and making credible production estimates for a geographic area are possible if the data used in the model are adequate. The process of modeling the breeding period of a species aids in locating areas of insufficient biological knowledge. This process will help direct future research efforts and permit more efficient gathering of field data.

  2. Preliminary Assessment of the Flow of Used Electronics, In ...

    EPA Pesticide Factsheets

    Electronic waste (e-waste) is the fastest-growing municipal waste stream in the United States. The improper disposal of e-waste has environmental, economic, and social impacts, so there is a need for sustainable stewardship of electronics. EPA/ORD has been working to improve understanding of the quantity and flow of electronic devices from initial purchase to final disposition. Understanding the pathways of used electronics from the consumer to final disposition would give decision makers insight into their impacts and support efforts to encourage improvements in policy, technology, and beneficial use. This report presents the first stage of EPA/ORD's effort to understand the flows of used electronics and e-waste, reviewing the regulatory programs of selected states and identifying the key lessons learned and best practices that have emerged since their inception. Additionally, a proof-of-concept e-waste flow model has been developed to provide estimates of the quantity of e-waste generated annually at the national level, as well as for selected states. This report documents a preliminary assessment of available data and the development of a model that can be used as a starting point to estimate domestic flows of used electronics from generation, to collection and reuse, to final disposition. The electronics waste flow model can estimate the amount of electronic products entering the EOL management phase based on unit sales data.

  3. Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making

    USGS Publications Warehouse

    Williams, B.K.; Nichols, J.D.; Conroy, M.J.

    2002-01-01

    This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. Key features: (1) integrates population modeling, parameter estimation, and decision-theoretic approaches to management in a single, cohesive framework; (2) provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation, and decision-making; (3) emphasizes the role of mathematical modeling in the conduct of science and management; (4) utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples.

  4. On implementing maximum economic yield in commercial fisheries

    PubMed Central

    Dichmont, C. M.; Pascoe, S.; Kompas, T.; Punt, A. E.; Deng, R.

    2009-01-01

    Economists have long argued that a fishery that maximizes its economic potential usually will also satisfy its conservation objectives. Recently, maximum economic yield (MEY) has been identified as a primary management objective for Australian fisheries and is under consideration elsewhere. However, first attempts at estimating MEY as an actual management target for a real fishery (rather than a conceptual or theoretical exercise) have highlighted some substantial complexities generally unconsidered by fisheries economists. Here, we highlight some of the main issues encountered in our experience and their implications for estimating and transitioning to MEY. Using a bioeconomic model of an Australian fishery for which MEY is the management target, we note that unconstrained optimization may result in effort trajectories that would not be acceptable to industry or managers. Different assumptions regarding appropriate constraints result in different outcomes, each of which may be considered a valid MEY. Similarly, alternative treatments of prices and costs may result in differing estimates of MEY and their associated effort trajectories. To develop an implementable management strategy in an adaptive management framework, a set of assumptions must be agreed among scientists, economists, and industry and managers, indicating that operationalizing MEY is not simply a matter of estimating the numbers but requires strong industry commitment and involvement. PMID:20018676
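    For readers unfamiliar with MEY, the textbook Gordon-Schaefer case (not the bioeconomic model used in this study) shows how an MEY effort level follows from maximizing sustainable profit; all parameter values in the Python sketch below are illustrative.

      # Gordon-Schaefer sketch of MEY: sustainable yield Y(E) = qEK(1 - qE/r)
      # and profit(E) = p*Y(E) - c*E. Parameters are illustrative only.
      r, K, q = 0.4, 100_000.0, 1e-4   # growth rate, carrying capacity, catchability
      p, c = 10.0, 50.0                # price per unit yield, cost per unit effort

      # dProfit/dE = pqK - 2pq^2KE/r - c = 0 gives a closed-form MEY effort.
      E_mey = r * (p * q * K - c) / (2 * p * q**2 * K)
      E_msy = r / (2 * q)              # effort at maximum sustainable yield

      print(f"E_MEY = {E_mey:.0f}, E_MSY = {E_msy:.0f}")   # E_MEY < E_MSY

    The closed form makes the classic conservation argument visible: whenever effort has a positive cost (c > 0), the profit-maximizing effort E_MEY sits below the yield-maximizing effort E_MSY.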

  5. Estimation, analysis, sources, and verification of consumptive water use data in the Great Lakes-St. Lawrence River basin

    USGS Publications Warehouse

    Snavely, D.S.

    1988-01-01

    The Great Lakes-St. Lawrence River basin provides water for many uses and for wildlife habitat; thus many groups have developed strategies to manage the basin's water resource. The International Joint Commission (IJC) is reviewing and comparing available consumptive-use data to assess the magnitude and effect of consumptive uses on lake levels under present and projected economic and hydraulic conditions. As a part of this effort, the U.S. Geological Survey compared its own estimates of consumptive use in the United States with those generated by (1) the International Great Lakes Diversions and (2) the IJC. The U.S. Geological Survey also developed two methods of calculating consumptive-use projections for 1980 through 2000; one method yields an estimate of 6,490 cu ft/s for the year 2000; the other yields an estimate of 8,330 cu ft/s. These two projections could be considered the upper and lower limits for the year 2000. The reasons for the varying estimates are differences in (1) the methods by which base-year values were developed, and (2) the methods or models used to project consumptive-use values for the future. Acquisition of consumptive-use data from water users or governmental agencies or ministries would be desirable to minimize reliance on estimates. (USGS)

  6. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
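    The fitting step behind an estimating relationship of this kind can be sketched with ordinary least squares; the paper's general error regression machinery and actual CubeSat factors are not reproduced here, and the input factor, data, and power-law form in the Python snippet below are hypothetical.

      # Fitting an estimating relationship y = a * x**b on synthetic data;
      # the input factor (mass) and all values are illustrative placeholders.
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)
      mass_kg = rng.uniform(1, 12, 40)                      # hypothetical factor
      consequence = 1.2 * mass_kg**0.6 * rng.lognormal(0, 0.1, 40)

      popt, _ = curve_fit(lambda x, a, b: a * x**b, mass_kg, consequence)
      a, b = popt
      print(f"fitted RER: consequence ~ {a:.2f} * mass**{b:.2f}")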

  7. Soft computing techniques toward modeling the water supplies of Cyprus.

    PubMed

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

    This research effort aims at applying soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machine (ε-RSVM) and fuzzy weighted ε-RSVM models accepting five input parameters were developed. At the same time, reliable artificial neural networks were developed to perform the same job. The 5-fold cross-validation approach was employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
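    A minimal sketch of the ε-SVR with 5-fold cross-validation workflow, using scikit-learn rather than the authors' implementation, is shown below; the synthetic data stand in for the five watershed input parameters.

      # Epsilon-SVR scored by 5-fold cross-validation; data are synthetic
      # stand-ins for the five input parameters described above.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = rng.normal(size=(120, 5))                    # five input parameters
      y = X @ [0.8, -0.3, 0.5, 0.1, 0.4] + 0.1 * rng.normal(size=120)

      model = make_pipeline(StandardScaler(),
                            SVR(kernel="rbf", epsilon=0.05, C=10.0))
      scores = cross_val_score(model, X, y, cv=5, scoring="r2")
      print(f"5-fold R^2: {scores.mean():.3f} +/- {scores.std():.3f}")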

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diwan, R.

    India is generally known as a poor, overpopulated, and underdeveloped country. Its per capita income, even in 1976-77, is less than 100 U.S. dollars. In the 1971 census India's population was estimated at 550 million, or approximately 15% of the world population; the 1976 population is projected to be close to 600 million. Its government has been making major efforts to attack the problem of underdevelopment; in these efforts it is assumed that once the country is developed, the twin problems of poverty and overpopulation will also be solved. To remove underdevelopment, India has instituted the mechanism of five-year plans, which are an attempt at generating a development process. In this paper the energy implications of this development process are analyzed for the decade 1960-1970. Even though changes have taken place in the years 1970-1976, they are not fundamentally or structurally different from the trends established in the ten-year span under study.

  9. A Screening-Level Approach for Comparing Risks Affecting Aquatic Ecosystem Services over Socio-Environmental Gradients

    NASA Astrophysics Data System (ADS)

    Harmon, T. C.; Conde, D.; Villamizar, S. R.; Reid, B.; Escobar, J.; Rusak, J.; Hoyos, N.; Scordo, F.; Perillo, G. M.; Piccolo, M. C.; Zilio, M.; Velez, M.

    2015-12-01

    Assessing risks to aquatic ecosystems services (ES) is challenging and time-consuming, and effective strategies for prioritizing more detailed assessment efforts are needed. We propose a screening-level risk analysis (SRA) approach that scales ES risk using socioeconomic and environmental indices to capture anthropic and climatic pressures, as well as the capacity for institutional responses to those pressures. The method considers ES within a watershed context, and uses expert input to prioritize key services and the associated pressures that threaten them. The SRA approach focuses on estimating ES risk affect factors, which are the sum of the intensity factors for all hazards or pressures affecting the ES. We estimate the pressure intensity factors in a novel manner, basing them on the nation's (i) human development (proxied by Inequality-adjusted Human Development Index, IHDI), (ii) environmental regulatory and monitoring state (Environmental Performance Index, EPI) and (iii) the current level of water stress in the watershed (baseline water stress, BWS). Anthropic intensity factors for future conditions are derived from the baseline values based on the nation's 10-year trend in IHDI and EPI; ES risks in nations with stronger records of change are rewarded more/penalized less in estimates for good/poor future management scenarios. Future climatic intensity factors are tied to water stress estimates based on two general circulation model (GCM) outcomes. We demonstrate the method for an international array of six sites representing a wide range of socio-environmental settings. The outcomes illustrate novel consequences of the scaling scheme. Risk affect factors may be greater in a highly developed region under intense climatic pressure, or in less well-developed regions due to human factors (e.g., poor environmental records). As a screening-level tool, the SRA approach offers considerable promise for ES risk comparisons among watersheds and regions so that detailed assessment, management and mitigation efforts can be effectively prioritized.

  10. Development of the Hospital Ship Replacement (HSR) Concept - Maximizing Capability & Affordability

    DTIC Science & Technology

    2014-08-01

    [Indexing fragments precede the abstract: patient capacity estimates were verified by comparison to seating capacity, which may be restricted by weight.] The Center for Innovation in Ship Design (CISD) requested a design effort to refine and expand upon a previous development of a concept that could serve as a replacement for the existing hospital ships, USNS Mercy (T-AH 19) and USNS Comfort (T-AH 20). These ships are over 35 years old and...

  11. Environmental fate and exposure models: advances and challenges in 21st century chemical risk assessment.

    PubMed

    Di Guardo, Antonio; Gouin, Todd; MacLeod, Matthew; Scheringer, Martin

    2018-01-24

    Environmental fate and exposure models are a powerful means of integrating information on chemicals, their partitioning and degradation behaviour, the environmental scenario, and the emissions in order to compile a picture of chemical distribution and fluxes in the multimedia environment. A pioneering 1995 book, resulting from a series of workshops among model developers and users, reported the main advantages and identified needs for research in the field of multimedia fate models. Considerable efforts were devoted to their improvement in the past 25 years and many aspects were refined: notably the inclusion of nanomaterials among the modelled substances, the development of models at different spatial and temporal scales, the estimation of chemical properties and emission data, the incorporation of additional environmental media and processes, and the integration of sensitivity and uncertainty analysis in the simulations. However, some challenging issues remain and require research efforts and attention: the need for methods to estimate partition coefficients for polar and ionizable chemicals in the environment, a better description of bioavailability in different environments, and the need to inject more ecological realism into exposure predictions to account for the diversity of ecosystem structures and functions in risk assessment. Finally, to transfer new scientific developments into the realm of regulatory risk assessment, we propose the formation of expert groups that compare, discuss, and recommend model modifications and updates and help develop practical tools for risk assessment.

  12. Estimating lifetime and age-conditional probabilities of developing cancer.

    PubMed

    Wun, L M; Merrill, R M; Feuer, E J

    1998-01-01

    Lifetime and age-conditional risk estimates of developing cancer provide a useful summary to the public of the current cancer risk and how this risk compares with earlier periods and among select subgroups of society. These reported estimates, commonly quoted in the popular press, have the potential to promote early detection efforts, to increase cancer awareness, and to serve as an aid in study planning. However, they can also be easily misunderstood and frightening to the general public. The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute and the American Cancer Society have recently begun including in annual reports lifetime and age-conditional risk estimates of developing cancer. These risk estimates are based on incidence rates that reflect new cases of the cancer in a population free of the cancer. To compute these estimates involves a cancer prevalence adjustment that is computed cross-sectionally from current incidence and mortality data derived within a multiple decrement life table. This paper presents a detailed description of the methodology for deriving lifetime and age-conditional risk estimates of developing cancer. In addition, an extension is made which, using a triple decrement life table, adjusts for a surgical procedure that removes individuals from the risk of developing a given cancer. Two important results which provide insights into the basic methodology are included in the discussion. First, the lifetime risk estimate does not depend on the cancer prevalence adjustment, although this is not the case for age-conditional risk estimates. Second, the lifetime risk estimate is always smaller when it is corrected for a surgical procedure that takes people out of the risk pool to develop the cancer. The methodology is applied to corpus and uterus NOS cancers, with a correction made for hysterectomy prevalence. The interpretation and limitations of risk estimates are also discussed.
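    The core of the calculation can be illustrated with a discrete multiple-decrement sketch: the probability of developing cancer is accumulated over ages from the chance of reaching each age alive and cancer-free. The hazard rates in the Python snippet below are illustrative, not SEER data, and the within-year competing-risk treatment is simplified.

      # Discrete lifetime-risk sketch in the spirit of a multiple decrement
      # life table; hazard rates are illustrative placeholders.
      import numpy as np

      ages = np.arange(0, 100)
      cancer_rate = 1e-5 * np.exp(0.08 * ages)        # annual incidence hazard
      other_death_rate = 5e-5 * np.exp(0.085 * ages)  # competing mortality hazard

      alive_cancer_free = 1.0
      lifetime_risk = 0.0
      for inc, mort in zip(cancer_rate, other_death_rate):
          # probability of a first cancer this year, given cancer-free entry
          lifetime_risk += alive_cancer_free * (1 - np.exp(-inc))
          # survive the year free of cancer and of competing death
          alive_cancer_free *= np.exp(-(inc + mort))

      print(f"lifetime risk of developing cancer: {100 * lifetime_risk:.1f}%")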

  13. Real-time monitoring of a microbial electrolysis cell using an electrical equivalent circuit model.

    PubMed

    Hussain, S A; Perrier, M; Tartakovsky, B

    2018-04-01

    Efforts in developing microbial electrolysis cells (MECs) resulted in several novel approaches for wastewater treatment and bioelectrosynthesis. Practical implementation of these approaches necessitates the development of an adequate system for real-time (on-line) monitoring and diagnostics of MEC performance. This study describes a simple MEC equivalent electrical circuit (EEC) model and a parameter estimation procedure, which enable such real-time monitoring. The proposed approach involves MEC voltage and current measurements during its operation with periodic power supply connection/disconnection (on/off operation) followed by parameter estimation using either numerical or analytical solution of the model. The proposed monitoring approach is demonstrated using a membraneless MEC with flow-through porous electrodes. Laboratory tests showed that changes in the influent carbon source concentration and composition significantly affect MEC total internal resistance and capacitance estimated by the model. Fast response of these EEC model parameters to changes in operating conditions enables the development of a model-based approach for real-time monitoring and fault detection.
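    A minimal sketch of this kind of on/off parameter estimation is shown below: assuming the voltage relaxes as a single exponential after the supply is disconnected (a simple parallel-RC idealization, not necessarily the authors' exact EEC), the internal resistance follows from the ohmic voltage step and the capacitance from the fitted time constant. All values are synthetic, not MEC measurements.

      # Estimate R and C from a disconnection transient, assuming
      # V(t) = V_f + dV * exp(-t / (R*C)). Synthetic data throughout.
      import numpy as np
      from scipy.optimize import curve_fit

      R_true, C_true = 25.0, 0.8          # ohm, farad (illustrative)
      i_on = 0.02                         # A, current before disconnection
      t = np.linspace(0, 120, 200)        # s

      v_f = 0.30                          # relaxed voltage after disconnection
      v = v_f + i_on * R_true * np.exp(-t / (R_true * C_true))
      v += np.random.default_rng(3).normal(0, 1e-3, t.size)   # sensor noise

      popt, _ = curve_fit(lambda t, vf, dv, tau: vf + dv * np.exp(-t / tau),
                          t, v, p0=[0.3, 0.5, 10.0])
      vf_hat, dv_hat, tau_hat = popt
      R_hat = dv_hat / i_on               # voltage step at t = 0 over current
      C_hat = tau_hat / R_hat
      print(f"R ~ {R_hat:.1f} ohm, C ~ {C_hat:.2f} F")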

  14. Application of expert systems in project management decision aiding

    NASA Technical Reports Server (NTRS)

    Harris, Regina; Shaffer, Steven; Stokes, James; Goldstein, David

    1987-01-01

    The feasibility of developing an expert systems-based project management decision aid to enhance the performance of NASA project managers was assessed. The research effort included extensive literature reviews in the areas of project management, project management decision aiding, expert systems technology, and human-computer interface engineering. Literature reviews were augmented by focused interviews with NASA managers. Time estimation for project scheduling was identified as the target activity for decision augmentation, and a design was developed for an Integrated NASA System for Intelligent Time Estimation (INSITE). The proposed INSITE design was judged feasible with a low level of risk. A partial proof-of-concept experiment was performed and was successful. Specific conclusions drawn from the research and analyses are included. The INSITE concept is potentially applicable in any management sphere, commercial or government, where time estimation is required for project scheduling. As project scheduling is a nearly universal management activity, the range of possibilities is considerable. The INSITE concept also holds potential for enhancing other management tasks, especially in areas such as cost estimation, where estimation-by-analogy is already a proven method.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, William Scott

    This seminar presentation describes amplitude models and yield estimation work that examines the data in order to inform legislation. The following points were brought forth in the summary: global models that predict three-component amplitudes (R-T-Z) were produced; Q models match regional geology; corrected source spectra can be used for discrimination and yield estimation; three-component data increase coverage and reduce scatter in source spectral estimates; three-component efforts must include distance-dependent effects; and a community effort on instrument calibration is needed.

  16. Site-occupancy distribution modeling to correct population-trend estimates derived from opportunistic observations

    USGS Publications Warehouse

    Kery, M.; Royle, J. Andrew; Schmid, Hans; Schaub, M.; Volet, B.; Hafliger, G.; Zbinden, N.

    2010-01-01

    Species' assessments must frequently be derived from opportunistic observations made by volunteers (i.e., citizen scientists). Interpretation of the resulting data to estimate population trends is plagued with problems, including teasing apart genuine population trends from variations in observation effort. We devised a way to correct for annual variation in effort when estimating trends in occupancy (species distribution) from faunal or floral databases of opportunistic observations. First, for all surveyed sites, detection histories (i.e., strings of detection-nondetection records) are generated. Within-season replicate surveys provide information on the detectability of an occupied site. Detectability directly represents observation effort; hence, estimating detectability means correcting for observation effort. Second, site-occupancy models are applied directly to the detection-history data set (i.e., without aggregation by site and year) to estimate detectability and species distribution (occupancy, i.e., the true proportion of sites where a species occurs). Site-occupancy models also provide unbiased estimators of components of distributional change (i.e., colonization and extinction rates). We illustrate our method with data from a large citizen-science project in Switzerland in which field ornithologists record opportunistic observations. We analyzed data collected on four species: the widespread Kingfisher (Alcedo atthis) and Sparrowhawk (Accipiter nisus) and the scarce Rock Thrush (Monticola saxatilis) and Wallcreeper (Tichodroma muraria). Our method requires that all observed species are recorded. Detectability was <1 and varied over the years. Simulations suggested some robustness, but we advocate recording complete species lists (checklists), rather than recording individual records of single species. The representation of observation effort with its effect on detectability provides a solution to the problem of differences in effort encountered when extracting trend information from haphazard observations. We expect our method is widely applicable for global biodiversity monitoring and modeling of species distributions. © 2010 Society for Conservation Biology.
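    The core likelihood idea can be sketched for the simplest single-season case with constant occupancy and detection probabilities; the model described above adds multi-year structure (colonization and extinction) not shown here, and the data in the Python snippet below are simulated.

      # Single-season occupancy model: psi (occupancy) and p (per-visit
      # detection) fitted by maximum likelihood to simulated detection
      # histories. Binomial coefficients are omitted (constant in the MLE).
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)
      n_sites, n_visits = 200, 4
      psi_true, p_true = 0.6, 0.35
      z = rng.random(n_sites) < psi_true                     # true occupancy
      y = (rng.random((n_sites, n_visits)) < p_true) & z[:, None]
      detections = y.sum(axis=1)

      def neg_log_lik(theta):
          psi, p = 1 / (1 + np.exp(-theta))                  # logit -> (0, 1)
          lik_occ = psi * p**detections * (1 - p)**(n_visits - detections)
          lik_unocc = (1 - psi) * (detections == 0)          # all-zero histories
          return -np.sum(np.log(lik_occ + lik_unocc))

      res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
      psi_hat, p_hat = 1 / (1 + np.exp(-res.x))
      print(f"psi ~ {psi_hat:.2f}, p ~ {p_hat:.2f}")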

  17. Development and experimental validation of a frying model to estimate acrylamide levels in French fries.

    PubMed

    Palazoğlu, T K; Gökmen, V

    2008-04-01

    In this study, a numerical model was developed to simulate frying of potato strips and estimate acrylamide levels in French fries. Heat and mass transfer parameters determined during frying of potato strips and the formation and degradation kinetic parameters of acrylamide obtained with a sugar-asparagine model system were incorporated within the model. The effect of reducing sugar content (0.3 to 2.15 g/100 g dry matter), strip thickness (8.5 x 8.5 mm and 10 x 10 mm), and frying time (3, 4, 5, and 6 min) and temperature (150, 170, and 190 degrees C) on resultant acrylamide level in French fries was investigated both numerically and experimentally. The model appeared to closely estimate the acrylamide contents, and thereby may potentially save considerable time, money, and effort during the stages of process design and optimization.
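    Kinetic sub-models of this general kind can be sketched as coupled first-order formation and degradation with Arrhenius temperature dependence; the rate constants and activation energies in the Python snippet below are illustrative placeholders, not the parameters fitted in this study.

      # Generic first-order formation/degradation kinetics of the kind folded
      # into such a frying model: dA/dt = kf(T)*S - kd(T)*A, Arrhenius rates.
      import numpy as np

      R = 8.314                                   # J/(mol K)
      def arrhenius(k_ref, Ea, T, T_ref=433.15):  # T_ref = 160 C
          return k_ref * np.exp(-Ea / R * (1 / T - 1 / T_ref))

      T = 443.15                                  # frying at 170 C
      kf = arrhenius(2e-3, 120e3, T)              # formation rate, 1/s (illustrative)
      kd = arrhenius(1e-3, 90e3, T)               # degradation rate, 1/s (illustrative)

      S, A, dt = 1.0, 0.0, 0.1                    # precursor (a.u.), acrylamide, step (s)
      for _ in range(int(300 / dt)):              # 5 min of frying (explicit Euler)
          dA = (kf * S - kd * A) * dt
          S += -(kf * S) * dt
          A += dA
      print(f"relative acrylamide level after 5 min: {A:.3f}")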

  18. NETL CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanguinito, Sean M.; Goodman, Angela; Levine, Jonathan

    This user's manual guides the use of the National Energy Technology Laboratory's (NETL) CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) tool, which was developed to aid users screening saline formations for prospective CO2 storage resources. CO2-SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO2 storage resources for saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO2 storage resources that allows for consistent comparison of results between different research efforts, such as the Regional Carbon Sequestration Partnerships (RCSP). CO2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO2 storage resources via Monte Carlo simulation.
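    The volumetric calculation that such a tool iterates over Monte Carlo draws can be sketched with the DOE-style equation G = A·h·φ·ρ·E; the formation properties and efficiency range in the Python snippet below are illustrative, not CO2-SCREEN defaults or values for any particular formation.

      # Monte Carlo over the DOE-style volumetric estimate for a saline
      # formation: G_CO2 = A * h * phi * rho * E. Inputs are illustrative.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      A = 1.0e9                            # formation area, m^2 (fixed here)
      h = rng.uniform(50, 150, n)          # net thickness, m
      phi = rng.uniform(0.08, 0.18, n)     # porosity
      rho = 650.0                          # CO2 density at depth, kg/m^3
      E = rng.uniform(0.01, 0.06, n)       # storage efficiency factor

      G_Mt = A * h * phi * rho * E / 1e9   # kg -> Mt
      print(f"P10/P50/P90: {np.percentile(G_Mt, [10, 50, 90]).round(0)} Mt CO2")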

  19. Investigating the error sources of the online state of charge estimation methods for lithium-ion batteries in electric vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, Yuejiu; Ouyang, Minggao; Han, Xuebing; Lu, Languang; Li, Jianqiu

    2018-02-01

    State of charge (SOC) estimation is generally acknowledged as one of the most important functions in the battery management system for lithium-ion batteries in new energy vehicles. Though every effort is made to have the various online SOC estimation methods reliably increase estimation accuracy as much as possible within limited on-chip resources, little of the literature discusses the error sources of those SOC estimation methods. This paper first reviews the commonly studied SOC estimation methods using a conventional classification. A novel perspective focusing on the error analysis of the SOC estimation methods is proposed. SOC estimation methods are analyzed from the viewpoints of the measured values, models, algorithms, and state parameters. Subsequently, error flow charts are proposed to trace the error sources from signal measurement to the models and algorithms for the online SOC estimation methods widely used in new energy vehicles. Finally, with consideration of working conditions, the choice of more reliable and applicable SOC estimation methods is discussed, and future development of promising online SOC estimation methods is suggested.
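    One of the error sources discussed above can be made concrete in a few lines: under ampere-hour (coulomb) counting, a constant current-sensor bias integrates into a linearly growing SOC error. The cell capacity, load profile, and bias in the Python snippet below are illustrative.

      # Coulomb counting: SOC(t) = SOC0 - (1/Q) * integral(I dt), so a
      # constant measurement bias accumulates linearly in the SOC estimate.
      import numpy as np

      Q_ah = 50.0                      # cell capacity, Ah (illustrative)
      dt_h = 1.0 / 3600.0              # 1 s step, in hours
      t = np.arange(0, 3600)           # one hour of operation

      i_true = 20.0 + 5.0 * np.sin(2 * np.pi * t / 600)   # A, illustrative load
      i_meas = i_true + 0.2            # constant +0.2 A sensor bias

      soc_true = 0.9 - np.cumsum(i_true) * dt_h / Q_ah
      soc_est = 0.9 - np.cumsum(i_meas) * dt_h / Q_ah
      print(f"SOC error after 1 h: {abs(soc_est[-1] - soc_true[-1]) * 100:.2f}%")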

  20. Assessment of Flood Disaster Impacts in Cambodia: Implications for Rapid Disaster Response

    NASA Astrophysics Data System (ADS)

    Ahamed, Aakash; Bolten, John; Doyle, Colin

    2016-04-01

    Disaster monitoring systems can provide near real time estimates of population and infrastructure affected by sudden onset natural hazards. This information is useful to decision makers allocating lifesaving resources following disaster events. Floods are the world's most common and devastating disasters (UN, 2004; Doocy et al., 2013), and are particularly frequent and severe in the developing countries of Southeast Asia (Long and Trong, 2001; Jonkman, 2005; Kahn, 2005; Stromberg, 2007; Kirsch et al., 2012). Climate change, a strong regional monsoon, and widespread hydropower construction contribute to a complex and unpredictable regional hydrodynamic regime. As such, there is a critical need for novel techniques to assess flood impacts to population and infrastructure with haste during and following flood events in order to enable governments and agencies to optimize response efforts following disasters. Here, we build on methods to determine regional flood extent in near real time and develop systems that automatically quantify the socioeconomic impacts of flooding in Cambodia. Software developed on cloud based, distributed processing Geographic Information Systems (GIS) is used to demonstrate spatial and numerical estimates of population, households, roadways, schools, hospitals, airports, agriculture and fish catch affected by severe monsoon flooding occurring in the Cambodian portion of Lower Mekong River Basin in 2011. Results show modest agreement with government and agency estimates. Maps and statistics generated from the system are intended to complement on the ground efforts and bridge information gaps to decision makers. The system is open source, flexible, and can be applied to other disasters (e.g. earthquakes, droughts, landslides) in various geographic regions.

  1. Combining inferences from models of capture efficiency, detectability, and suitable habitat to classify landscapes for conservation of threatened bull trout

    Treesearch

    James T. Peterson; Jason Dunham

    2003-01-01

    Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult- to-sample species, and models of species...

  2. Psychosocial Factors Mediating the Effect of the CHoBI7 Intervention on Handwashing with Soap: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    George, Christine Marie; Biswas, Shwapon; Jung, Danielle; Perin, Jamie; Parvin, Tahmina; Monira, Shirajum; Saif-Ur-Rahman, K. M.; Rashid, Mahamud-ur; Bhuyian, Sazzadul Islam; Thomas, Elizabeth D.; Dreibelbis, Robert; Begum, Farzana; Zohura, Fatema; Zhang, Xiaotong; Sack, David A.; Alam, Munirul; Sack, R. Bradley; Leontsini, Elli; Winch, Peter J.

    2017-01-01

    Inadequate hand hygiene is estimated to result in nearly 300,000 deaths annually, with the majority of deaths being among children younger than 5 years. In an effort to promote handwashing with soap and water treatment behaviors among highly susceptible household members of cholera patients, we recently developed the Cholera-Hospital-Based…

  3. A Composite Self-Report: Reasons for Taking Science Courses as Given by Cocoa High School Science Students.

    ERIC Educational Resources Information Center

    Louwerse, Frances H.

    A self-report instrument (questionnaire/reaction scale) was developed and administered to students in grades 9-12 to: (1) determine the number of science courses taken by each grade level; (2) estimate the number of science courses requested for future years and indicate where recruitment efforts would be needed; (3) examine other-directed reasons…

  4. Estimating small mammal abundance on fuels treatment units in southwestern ponderosa pine forests

    Treesearch

    Sarah J. Converse; Brett G. Dickson; Gary C. White; William M. Block

    2004-01-01

    In many North American forests, post-European settlement fire suppression efforts have resulted in the excessive accumulation of forest fuels and changes to the historic fire regime, thereby increasing the risk of catastrophic wildfires (Cooper 1960; Dodge 1972; Covington and Moore 1994). To reduce this risk, it is necessary to develop treatments that will remove...

  5. Estimating abundance of mountain lions from unstructured spatial sampling

    USGS Publications Warehouse

    Russell, Robin E.; Royle, J. Andrew; Desimone, Richard; Schwartz, Michael K.; Edwards, Victoria L.; Pilgrim, Kristy P.; Mckelvey, Kevin S.

    2012-01-01

    Mountain lions (Puma concolor) are often difficult to monitor because of their low capture probabilities, extensive movements, and large territories. Methods for estimating the abundance of this species are needed to assess population status, determine harvest levels, evaluate the impacts of management actions on populations, and derive conservation and management strategies. Traditional mark-recapture methods do not explicitly account for differences in individual capture probabilities due to the spatial distribution of individuals in relation to survey effort (or trap locations). However, recent advances in the analysis of capture-recapture data have produced methods for estimating abundance and density of animals from spatially explicit capture-recapture data that account for heterogeneity in capture probabilities due to the spatial organization of individuals and traps. We adapt recently developed spatial capture-recapture models to estimate density and abundance of mountain lions in western Montana. Volunteers and state agency personnel collected mountain lion DNA samples in portions of the Blackfoot drainage (7,908 km2) in west-central Montana using two methods: snow back-tracking mountain lion tracks to collect hair samples, and biopsy darting treed mountain lions to obtain tissue samples. Overall, we recorded 72 individual capture events, including captures both with and without tissue sample collection and hair samples, resulting in the identification of 50 individual mountain lions (30 females, 19 males, and 1 individual of unknown sex). We estimated lion densities from 8 models containing effects of distance, sex, and survey effort on detection probability. Our population density estimates ranged from a minimum of 3.7 mountain lions/100 km2 (95% CI 2.3–5.7) under the distance-only model (including only an effect of distance on detection probability) to 6.7 (95% CI 3.1–11.0) under the full model (including effects of distance, sex, survey effort, and distance × sex on detection probability). These numbers translate to a total estimate of 293 mountain lions (95% CI 182–451) to 529 (95% CI 245–870) within the Blackfoot drainage. Results from the distance model are similar to previous estimates of 3.6 mountain lions/100 km2 for the study area; however, results from all other models indicated greater numbers of mountain lions. Our results indicate that unstructured spatial sampling combined with spatial capture-recapture analysis can be an effective method for estimating large carnivore densities.
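    The spatially explicit ingredient of such models is a detection function that decays with the distance between an animal's activity center and a detector, commonly the half-normal form p(d) = g0·exp(-d²/(2σ²)); the parameter values in the Python sketch below are illustrative, not the Blackfoot estimates.

      # Half-normal detection function at the heart of spatial
      # capture-recapture; g0 and sigma are illustrative placeholders.
      import numpy as np

      g0, sigma_km = 0.05, 3.0         # baseline detection and spatial scale

      def p_detect(d_km):
          """Per-occasion capture probability at distance d from the
          activity center."""
          return g0 * np.exp(-d_km**2 / (2 * sigma_km**2))

      for d in (0.0, 3.0, 6.0, 12.0):
          print(f"d = {d:4.1f} km -> p = {p_detect(d):.4f}")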

  6. Designing long-term fish community assessments in connecting channels: Lessons from the Saint Marys River

    USGS Publications Warehouse

    Schaeffer, Jeff; Rogers, Mark W.; Fielder, David G.; Godby, Neal; Bowen, Anjanette K.; O'Connor, Lisa; Parrish, Josh; Greenwood, Susan; Chong, Stephen; Wright, Greg

    2014-01-01

    Long-term surveys are useful in understanding trends in connecting channel fish communities; a gill net assessment in the Saint Marys River performed periodically since 1975 is the most comprehensive connecting channels sampling program within the Laurentian Great Lakes. We assessed efficiency of that survey, with intent to inform development of assessments at other connecting channels. We evaluated trends in community composition, effort versus estimates of species richness, ability to detect abundance changes for four species, and effects of subsampling yellow perch catches on size and age-structure metrics. Efficiency analysis revealed low power to detect changes in species abundance, whereas reduced effort could be considered to index species richness. Subsampling simulations indicated that subsampling would have allowed reliable estimates of yellow perch (Perca flavescens) population structure, while greatly reducing the number of fish that were assigned ages. Analyses of statistical power and efficiency of current sampling protocols are useful for managers collecting and using these types of data as well as for the development of new monitoring programs. Our approach provides insight into whether survey goals and objectives were being attained and can help evaluate ability of surveys to answer novel questions that arise as management strategies are refined.

  7. Space Transportation Booster Engine Configuration Study. Volume 3: Program Cost estimates and work breakdown structure and WBS dictionary

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The objective of the Space Transportation Booster Engine Configuration Study is to contribute to the ALS development effort by providing highly reliable, low cost booster engine concepts for both expendable and reusable rocket engines. The objectives of the Space Transportation Booster Engine (STBE) Configuration Study were: (1) to identify engine development configurations which enhance vehicle performance and provide operational flexibility at low cost; and (2) to explore innovative approaches to the follow-on Full-Scale Development (FSD) phase for the STBE.

  8. Putting the pediatrics milestones into practice: a consensus roadmap and resource analysis.

    PubMed

    Schumacher, Daniel J; Spector, Nancy D; Calaman, Sharon; West, Daniel C; Cruz, Mario; Frohna, John G; Gonzalez Del Rey, Javier; Gustafson, Kristina K; Poynter, Sue Ellen; Rosenbluth, Glenn; Southgate, W Michael; Vinci, Robert J; Sectish, Theodore C

    2014-05-01

    The Accreditation Council for Graduate Medical Education has partnered with member boards of the American Board of Medical Specialties to initiate the next steps in advancing competency-based assessment in residency programs. This initiative, known as the Milestone Project, is a paradigm shift from traditional assessment efforts and requires all pediatrics residency programs to report individual resident progression along a series of 4 to 5 developmental levels of performance, or milestones, for individual competencies every 6 months beginning in June 2014. The effort required to successfully make this shift is tremendous given the number of training programs, training institutions, and trainees. However, it holds great promise for achieving training outcomes that align with patient needs; developing a valid, reliable, and meaningful way to track residents' development; and providing trainees with a roadmap for learning. Recognizing the resources needed to implement this new system, the authors, all residency program leaders, provide their consensus view of the components necessary for implementing and sustaining this effort, including resource estimates for completing this work. The authors have identified 4 domains: (1) Program Review and Development of Stakeholders and Participants, (2) Assessment Methods and Validation, (3) Data and Assessment System Development, and (4) Summative Assessment and Feedback. This work can serve as a starting point and framework for collaboration with program, department, and institutional leaders to identify and garner necessary resources and plan for local and national efforts that will ensure successful transition to milestones-based assessment. Copyright © 2014 by the American Academy of Pediatrics.

  9. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase will drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers to reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for the quick and consistent assessment of system design impacts and relative cost impacts due to requirement changes. It differs from most CEM efforts attempted in the past in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of focal plane array, optics, processing, thermal, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.

  10. Towards estimation of respiratory muscle effort with respiratory inductance plethysmography signals and complementary ensemble empirical mode decomposition.

    PubMed

    Chen, Ya-Chen; Hsiao, Tzu-Chien

    2018-07-01

    The respiratory inductance plethysmography (RIP) sensor is an inexpensive, non-invasive, easy-to-use transducer for collecting respiratory movement data. Studies have reported that the RIP signal's amplitude and frequency can be used to discriminate respiratory diseases. However, with the conventional approach to RIP data analysis, respiratory muscle effort cannot be estimated. In this paper, the estimation of respiratory muscle effort from RIP signals is proposed. A complementary ensemble empirical mode decomposition method was used to extract hidden signals from the RIP signals based on the frequency bands of the activities of different respiratory muscles. To validate the proposed method, an experiment collecting subjects' RIP signals under thoracic breathing (TB) and abdominal breathing (AB) was conducted. The experimental results for both TB and AB indicate that the proposed method can be used to loosely estimate the activities of the thoracic muscles, abdominal muscles, and diaphragm.

  11. Wildlife Loss Estimates and Summary of Previous Mitigation Related to Hydroelectric Projects in Montana, Volume Three, Hungry Horse Project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casey, Daniel

    1984-10-01

    This assessment addresses the impacts to wildlife populations and wildlife habitats of the Hungry Horse Dam project on the South Fork of the Flathead River, and previous mitigation of these losses. In order to develop and focus mitigation efforts, it was first necessary to estimate the wildlife and wildlife habitat losses attributable to the construction and operation of the project. The purpose of this report was to document the best available information concerning the degree of impacts to target wildlife species. Indirect benefits to wildlife species not listed will be identified during the development of alternative mitigation measures. Wildlife species incurring positive impacts attributable to the project were identified.

  12. Examining hydrogen transitions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plotkin, S. E.; Energy Systems

    2007-03-01

    This report describes the results of an effort to identify key analytic issues associated with modeling a transition to hydrogen as a fuel for light-duty vehicles, and to use insights gained from this effort to suggest ways to improve ongoing modeling efforts. The study reported on here examined multiple hydrogen scenarios reported in the literature, identified modeling issues associated with those scenario analyses, and examined three DOE-sponsored hydrogen transition models in the context of those modeling issues. The three hydrogen transition models are HyTrans (contractor: Oak Ridge National Laboratory), MARKAL/DOE* (Brookhaven National Laboratory), and NEMS-H2 (OnLocation, Inc). The goals of these models are (1) to help DOE improve its R&D effort by identifying key technology and other roadblocks to a transition and testing its technical program goals to determine whether they are likely to lead to the market success of hydrogen technologies, (2) to evaluate alternative policies to promote a transition, and (3) to estimate the costs and benefits of alternative pathways to hydrogen development.

  13. Characterizing sub-arctic peatland vegetation using height estimates from structure from motion and an unmanned aerial system (UAS)

    NASA Astrophysics Data System (ADS)

    Palace, M. W.; DelGreco, J.; Herrick, C.; Sullivan, F.; Varner, R. K.

    2017-12-01

    The collapse of permafrost due to thawing changes landscape topography, hydrology, and vegetation. Changes in plant species composition influence methane production pathways and methane emission rates, and the complex spatial heterogeneity of vegetation composition across peatlands proves important in quantifying methane emissions. Efforts to characterize vegetation across these permafrost peatlands have met with varied success, with difficulty seen in estimating some cover types at opposite ends of the permafrost collapse transition, i.e., palsa/tall shrub and tall graminoid. This is because some of the species are the same (horsetail) and some have similar structure (horsetail/Carex spp.). High-resolution digital elevation maps developed with airborne LIght Detection And Ranging (lidar) have provided insight into some wetland attributes, but lidar collection is costly and requires extensive data processing effort, and lidar also lacks the spectral information that optical sensors provide. We used an inexpensive Unmanned Aerial Vehicle (UAV) with an optical sensor to image a mire in northern Sweden (Stordalen Mire) in 2015. We collected 700 overlapping images that were stitched together using Structure from Motion (SfM). SfM analysis also provided, through parallax, the ability to develop a height map of vegetation. This height map was used, along with textural analysis, to develop an artificial neural network to predict five vegetation cover types. Using 200 training points, we found improvements in our prediction of these cover types. We suggest that the digital height model from SfM provides useful information for remotely sensing vegetation across a collapsing permafrost region and the resulting changes in vegetation composition. The ability to rapidly and inexpensively deploy such a UAV system provides the opportunity to examine multiple sites in remote areas with limited personnel effort.

  14. Estimating parasitic sea lamprey abundance in Lake Huron from heterogeneous data sources

    USGS Publications Warehouse

    Young, Robert J.; Jones, Michael L.; Bence, James R.; McDonald, Rodney B.; Mullett, Katherine M.; Bergstedt, Roger A.

    2003-01-01

    The Great Lakes Fishery Commission uses time series of transformer, parasitic, and spawning population estimates to evaluate the effectiveness of its sea lamprey (Petromyzon marinus) control program. This study used an inverse variance weighting method to integrate Lake Huron sea lamprey population estimates derived from two estimation procedures: (1) prediction of the lake-wide spawning population from a regression model based on stream size, and (2) whole-lake mark-recapture estimates. In addition, we used a re-sampling procedure to evaluate the effect of trading off sampling effort between the regression and mark-recapture models. Population estimates derived from the regression model ranged from 132,000 to 377,000, while mark-recapture estimates of marked recently metamorphosed juveniles and parasitic sea lampreys ranged from 536,000 to 634,000 and 484,000 to 1,608,000, respectively. The precision of the estimates varied greatly among estimation procedures and years. The integrated estimate of the mark-recapture and spawner regression procedures ranged from 252,000 to 702,000 transformers. The re-sampling procedure indicated that the regression model is more sensitive to reductions in sampling effort than the mark-recapture model. Reliance on either the regression or mark-recapture model alone could produce misleading estimates of sea lamprey abundance and of the effect of the control program on that abundance. These analyses indicate that the precision of the lake-wide population estimate can be maximized by re-allocating sampling effort from marking sea lampreys to trapping additional streams.
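    The inverse variance weighting rule named above combines estimates by weighting each with the reciprocal of its variance. The Python sketch below uses illustrative values chosen within the ranges reported here; the standard errors are assumed, not taken from the study.

      # Inverse variance weighting of two abundance estimates; the combined
      # variance is 1 / sum(1/var_i). Inputs are illustrative placeholders.
      import numpy as np

      estimates = np.array([300_000.0, 550_000.0])   # regression, mark-recapture
      ses = np.array([60_000.0, 150_000.0])          # assumed standard errors

      w = 1.0 / ses**2
      combined = np.sum(w * estimates) / np.sum(w)
      combined_se = np.sqrt(1.0 / np.sum(w))
      print(f"combined: {combined:,.0f} +/- {combined_se:,.0f}")

    Note how the combined value sits closer to the more precise input: the weights make the better-constrained estimate dominate, which is exactly the rationale for integrating the two procedures.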

  15. Control of corruption, democratic accountability, and effectiveness of HIV/AIDS official development assistance

    PubMed Central

    Lee, Hwa-Young; Yang, Bong-Ming; Kang, Minah

    2016-01-01

    Background: Despite continued global efforts, HIV/AIDS outcomes in developing countries have not made much progress. Poor governance in recipient countries is often seen as one of the reasons for the ineffectiveness of aid efforts to achieve stated objectives and desired outcomes. Objective: This study examines the impact of two important dimensions of governance – control of corruption and democratic accountability – on the effectiveness of HIV/AIDS official development assistance. Design: An empirical analysis using dynamic panel Generalized Method of Moments estimation was conducted on 2001–2010 datasets. Results: Control of corruption and democratic accountability each revealed an independent effect and an interaction with the amount of HIV/AIDS aid on the incidence of HIV/AIDS, while neither governance variable had a significant effect on HIV/AIDS prevalence. Specifically, in countries with an accountability level below −2.269, aid has a detrimental effect on the incidence of HIV/AIDS. Conclusion: The study findings suggest that aid programs need to be preceded, or at least accompanied, by serious efforts to improve governance in recipient countries and that democratic accountability ought to receive more critical attention. PMID:27189199

  16. Progress and limitations on quantifying nutrient and carbon loading to coastal waters

    NASA Astrophysics Data System (ADS)

    Stets, E.; Oelsner, G. P.; Stackpoole, S. M.

    2017-12-01

    Riverine exports of nutrients and carbon to estuarine and coastal waters are important determinants of coastal ecosystem health and provide necessary insight into global biogeochemical cycles. Quantification of coastal solute loads typically relies upon modeling based on observations of concentration and discharge from selected rivers draining to the coast. Most large-scale river export models require unidirectional flow and thus are referenced to monitoring locations at the head of tide, which can be located far inland. As a result, the contributions of the coastal plain, tidal wetlands, and concentrated coastal development are often poorly represented in regional and continental-scale estimates of solute delivery to coastal waters. However, site-specific studies have found that these areas are disproportionately active in terms of nutrient and carbon export. Modeling efforts to upscale fluxes from these areas, while not common, also suggest an outsized importance to coastal flux estimates. This presentation will focus on illustrating how the under-representation of near-shore environments impacts large-scale coastal flux estimates in the context of recent regional and continental-scale assessments. Alternate approaches to capturing the influence of near-coastal terrestrial inputs, including recent data aggregation efforts and modeling approaches, will be discussed.

  17. Space transfer vehicle concepts and requirements. Volume 3: Program cost estimates

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study has been an eighteen-month study effort to develop and analyze concepts for a family of vehicles to evolve from an initial STV system into a Lunar Transportation System (LTS) for use with the Heavy Lift Launch Vehicle (HLLV). The study defined vehicle configurations, facility concepts, and ground and flight operations concepts. This volume reports the program cost estimate results for this portion of the study. The STV Reference Concept described within this document provides a complete LTS system that performs both cargo and piloted lunar missions.

  18. Modeling update for the Thirty Meter Telescope laser guide star dual-conjugate adaptive optics system

    NASA Astrophysics Data System (ADS)

    Gilles, Luc; Wang, Lianqi; Ellerbroek, Brent

    2010-07-01

    This paper describes the modeling efforts undertaken in the past couple of years to derive wavefront error (WFE) performance estimates for the Narrow Field Infrared Adaptive Optics System (NFIRAOS), which is the facility laser guide star (LGS) dual-conjugate adaptive optics (AO) system for the Thirty Meter Telescope (TMT). The estimates describe the expected performance of NFIRAOS as a function of seeing on Mauna Kea, zenith angle, and galactic latitude (GL). They have been developed through a combination of integrated AO simulations, side analyses, allocations, lab and lidar experiments.

  19. Estimation of Plutonium-240 Mass in Waste Tanks Using Ultra-Sensitive Detection of Radioactive Xenon Isotopes from Spontaneous Fission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowyer, Theodore W.; Gesh, Christopher J.; Haas, Daniel A.

    This report details efforts to develop a technique that can detect and quantify the mass of 240Pu in waste storage tanks and other enclosed spaces. If the isotopic ratios of the plutonium contained in the enclosed space are also known, then this technique is capable of estimating the total mass of the plutonium without physical sample retrieval and radiochemical analysis of hazardous material. Results utilizing this technique are reported for a Hanford Site waste tank (TX-118) and a well-characterized plutonium sample in a laboratory environment.

  20. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    USGS Publications Warehouse

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data could result in better informed management decisions and assist in guidance for more effective estuarine restoration projects.
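
    A minimal simulation in the spirit of the study (survival pattern, sample sizes, and design choices here are hypothetical, and detection is assumed perfect) shows why spreading tagging effort across the season tracks mean survival better than a single mid-season pulse:

        # Sketch: tagging-design comparison under seasonally varying survival.
        import numpy as np

        rng = np.random.default_rng(1)
        days = 100
        # Linearly increasing survival plus daily random fluctuation
        # (one of the four patterns described above; values hypothetical).
        surv = np.clip(np.linspace(0.4, 0.8, days) + rng.normal(0, 0.05, days), 0, 1)

        def estimate(tag_days, fish_per_day):
            tagged = survived = 0
            for d in tag_days:
                tagged += fish_per_day
                survived += rng.binomial(fish_per_day, surv[d])
            return survived / tagged

        spread = np.arange(0, days, 5)    # effort spread across the season
        pulse = np.arange(45, 65)         # effort concentrated mid-season
        print("true mean survival:", surv.mean().round(3))
        print("spread design:", round(estimate(spread, 25), 3))
        print("pulse design:", round(estimate(pulse, 25), 3))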

  1. Methods of extending crop signatures from one area to another

    NASA Technical Reports Server (NTRS)

    Minter, T. C. (Principal Investigator)

    1979-01-01

    Efforts to develop a technology for signature extension during LACIE phases 1 and 2 are described. A number of haze and Sun angle correction procedures were developed and tested. These included the ROOSTER and OSCAR cluster-matching algorithms and their modifications, the MLEST and UHMLE maximum likelihood estimation procedures, and the ATCOR procedure. All these algorithms were tested on simulated data and consecutive-day LANDSAT imagery. The ATCOR, OSCAR, and MLEST algorithms were also tested for their capability to geographically extend signatures using LANDSAT imagery.

  2. Meteorological limits on the growth and development of screwworm populations

    NASA Technical Reports Server (NTRS)

    Phinney, D. E.; Arp, G. K.

    1978-01-01

    A program to evaluate the use of remotely sensed data as an additional tool in existing and projected efforts to eradicate the screwworm began in 1973. Estimating weather conditions by use of remotely sensed data was part of the study. Next, the effect of weather on screwworm populations was modeled. A significant portion of the variation in screwworm population growth and development has been traced to weather-related parameters. This report deals with the salient points of the weather and the screwworm population interaction.

  3. Developments and potential of radiation processing in the Philippines

    NASA Astrophysics Data System (ADS)

    Singson, C.; Carmona, C.

    This paper describes the research and development activities in three areas of radiation processing, namely: food irradiation, medical product sterilization, and wood-plastic combination. Plans and efforts to acquire a larger gamma source to augment our present 5,000-curie source are discussed. Cost estimates for a radiation facility are presented on the basis of the market potential of food irradiation and medical product sterilization. Existing local industries that can benefit from the adaptation of irradiation technology in their processing requirements are described.

  4. Spacecraft Complexity Subfactors and Implications on Future Cost Growth

    NASA Technical Reports Server (NTRS)

    Leising, Charles J.; Wessen, Randii; Ellyin, Ray; Rosenberg, Leigh; Leising, Adam

    2013-01-01

    During the last ten years, the Jet Propulsion Laboratory has used a set of cost-risk subfactors to independently estimate the magnitude of development risks that may not be covered in the high-level cost models employed during early concept development. Within the last several years, the Laboratory has also developed a scale of Concept Maturity Levels, with associated criteria, to quantitatively assess a concept's maturity. This latter effort has been helpful in determining whether a concept is mature enough for accurate costing, but it does not provide any quantitative estimate of cost risk. Unfortunately, today's missions are significantly more complex than when the original cost-risk subfactors were first formulated. Risks associated with complex missions are not being adequately evaluated, and future cost growth is being underestimated. The risk-subfactor process needed to be updated.

  5. 48 CFR 2052.216-70 - Level of effort.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Level of effort. 2052.216... Level of effort. As prescribed at 2016.307-70(a) the contracting officer shall insert the following..., time and materials, or labor hours basis. Level of Effort (JAN 1993) The NRC's estimate of the total...

  6. Effectiveness of medical equipment donations to improve health systems: how much medical equipment is broken in the developing world?

    PubMed

    Perry, Lora; Malkin, Robert

    2011-07-01

    It is often said that most of the medical equipment in the developing world is broken, with estimates ranging up to 96% out of service. But there is little documented evidence to support these statements. We wanted to quantify the amount of medical equipment that was out of service in resource-poor health settings and identify possible causes. Inventory reports from 1986 to 2010 were analyzed from hospitals in sixteen countries across four continents. The UN Human Development Index was used to determine which countries should be considered developing nations. Non-medical hospital equipment was excluded. This study examined 112,040 pieces of equipment. An average of 38.3% of the equipment in developing countries (42,925 pieces; range across countries: 0.83-47%) was out of service. The three main causes were lack of training, health technology management, and infrastructure. We hope that the findings will help biomedical engineers in their efforts toward effective designs for the developing world and NGOs in their efforts to design effective healthcare interventions.

  7. Harvest patterns and effort dynamics of indigenous and non-indigenous commercial sectors of the eastern Torres Strait reef line fishery

    NASA Astrophysics Data System (ADS)

    Williams, Ashley J.; Ballagh, Aaron C.; Begg, Gavin A.; Murchie, Cameron D.; Currey, Leanne M.

    2008-09-01

    The reef line fishery (RLF) in eastern Torres Strait (ETS) is unique in that it has both a commercial indigenous sector and a commercial non-indigenous sector. Recently, concerns have been expressed by all stakeholders about the long-term sustainability of the fishery. These concerns have been exacerbated by the lack of detailed catch and effort information from both sectors, which has precluded any formal assessment of the fishery. In this paper, we characterise the harvest patterns and effort dynamics of the indigenous and non-indigenous commercial sectors of the ETS RLF using a range of data sources including commercial logbooks, community freezer records, voluntary logbooks and observer surveys. We demonstrate that bycatch is a significant component of the catch for both sectors and identify substantial differences in harvest patterns and effort dynamics between the sectors. Differences between sectors were observed in species composition and spatial and temporal patterns in catch, effort and catch per unit effort. These results highlight the inherent variation in catch and effort dynamics between the two commercial sectors of the ETS RLF and provide valuable information for the development of future assessments and appropriate management strategies for the fishery. The more reliable estimates of harvest patterns and effort dynamics for both sectors obtained from observer surveys will also assist in resolving issues relating to allocation of reef fish resources in Torres Strait.

  8. Neurodynamic evaluation of hearing aid features using EEG correlates of listening effort.

    PubMed

    Bernarding, Corinna; Strauss, Daniel J; Hannemann, Ronny; Seidler, Harald; Corona-Strauss, Farah I

    2017-06-01

    In this study, we propose a novel estimate of listening effort based on electroencephalographic data. This method translates our past findings, gained from evoked electroencephalographic activity, to the oscillatory EEG activity. To test this technique, electroencephalographic data were recorded from experienced hearing aid users with moderate hearing loss while they wore their hearing aids. The investigated hearing aid settings were: a directional microphone combined with a noise reduction algorithm in a medium and a strong setting, the noise reduction setting turned off, and a setting using omnidirectional microphones without any noise reduction. The results suggest that the electroencephalographic estimate of listening effort seems to be a useful tool for mapping the effort exerted by the participants. In addition, the results indicate that a directional processing mode can reduce listening effort in multitalker listening situations.

  9. The Impact of Advanced Greenhouse Gas Measurement Science on Policy Goals and Research Strategies

    NASA Astrophysics Data System (ADS)

    Abrahams, L.; Clavin, C.; McKittrick, A.

    2016-12-01

    In support of the Paris Agreement, accurate characterization of U.S. greenhouse gas (GHG) emissions estimates has been an area of increased scientific focus. Over the last several years, the scientific community has placed significant emphasis on understanding, quantifying, and reconciling measurement and modeling methods that characterize methane emissions from petroleum and natural gas sources. This work has prompted national policy discussions and led to the improvement of regional and national methane emissions estimates. Research campaigns focusing on reconciling atmospheric measurements ("top-down") and process-based emissions estimates ("bottom-up") have sought to identify where measurement technology advances could inform policy objectives. A clear next step is the development and deployment of advanced detection capabilities that could aid U.S. emissions mitigation and verification goals. The breadth of policy-relevant outcomes associated with advances in GHG measurement science is demonstrated by recent improvements in the petroleum and natural gas sector emission estimates in the EPA Greenhouse Gas Inventory, ambitious efforts to apply inverse modeling results to inform or validate the national GHG inventory, and outcomes from federal GHG measurement science technology development programs. In this work, we explore the variety of policy-relevant outcomes impacted by advances in GHG measurement science, with an emphasis on improving GHG inventory estimates, identifying emissions mitigation strategies, and informing technology development requirements.

  10. Estimated economic losses associated with the destruction of plants owing to Phytophthora ramorum quarantine efforts in Washington state

    Treesearch

    N.L. Dart; G.A. Chastagner

    2008-01-01

    The number and retail value of plants destroyed in Washington state nurseries due to Phytophthora ramorum quarantine efforts were estimated using Emergency Action Notification forms (EANs) issued by the United States Department of Agriculture Animal and Plant Health Inspection Service between 2004 and 2005. Data collected from EANs indicate that...

  11. Application of a Mathematical Model to Describe the Effects of Chlorpyrifos on Caenorhabditis elegans Development

    PubMed Central

    Boyd, Windy A.; Smith, Marjolein V.; Kissling, Grace E.; Rice, Julie R.; Snyder, Daniel W.; Portier, Christopher J.; Freedman, Jonathan H.

    2009-01-01

    Background The nematode Caenorhabditis elegans is being assessed as an alternative model organism as part of an interagency effort to develop better means to test potentially toxic substances. As part of this effort, assays that use the COPAS Biosort flow sorting technology to record optical measurements (time of flight (TOF) and extinction (EXT)) of individual nematodes under various chemical exposure conditions are being developed. A mathematical model has been created that uses Biosort data to quantitatively and qualitatively describe C. elegans growth, and link changes in growth rates to biological events. Chlorpyrifos, an organophosphate pesticide known to cause developmental delays and malformations in mammals, was used as a model toxicant to test the applicability of the growth model for in vivo toxicological testing. Methodology/Principal Findings L1 larval nematodes were exposed to a range of sub-lethal chlorpyrifos concentrations (0–75 µM) and measured every 12 h. In the absence of toxicant, C. elegans matured from L1s to gravid adults by 60 h. A mathematical model was used to estimate nematode size distributions at various times. Mathematical modeling of the distributions allowed the number of measured nematodes and log(EXT) and log(TOF) growth rates to be estimated. The model revealed three distinct growth phases. The points at which estimated growth rates changed (change points) were constant across the ten chlorpyrifos concentrations. Concentration response curves with respect to several model-estimated quantities (numbers of measured nematodes, mean log(TOF) and log(EXT), growth rates, and time to reach change points) showed a significant decrease in C. elegans growth with increasing chlorpyrifos concentration. Conclusions Effects of chlorpyrifos on C. elegans growth and development were mathematically modeled. Statistical tests confirmed a significant concentration effect on several model endpoints. This confirmed that chlorpyrifos affects C. elegans development in a concentration dependent manner. The most noticeable effect on growth occurred during early larval stages: L2 and L3. This study supports the utility of the C. elegans growth assay and mathematical modeling in determining the effects of potentially toxic substances in an alternative model organism using high-throughput technologies. PMID:19753116

  12. Structural change as a key component for agricultural non-CO2 mitigation efforts.

    PubMed

    Frank, Stefan; Beach, Robert; Havlík, Petr; Valin, Hugo; Herrero, Mario; Mosnier, Aline; Hasegawa, Tomoko; Creason, Jared; Ragnauth, Shaun; Obersteiner, Michael

    2018-03-13

    Agriculture is the single largest source of anthropogenic non-carbon dioxide (non-CO2) emissions. Reaching the climate target of the Paris Agreement will require significant emission reductions across sectors by 2030 and continued efforts thereafter. Here we show that the economic potential of non-CO2 emissions reductions from agriculture is up to four times as high as previously estimated. In fact, we find that at a carbon price of 25 $/tCO2eq, agriculture could already achieve non-CO2 reductions of around 1 GtCO2eq/year by 2030, mainly through the adoption of technical and structural mitigation options. At 100 $/tCO2eq, agriculture could even provide non-CO2 reductions of 2.6 GtCO2eq/year in 2050, including demand-side efforts. Immediate action to favor the widespread adoption of technical options in developed countries, together with productivity increases through structural changes in developing countries, is needed to move agriculture onto a track consistent with a 2 °C climate stabilization pathway.

  13. How do marital status, work effort, and wage rates interact?

    PubMed

    Ahituv, Avner; Lerman, Robert I

    2007-08-01

    How marital status interacts with men's earnings is an important analytic and policy issue, especially in the context of debates in the United States over programs that encourage healthy marriage. This paper generates new findings about the earnings-marriage relationship by estimating the linkages among flows into and out of marriage, work effort, and wage rates. The estimates are based on National Longitudinal Survey of Youth panel data, covering 23 years of marital and labor market outcomes, and control for unobserved heterogeneity. We estimate marriage effects on hours worked (our proxy for work effort) and on wage rates, for all men and for black and low-skilled men separately. The estimates reveal that entering marriage raises hours worked quickly and substantially, but that marriage's effect on wage rates takes place more slowly as men continue in marriage. Together, the stimulus to hours worked and wage rates generates an 18%-19% increase in earnings, with about one-third to one-half of the marriage earnings premium attributable to higher work effort. At the same time, higher wage rates and hours worked encourage men to marry and to stay married. Thus, being married and having high earnings reinforce each other over time.

  14. Estimating the number of cases of acute gastrointestinal illness (AGI) associated with Canadian municipal drinking water systems.

    PubMed

    Murphy, H M; Thomas, M K; Medeiros, D T; McFADYEN, S; Pintar, K D M

    2016-05-01

    The estimated burden of endemic acute gastrointestinal illness (AGI) annually in Canada is 20.5 million cases. Approximately 4 million of these cases are domestically acquired and foodborne, yet the proportion of waterborne cases is unknown. A number of randomized controlled trials have been completed to estimate the influence of tap water from municipal drinking water plants on the burden of AGI. In Canada, 83% of the population (28,521,761 people) consumes tap water from municipal drinking water plants serving >1000 people. The drinking water-related AGI burden associated with the consumption of water from these systems in Canada is unknown. The objective of this research was to estimate the number of AGI cases attributable to consumption of drinking water from large municipal water supplies in Canada, using data from four household drinking water intervention trials. Canadian municipal water treatment systems were ranked into four categories based on source water type and quality, population size served, and treatment capability and barriers. The water treatment plants studied in the four household drinking water intervention trials were also ranked according to the aforementioned criteria, and the Canadian treatment plants were then scored against these criteria to develop four AGI risk groups. The proportion of illnesses attributed to distribution system events vs. source water quality/treatment failures was also estimated, to inform the focus of future intervention efforts. It is estimated that 334,966 cases (90% probability interval 183,006-501,026) of AGI per year are associated with the consumption of tap water from municipal systems that serve >1000 people in Canada. This study provides a framework for estimating the burden of waterborne illness at a national level and identifying existing knowledge gaps for future research and surveillance efforts, in Canada and abroad.

  15. Development of failure mechanisms for fasteners in the United States

    Treesearch

    Douglas R. Rammer; Philip Line

    2006-01-01

    In the 2001 National Design Specifications® for Wood Construction (NDS), Appendix E was added to explicitly address wood failure mechanisms that may occur in fasteners. One approach to estimating design capacities for net section, row tear-out, and group tear-out failure mechanisms is presented in Appendix E of the 2001 NDS. Since the 2001 NDS, efforts are being undertaken to...

  16. Justification of Estimates for Fiscal Year 1984 Submitted to Congress.

    DTIC Science & Technology

    1983-01-01

    sponsoring different aspects related to unique manufacturing methods than those pursued by DARPA, and duplication of effort is prevented by direct...weapons systems. Rapid and economical methods of satisfying these requirements must significantly precede weapons systems developments to prevent... methods for obtaining accurate and efficient geodetic measurements. Also, a major advanced sensor/G&G data collection capability is being undertaken by DNA

  17. Atmosphere, Magnetosphere and Plasmas in Space (AMPS). Spacelab payload definition study. Volume 7, book 2: AMPS phase C/D analysis and planning document

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The results of the AMPS Phase C/D (Design, Development, and Operations) program analysis and planning effort are presented. Cost and schedule estimates are included. Although the AMPS program has been specifically addressed, these task descriptions are basically adaptable to a broader-based program incorporating additional or different Spacelab/orbiter payloads.

  18. Turbine Design to Mitigate Forcing (POSTPRINT)

    DTIC Science & Technology

    2012-09-01

    durability enhancements, sometimes fuel nozzles and turbine nozzle guide vanes are also clocked in an effort to reduce the heat load to downstream...e.g., aero-performance or heat load) or to estimate resonant stresses on the airfoils. So, the development of both time-mean and time-resolved...disturbances. So, great flexibility was built into the current implementation of the convergence-assessment algorithm described above. The user can

  19. Officer Career Development: Surface Warfare Officer Retention

    DTIC Science & Technology

    1991-01-01

    findings also have implications for the Navy in its effort to control turnover among military officers. The findings suggest that the role of spousal...to our understanding of the turnover process within a military setting and provides avenues for future research.

  20. Federal Research and Development Contract Trends and the Supporting Industrial Base, 2000-2014

    DTIC Science & Technology

    2016-04-30

    Homeland Security, and government-wide services contracting trends; sourcing policy and cost estimation methodologies; and recent U.S. Army modernization...been fears that the sharp downturn in federal contract obligations would disproportionately impact the R&D contracting portfolios within individual...contracting portfolios, and the industrial base that supports those efforts, within each R&D contracting agency. The main finding of this

  1. State and national household concentrations of PM2.5 from solid cookfuel use: Results from measurements and modeling in India for estimation of the global burden of disease

    PubMed Central

    2013-01-01

    Background Previous global burden of disease (GBD) estimates for household air pollution (HAP) from solid cookfuel use were based on categorical indicators of exposure. Recent progress in GBD methodologies that use integrated exposure–response (IER) curves for combustion particles required the development of models to quantitatively estimate average HAP levels experienced by large populations. Such models can also serve to inform public health intervention efforts. Thus, we developed a model to estimate national household concentrations of PM2.5 from solid cookfuel use in India, together with estimates for 29 states. Methods We monitored 24-hr household concentrations of PM2.5 in 617 rural households from 4 states in India on a cross-sectional basis between November 2004 and March 2005. We then developed log-linear regression models that predict household concentrations as a function of multiple independent household-level variables available in national household surveys, and generated national and state estimates using the Indian National Family and Health Survey (NFHS 2005). Results The measured mean 24-hr concentration of PM2.5 in solid cookfuel-using households ranged from 163 μg/m3 (95% CI: 143–183; median: 106; IQR: 191) in the living area to 609 μg/m3 (95% CI: 547–671; median: 472; IQR: 734) in the kitchen area. Fuel type, kitchen type, ventilation, geographical location and cooking duration were found to be significant predictors of PM2.5 concentrations in the household model. k-fold cross-validation showed a fair degree of correlation (r = 0.56) between modeled and measured values. Extrapolation of the household results by state to all solid cookfuel-using households in India covered by NFHS 2005 resulted in modeled estimates of 450 μg/m3 (95% CI: 318–640) and 113 μg/m3 (95% CI: 102–127) for national average 24-hr PM2.5 concentrations in the kitchen and living areas, respectively. Conclusions The model affords a substantial improvement over commonly used exposure indicators such as "percent solid cookfuel use" in HAP disease burden assessments by providing some of the first estimates of national average HAP levels experienced in India. The model estimates also add considerable strength of evidence for framing and implementing intervention efforts at the state and national levels. PMID:24020494
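
    A minimal sketch of the kind of log-linear model described above, fit with statsmodels on synthetic data (column names, categories, and any resulting coefficients are illustrative assumptions, not the study's fitted model):

        # Sketch: log-linear regression of 24-hr kitchen PM2.5 on household predictors.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 617                                   # sample size matching the survey
        df = pd.DataFrame({
            "pm25": rng.lognormal(6.0, 0.8, n),   # synthetic concentrations, ug/m3
            "fuel": rng.choice(["wood", "dung", "crop_residue"], n),
            "kitchen": rng.choice(["indoor", "separate", "outdoor"], n),
            "ventilation": rng.choice(["poor", "good"], n),
            "cook_hours": rng.uniform(1, 6, n),   # daily cooking duration
        })
        fit = smf.ols("np.log(pm25) ~ C(fuel) + C(kitchen) + C(ventilation)"
                      " + cook_hours", data=df).fit()
        print(fit.summary().tables[1])            # coefficients on the log scale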

  2. Population health outcome models in suicide prevention policy.

    PubMed

    Lynch, Frances L

    2014-09-01

    Suicide is a leading cause of death in the U.S. and results in immense suffering and significant cost. Effective suicide prevention interventions could reduce this burden, but policy makers need estimates of the health outcomes achieved by alternative interventions to focus implementation efforts. This study illustrates the utility of health outcome models for achieving goals defined by the National Action Alliance for Suicide Prevention's Research Prioritization Task Force, using psychotherapeutic interventions to prevent suicide reattempt in emergency department settings as a specific example. A health outcome model using decision analysis with secondary data was applied to estimate suicide attempts and deaths averted from evidence-based interventions. Under optimal conditions, the model estimated that over 1 year, implementing evidence-based psychotherapeutic interventions in emergency departments could decrease the number of suicide attempts by 18,737, and if offered over 5 years, it could avert 109,306 attempts. Over 1 year, the model estimated 2,498 fewer deaths from suicide, and over 5 years, about 13,928 fewer suicide deaths. Health outcome models could aid suicide prevention policy by helping to focus implementation efforts. Helpful next steps would be research developing more sophisticated models of the impact of suicide prevention interventions that include a more complex understanding of suicidal behavior, longer time frames, and additional outcomes that capture the full benefits and costs of interventions. Copyright © 2014 American Journal of Preventive Medicine. All rights reserved.

  3. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems

    PubMed Central

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-01-01

    Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above-mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289
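
    For illustration, the sketch below calibrates a toy dynamic model by stochastic global optimization; scipy's differential evolution stands in for the paper's scatter-search metaheuristic, and the model, data, and bounds are hypothetical:

        # Sketch: global-optimization calibration of a small dynamic model.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import differential_evolution

        def model(x, t, r, K):
            return r * x * (1 - x / K)    # toy dynamics: logistic growth

        t = np.linspace(0, 10, 25)
        true = odeint(model, 0.1, t, args=(0.9, 5.0)).ravel()
        rng = np.random.default_rng(3)
        data = true + rng.normal(0, 0.05, t.size)   # synthetic noisy observations

        def sse(theta):                   # least-squares calibration objective
            pred = odeint(model, 0.1, t, args=tuple(theta)).ravel()
            return float(np.sum((pred - data) ** 2))

        res = differential_evolution(sse, bounds=[(0.1, 2.0), (1.0, 10.0)], seed=3)
        print("estimated (r, K):", res.x.round(3))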

  5. Power Management and Distribution (PMAD) Model Development: Final Report

    NASA Technical Reports Server (NTRS)

    Metcalf, Kenneth J.

    2011-01-01

    Power management and distribution (PMAD) models were developed in the early 1990s to model candidate architectures for various Space Exploration Initiative (SEI) missions. They were used to generate "ballpark" component mass estimates to support conceptual PMAD system design studies. The initial set of models was provided to NASA Lewis Research Center (since renamed Glenn Research Center) in 1992. They were developed to estimate the characteristics of power conditioning components predicted to be available in the 2005 timeframe. Early-1990s component and device designs and material technologies were projected forward to the 2005 timeframe, and algorithms reflecting those design and material improvements were incorporated into the models to generate mass, volume, and efficiency estimates for circa-2005 components. The models are now about ten years old, and NASA GRC requested a review of them to determine whether they should be updated to bring them into agreement with current performance projections or to incorporate unforeseen design or technology advances. This report documents the results of this review, together with the updated power conditioning models and new transmission line models generated to estimate post-2005 PMAD system masses and sizes. This effort continues the expansion and enhancement of a library of PMAD models developed to allow system designers to assess future power system architectures and distribution techniques quickly and consistently.

  6. Evaluating analytical approaches for estimating pelagic fish biomass using simulated fish communities

    USGS Publications Warehouse

    Yule, Daniel L.; Adams, Jean V.; Warner, David M.; Hrabik, Thomas R.; Kocovsky, Patrick M.; Weidel, Brian C.; Rudstam, Lars G.; Sullivan, Patrick J.

    2013-01-01

    Pelagic fish assessments often combine large amounts of acoustic-based fish density data and limited midwater trawl information to estimate species-specific biomass density. We compared the accuracy of five apportionment methods for estimating pelagic fish biomass density using simulated communities with known fish numbers that mimic Lakes Superior, Michigan, and Ontario, representing a range of fish community complexities. Across all apportionment methods, the error in the estimated biomass generally declined with increasing effort, but methods that accounted for community composition changes with water column depth performed best. Correlations between trawl catch and the true species composition were highest when more fish were caught, highlighting the benefits of targeted trawling in locations of high fish density. Pelagic fish surveys should incorporate geographic and water column depth stratification in the survey design, use apportionment methods that account for species-specific depth differences, target midwater trawling effort in areas of high fish density, and include at least 15 midwater trawls. With relatively basic biological information, simulations of fish communities and sampling programs can optimize effort allocation and reduce error in biomass estimates.
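
    The benefit of depth-aware apportionment can be illustrated with a toy two-layer example (densities and species proportions below are hypothetical):

        # Sketch: apportioning acoustic fish density among species using
        # trawl composition, with and without depth stratification.
        import numpy as np

        acoustic = {"epilimnion": 120.0, "hypolimnion": 480.0}   # fish/ha by layer
        comp = {   # trawl-based species proportions by layer (hypothetical)
            "epilimnion": {"smelt": 0.7, "cisco": 0.3},
            "hypolimnion": {"smelt": 0.1, "cisco": 0.9},
        }

        # Depth-stratified: apply each layer's composition to that layer only.
        stratified = {sp: 0.0 for sp in ("smelt", "cisco")}
        for layer, dens in acoustic.items():
            for sp, p in comp[layer].items():
                stratified[sp] += dens * p

        # Naive pooling averages composition across layers before apportioning.
        total = sum(acoustic.values())
        pooled = {sp: total * np.mean([comp[l][sp] for l in comp])
                  for sp in stratified}

        print("stratified:", stratified)   # reflects depth-specific composition
        print("pooled:    ", pooled)       # distorts the species totals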

  7. Methodological Issues in the Collection, Analysis, and Reporting of Granular Data in Asian American Populations: Historical Challenges and Potential Solutions

    PubMed Central

    Islam, Nadia Shilpi; Khan, Suhaila; Kwon, Simona; Jang, Deeana; Ro, Marguerite; Trinh-Shevrin, Chau

    2011-01-01

    There are close to 15 million Asian Americans living in the United States, and they represent the fastest growing populations in the country. By the year 2050, there will be an estimated 33.4 million Asian Americans living in the country. However, their health needs remain poorly understood and there is a critical lack of data disaggregated by Asian American ethnic subgroups, primary language, and geography. This paper examines methodological issues, challenges, and potential solutions to addressing the collection, analysis, and reporting of disaggregated (or, granular) data on Asian Americans. The article explores emerging efforts to increase granular data through the use of innovative study design and analysis techniques. Concerted efforts to implement these techniques will be critical to the future development of sound research, health programs, and policy efforts targeting this and other minority populations. PMID:21099084

  8. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    NASA Astrophysics Data System (ADS)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy HR is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as HR and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. HR may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on HR, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in HR and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
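
    The central quantity can be illustrated with plain Shannon entropy over the quality-level occupancies (the dissertation derives the arrangement count P from quantum statistics; this simplified sketch is illustrative only, with hypothetical occupancy numbers):

        # Sketch: requirements entropy for R requirements spread over N quality
        # levels, computed as Shannon entropy of the level occupancies.
        import math

        def h_r(counts):
            r = sum(counts)
            return -sum((c / r) * math.log2(c / r) for c in counts if c > 0)

        early = [40, 30, 20, 10]    # most requirements at low quality levels
        late = [2, 3, 10, 85]       # most requirements near the desired state
        print("early-phase H_R:", round(h_r(early), 3), "bits")
        print("late-phase  H_R:", round(h_r(late), 3), "bits")
        print("information gained, delta I:",
              round(h_r(early) - h_r(late), 3), "bits")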

  9. Modelling detectability of kiore (Rattus exulans) on Aguiguan, Mariana Islands, to inform possible eradication and monitoring efforts

    USGS Publications Warehouse

    Adams, A.A.Y.; Stanford, J.W.; Wiewel, A.S.; Rodda, G.H.

    2011-01-01

    Estimating the detection probability of introduced organisms during the pre-monitoring phase of an eradication effort can be extremely helpful in informing eradication and post-eradication monitoring efforts, but this step is rarely taken. We used data collected during 11 nights of mark-recapture sampling on Aguiguan, Mariana Islands, to estimate introduced kiore (Rattus exulans Peale) density and detection probability, and evaluated factors affecting detectability to help inform possible eradication efforts. Modelling of 62 captures of 48 individuals resulted in a model-averaged density estimate of 55 kiore/ha. Kiore detection probability was best explained by a model allowing neophobia to diminish linearly (i.e. capture probability increased linearly) until occasion 7, with additive effects of sex and cumulative rainfall over the prior 48 hours. Detection probability increased with increasing rainfall, and females were up to three times more likely than males to be trapped. In this paper, we illustrate the type of information that can be obtained by modelling mark-recapture data collected during pre-eradication monitoring and discuss the potential of using these data to inform eradication and post-eradication monitoring efforts. © New Zealand Ecological Society.

  10. Estimates of population change in selected species of tropical birds using mark-recapture data

    USGS Publications Warehouse

    Brawn, J.; Nichols, J.D.; Hines, J.E.; Nesbitt, J.

    2000-01-01

    The population biology of tropical birds is known for only a small sample of species, especially in the Neotropics. Robust estimates of parameters such as survival rate and the finite rate of population change (λ) are crucial for conservation purposes and useful for studies of avian life histories. We used methods developed by Pradel (1996, Biometrics 52:703-709) to estimate λ for 10 species of tropical lowland forest birds using data from a long-term (> 20 yr) banding study in Panama. These species constitute an ecologically and phylogenetically diverse sample. We present these estimates and explore whether they are consistent with what we know from selected studies of banded birds and from 5 yr of estimating nesting success (i.e., an important component of λ). A major goal of these analyses is to assess whether the mark-recapture methods generate reliable and reasonably precise estimates of population change relative to traditional methods that require more sampling effort.

  11. Ultra-Wideband Time-Difference-of-Arrival High Resolution 3D Proximity Tracking System

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dekome, Kent; Dusl, John

    2010-01-01

    This paper describes a research and development effort for a prototype ultra-wideband (UWB) tracking system that is currently under development at NASA Johnson Space Center (JSC). The system is being studied for use in tracking lunar/Mars rovers and astronauts during early exploration missions when satellite navigation systems are not available. UWB impulse radio (UWB-IR) technology is exploited in the design and implementation of the prototype location and tracking system. A three-dimensional (3D) proximity tracking prototype design using commercially available UWB products is proposed to implement the Time-Difference-Of-Arrival (TDOA) tracking methodology in this research effort. The TDOA tracking algorithm is utilized for location estimation in the prototype system, not only to exploit the precise time resolution possible with UWB signals, but also to eliminate the need for synchronization between the transmitter and the receiver. Simulations show that the TDOA algorithm can achieve fine tracking resolution with low-noise TDOA estimates for close-in tracking. Field tests demonstrated that this prototype UWB TDOA High Resolution 3D Proximity Tracking System is feasible for providing positioning-awareness information in a 3D space to a robotic control system. This 3D tracking system was developed for a robotic control system in a facility called "Moonyard" at Honeywell Defense & System in Arizona under a Space Act Agreement.
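
    The TDOA step itself reduces to solving a set of hyperbolic range-difference equations; a minimal nonlinear least-squares sketch (receiver geometry, noise level, and units are illustrative assumptions) is:

        # Sketch: TDOA position estimation by nonlinear least squares.
        import numpy as np
        from scipy.optimize import least_squares

        C = 0.2998                      # propagation speed, m/ns
        rx = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
        tx = np.array([3.0, 4.0, 2.0])  # true transmitter position (m)

        d = np.linalg.norm(rx - tx, axis=1)
        tdoa = (d[1:] - d[0]) / C       # TDOAs relative to receiver 0 (ns)
        tdoa += np.random.default_rng(4).normal(0, 0.05, 3)  # timing noise

        def residuals(p):               # measured minus predicted TDOAs
            r = np.linalg.norm(rx - p, axis=1)
            return (r[1:] - r[0]) / C - tdoa

        est = least_squares(residuals, x0=np.array([5.0, 5.0, 5.0]))
        print("estimated position (m):", est.x.round(2))

    Because only differences of arrival times enter the residuals, the unknown transmit time cancels, which is why no transmitter-receiver synchronization is needed.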

  12. Remote Sensing and Capacity Building to Improve Food Security

    NASA Astrophysics Data System (ADS)

    Husak, G. J.; Funk, C. C.; Verdin, J. P.; Rowland, J.; Budde, M. E.

    2012-12-01

    The Famine Early Warning Systems Network (FEWS NET) is a U.S. Agency for International Development (USAID) supported project designed to monitor and anticipate food insecurity in the developing world, primarily Africa, Central America, the Caribbean and Central Asia. This is done through a network of partners involving U.S. government agencies, universities, country representatives, and partner institutions. This presentation will focus on the remotely sensed data used in FEWS NET activities and capacity building efforts designed to expand and enhance the use of FEWS NET tools and techniques. Remotely sensed data are of particular value in the developing world, where ground data networks and data reporting are limited. FEWS NET uses satellite based rainfall and vegetation greenness measures to monitor and assess food production conditions. Satellite rainfall estimates also drive crop models which are used in determining yield potential. Recent FEWS NET products also include estimates of actual evapotranspiration. Efforts are currently underway to assimilate these products into a single tool which would indicate areas experiencing abnormal conditions with implications for food production. FEWS NET is also involved in a number of capacity building activities. Two primary examples are the development of software and training of institutional partners in basic GIS and remote sensing. Software designed to incorporate rainfall station data with existing satellite-derived rainfall estimates gives users the ability to enhance satellite rainfall estimates or long-term means, resulting in gridded fields of rainfall that better reflect ground conditions. Further, this software includes a crop water balance model driven by the improved rainfall estimates. Finally, crop parameters, such as the planting date or length of growing period, can be adjusted by users to tailor the crop model to actual conditions. Training workshops in the use of this software, as well as basic GIS and remote sensing tools, are routinely conducted by FEWS NET representatives at host country meteorological and agricultural services. These institutions are then able to produce information that can more accurately inform food security decision making. Informed decision making reduces the risk associated with a given hazard. In the case of FEWS NET, this involves identification of shocks to food availability, allowing for the pre-positioning of aid to be available when a hazard strikes. Developing tools to incorporate better information in food production estimates and working closely with local staff trained in state-of-the-practice techniques results in a more informed decision making process, reducing the impacts of food security hazards.

  13. Estimating the greenhouse gas benefits of forestry projects: A Costa Rican Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busch, Christopher; Sathaye, Jayant; Sanchez Azofeifa, G. Arturo

    If the Clean Development Mechanism proposed under the Kyoto Protocol is to serve as an effective means for combating global climate change, it will depend upon reliable estimates of greenhouse gas benefits. This paper sketches the theoretical basis for estimating the greenhouse gas benefits of forestry projects and suggests lessons learned based on a case study of Costa Rica's Protected Areas Project, a 500,000-hectare effort to reduce deforestation and enhance reforestation. The Protected Areas Project in many senses advances the state of the art for Clean Development Mechanism-type forestry projects, as does the third-party verification work of SGS International Certification Services on the project. Nonetheless, sensitivity analysis shows that carbon benefit estimates for the project vary widely with the deforestation rate imputed in the baseline scenario, i.e. the deforestation rate expected if the project were not implemented. This, along with a newly available national dataset that confirms other research showing a slower rate of deforestation in Costa Rica, suggests that the original use of the 1979-1992 forest cover data as the basis for estimating carbon savings should be reconsidered. When the newly available data are substituted, carbon savings amount to 8.9 Mt (million tonnes) of carbon, down from the original estimate of 15.7 Mt. The primary general conclusion is that project developers should give more attention to forecasting the land-use and land-cover change scenarios underlying estimates of greenhouse gas benefits.
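
    The sensitivity at issue is essentially the proportionality of avoided-deforestation benefits to the assumed baseline rate, as this toy calculation shows (all parameter values are hypothetical and are not the project's accounting):

        # Sketch: avoided-deforestation carbon savings vs. baseline assumption.
        def carbon_savings_tC(baseline_rate, project_rate, area_ha, tC_per_ha, years):
            """Avoided cleared area times carbon density, summed over years."""
            return (baseline_rate - project_rate) * area_ha * tC_per_ha * years

        # Halving the assumed baseline deforestation rate roughly halves the
        # estimated benefit (rates below are illustrative only).
        for rate in (0.030, 0.017):
            mt = carbon_savings_tC(rate, 0.0, 500_000, 100, 10) / 1e6
            print(f"baseline {rate:.3f}/yr -> {mt:.1f} Mt C")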

  14. Magnitude of Neck-Surface Vibration as an Estimate of Subglottal Pressure during Modulations of Vocal Effort and Intensity in Healthy Speakers

    ERIC Educational Resources Information Center

    McKenna, Victoria S.; Llico, Andres F.; Mehta, Daryush D.; Perkell, Joseph S.; Stepp, Cara E.

    2017-01-01

    Purpose: This study examined the relationship between the magnitude of neck-surface vibration (NSV_Mag; transduced with an accelerometer) and intraoral estimates of subglottal pressure (P'_sg) during variations in vocal effort at 3 intensity levels. Method: Twelve vocally healthy adults produced strings of /pɑ/ syllables in 3…

  15. The potential benefits of a new poliovirus vaccine for long-term poliovirus risk management.

    PubMed

    Duintjer Tebbens, Radboud J; Thompson, Kimberly M

    2016-12-01

    This study estimates the incremental net benefits (INBs) of a hypothetical ideal vaccine with all of the advantages and none of the disadvantages of existing oral and inactivated poliovirus vaccines, compared with current vaccines available for future outbreak response. The INB estimates are based on expected costs and polio cases from an existing global model of long-term poliovirus risk management. Excluding development costs, an ideal poliovirus vaccine could offer expected INBs of US$1.6 billion. The ideal vaccine yields small benefits in most realizations of long-term risks, but great benefits in low-probability, high-consequence realizations. New poliovirus vaccines may offer valuable insurance against long-term poliovirus risks, and new vaccine development efforts should continue as the world gathers more evidence about polio endgame risks.

  16. Trade-offs in experimental designs for estimating post-release mortality in containment studies

    USGS Publications Warehouse

    Rogers, Mark W.; Barbour, Andrew B; Wilson, Kyle L

    2014-01-01

    Estimates of post-release mortality (PRM) facilitate accounting for unintended deaths from fishery activities and contribute to development of fishery regulations and harvest quotas. The most popular method for estimating PRM employs containers for comparing control and treatment fish, yet guidance for experimental design of PRM studies with containers is lacking. We used simulations to evaluate trade-offs in the number of containers (replicates) employed versus the number of fish-per container when estimating tagging mortality. We also investigated effects of control fish survival and how among container variation in survival affects the ability to detect additive mortality. Simulations revealed that high experimental effort was required when: (1) additive treatment mortality was small, (2) control fish mortality was non-negligible, and (3) among container variability in control fish mortality exceeded 10% of the mean. We provided programming code to allow investigators to compare alternative designs for their individual scenarios and expose trade-offs among experimental design options. Results from our simulations and simulation code will help investigators develop efficient PRM experimental designs for precise mortality assessment.
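
    The paper supplies its own simulation code; the minimal Python re-sketch below (parameter values hypothetical) shows the core trade-off, namely that with among-container variability, spreading the same number of fish over more containers tightens the mortality estimate:

        # Sketch: container-design trade-off for post-release mortality (PRM).
        import numpy as np

        rng = np.random.default_rng(5)

        def trial(n_containers, fish_per, control_surv, added_mort, sd):
            # Separate control and treatment containers; survival varies
            # among containers (SD = sd) around the mean.
            cs = np.clip(rng.normal(control_surv, sd, n_containers), 0, 1)
            ts = np.clip(rng.normal(control_surv, sd, n_containers) - added_mort, 0, 1)
            ctrl = rng.binomial(fish_per, cs).sum() / (n_containers * fish_per)
            trt = rng.binomial(fish_per, ts).sum() / (n_containers * fish_per)
            return ctrl - trt              # estimated additive mortality

        for design in [(4, 50), (20, 10)]:  # same total fish, split differently
            est = [trial(*design, 0.9, 0.05, 0.08) for _ in range(2000)]
            print(design, "mean:", round(float(np.mean(est)), 3),
                  "SD:", round(float(np.std(est)), 3))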

  17. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-04-09

    In exoskeletal robots, the quantification of the user's muscular effort is important to recognize the user's motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, whose measurements contain the dynamic effects of the human body, such as inertial, Coriolis, and gravitational torques, as well as the torque generated by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated, and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors can be used to estimate the muscular torque accurately in both relaxed and activated muscle conditions.
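
    The core idea reduces to subtracting modeled limb dynamics from the sensor reading. The sketch below assumes a single-joint pendulum model with hypothetical parameters (inertia I, viscous damping c, gravity coefficient mgl); the paper's formulation covers the full multi-joint dynamics.

```python
import numpy as np

def muscular_torque(tau_meas, q, dq, ddq, I, c, mgl):
    """Active muscular torque = sensor torque minus modeled limb dynamics.
    Single-joint pendulum assumption (illustrative only):
        tau_dyn = I*ddq + c*dq + mgl*sin(q)"""
    return tau_meas - (I * ddq + c * dq + mgl * np.sin(q))

def identify_params(tau_meas, q, dq, ddq):
    """Least-squares identification of user-specific parameters from a
    passive (relaxed-muscle) trial, where the sensor sees dynamics only."""
    X = np.column_stack([ddq, dq, np.sin(q)])
    params, *_ = np.linalg.lstsq(X, tau_meas, rcond=None)
    return params  # I, c, mgl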

  18. A methodology for airplane parameter estimation and confidence interval determination in nonlinear estimation problems. Ph.D. Thesis - George Washington Univ., Apr. 1985

    NASA Technical Reports Server (NTRS)

    Murphy, P. C.

    1986-01-01

    An algorithm for maximum likelihood (ML) estimation is developed with an efficient method for approximating the sensitivities. The ML algorithm relies on a new optimization method referred to as a modified Newton-Raphson with estimated sensitivities (MNRES). MNRES determines sensitivities by using slope information from local surface approximations of each output variable in parameter space. With the fitted surface, sensitivity information can be updated at each iteration with less computational effort than that required by either a finite-difference method or integration of the analytically determined sensitivity equations. MNRES eliminates the need to derive sensitivity equations for each new model, and thus provides flexibility to use model equations in any convenient format. A random search technique for determining the confidence limits of ML parameter estimates is applied to nonlinear estimation problems for airplanes. The confidence intervals obtained by the search are compared with Cramer-Rao (CR) bounds at the same confidence level. The degree of nonlinearity in the estimation problem is an important factor in the relationship between CR bounds and the error bounds determined by the search technique. Beale's measure of nonlinearity is developed in this study for airplane identification problems; it is used to empirically correct confidence levels and to predict the degree of agreement between CR bounds and search estimates.
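
    A minimal sketch of the estimated-sensitivities idea, assuming an output-error maximum-likelihood setup: slopes are recovered from a least-squares plane fitted to recent (parameter, output) evaluations instead of finite differences or analytic sensitivity equations. This is a simplification, not the original MNRES algorithm.

```python
import numpy as np

def estimated_sensitivities(theta_hist, y_hist):
    """Fit a local linear surface to previously evaluated points and
    return the slopes dy/dtheta, shape (n_params, n_outputs)."""
    T = np.asarray(theta_hist)                 # (n_evals, n_params)
    Y = np.asarray(y_hist)                     # (n_evals, n_outputs)
    X = np.column_stack([np.ones(len(T)), T])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef[1:]

def ml_step(theta, S, resid, R_inv):
    """One Gauss-Newton step for output-error ML:
    delta = (S R^-1 S^T)^-1 (S R^-1 r), with estimated sensitivities S
    and residuals r = y_measured - y_model."""
    return theta + np.linalg.solve(S @ R_inv @ S.T, S @ R_inv @ resid)
```

    Because the surface is refitted from points the optimizer has already visited, each iteration updates the sensitivities at negligible extra cost, which is the efficiency argument made above.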

  19. An Estimate of Avian Mortality at Communication Towers in the United States and Canada

    PubMed Central

    Longcore, Travis; Rich, Catherine; Mineau, Pierre; MacDonald, Beau; Bert, Daniel G.; Sullivan, Lauren M.; Mutrie, Erin; Gauthreaux, Sidney A.; Avery, Michael L.; Crawford, Robert L.; Manville, Albert M.; Travis, Emilie R.; Drake, David

    2012-01-01

    Avian mortality at communication towers in the continental United States and Canada is an issue of pressing conservation concern. Previous estimates of this mortality have been based on limited data and have not included Canada. We compiled a database of communication towers in the continental United States and Canada and estimated avian mortality by tower with a regression relating avian mortality to tower height. This equation was derived from 38 tower studies for which mortality data were available and corrected for sampling effort, search efficiency, and scavenging where appropriate. Although most studies document mortality at guyed towers with steady-burning lights, we accounted for lower mortality at towers without guy wires or steady-burning lights by adjusting estimates based on published studies. The resulting estimate of mortality at towers is 6.8 million birds per year in the United States and Canada. Bootstrapped subsampling indicated that the regression was robust to the choice of studies included and a comparison of multiple regression models showed that incorporating sampling, scavenging, and search efficiency adjustments improved model fit. Estimating total avian mortality is only a first step in developing an assessment of the biological significance of mortality at communication towers for individual species or groups of species. Nevertheless, our estimate can be used to evaluate this source of mortality, develop subsequent per-species mortality estimates, and motivate policy action. PMID:22558082
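
    The estimation pipeline (regress corrected per-study mortality on tower height, bootstrap over studies, then sum predictions across the tower database) can be sketched as follows, with made-up numbers standing in for the 38 studies and the tower database.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the 38 per-study records (not the paper's data)
height = rng.uniform(50, 600, 38)                       # tower height, m
kills = np.exp(3.0 + 0.006 * height + rng.normal(0, 0.5, 38))

def fit_loglinear(h, k):
    """Fit log(mortality) = a + b * height by ordinary least squares."""
    X = np.column_stack([np.ones_like(h), h])
    coef, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
    return coef

a, b = fit_loglinear(height, kills)

# Bootstrap over studies to check robustness to study inclusion
boot = np.array([fit_loglinear(height[i], kills[i])
                 for i in rng.integers(0, 38, (1000, 38))])
print("slope 95% CI:", np.percentile(boot[:, 1], [2.5, 97.5]))

# Apply the regression across a (hypothetical) tower-height database;
# a production estimate would also correct retransformation bias
towers = rng.uniform(30, 600, 5000)
print("total annual mortality:", np.exp(a + b * towers).sum())
```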

  1. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  2. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    USGS Publications Warehouse

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, for which effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates compared to the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75) regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates compared to the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species is a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources, as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while maximizing the information content of the data in an attempt to provide the highest conservation value per unit of effort.
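
    A toy version of the two-phase idea, under assumed occupancy and detection values: phase 1 spends a quarter of the budget on random sites to fit an occurrence model, and phase 2 directs the rest toward the highest-scoring sites. Site counts, covariate effects, and detection probability are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Hypothetical landscape: one habitat covariate drives occupancy
n_sites = 1000
x = rng.normal(size=n_sites)
psi = 1 / (1 + np.exp(-(-2.0 + 1.5 * x)))      # rare species, low mean psi
z = rng.binomial(1, psi)                        # true occupancy state
P_DET, VISITS, BUDGET = 0.5, 3, 200

def detected(sites):
    """1 if the species is found in any visit at an occupied site."""
    return rng.binomial(VISITS, P_DET * z[sites]) > 0

# Phase 1: 25% of the budget on randomly chosen sites
ph1 = rng.choice(n_sites, BUDGET // 4, replace=False)
y1 = detected(ph1)
model = LogisticRegression().fit(x[ph1, None], y1)

# Phase 2: remaining effort on the sites with highest predicted occurrence
rest = np.setdiff1d(np.arange(n_sites), ph1)
scores = model.predict_proba(x[rest, None])[:, 1]
ph2 = rest[np.argsort(scores)[-(BUDGET - len(ph1)):]]

srs = rng.choice(n_sites, BUDGET, replace=False)  # single-phase benchmark
print("two-phase detections:", y1.sum() + detected(ph2).sum())
print("random-design detections:", detected(srs).sum())
```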

  3. A web based Radiation Oncology Dose Manager with a rich User Interface developed using AJAX, ruby, dynamic XHTML and the new Yahoo/EXT User Interface Library.

    PubMed

    Vali, Faisal; Hong, Robert

    2007-10-11

    With the evolution of AJAX, ruby on rails, advanced dynamic XHTML technologies and the advent of powerful user interface libraries for javascript (EXT, Yahoo User Interface Library), developers now have the ability to provide truly rich interfaces within web browsers, with reasonable effort and without third-party plugins. We designed and developed an example of such a solution. The User Interface allows radiation oncology practices to intuitively manage different dose fractionation schemes by helping estimate total dose to irradiated organs.

  4. Single-Step BLUP with Varying Genotyping Effort in Open-Pollinated Picea glauca.

    PubMed

    Ratcliffe, Blaise; El-Dien, Omnia Gamal; Cappa, Eduardo P; Porth, Ilga; Klápště, Jaroslav; Chen, Charles; El-Kassaby, Yousry A

    2017-03-10

    Maximization of genetic gain in forest tree breeding programs is contingent on the accuracy of the predicted breeding values and precision of the estimated genetic parameters. We investigated the effect of the combined use of contemporary pedigree information and genomic relatedness estimates on the accuracy of predicted breeding values and precision of estimated genetic parameters, as well as rankings of selection candidates, using single-step genomic evaluation (HBLUP). In this study, two traits with diverse heritabilities [tree height (HT) and wood density (WD)] were assessed at various levels of family genotyping efforts (0, 25, 50, 75, and 100%) from a population of white spruce (Picea glauca) consisting of 1694 trees from 214 open-pollinated families, representing 43 provenances in Québec, Canada. The results revealed that HBLUP bivariate analysis is effective in reducing the known bias in heritability estimates of open-pollinated populations, as it exposes hidden relatedness, potential pedigree errors, and inbreeding. The addition of genomic information in the analysis considerably improved the accuracy in breeding value estimates by accounting for both Mendelian sampling and historical coancestry that were not captured by the contemporary pedigree alone. Increasing family genotyping efforts were associated with continuous improvement in model fit, precision of genetic parameters, and breeding value accuracy. Yet, improvements were observed even at minimal genotyping effort, indicating that even modest genotyping effort is effective in improving genetic evaluation. The combined utilization of both pedigree and genomic information may be a cost-effective approach to increase the accuracy of breeding values in forest tree breeding programs where shallow pedigrees and large testing populations are the norm. Copyright © 2017 Ratcliffe et al.
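
    At the heart of HBLUP is the blended relationship matrix H that joins pedigree and genomic information. A minimal sketch of the standard single-step construction follows; the weighted blend of G with A22 is common practice for invertibility, and all inputs are assumed to be precomputed.

```python
import numpy as np

def h_inverse(A_inv, A22_inv, G, A22, genotyped, w=0.95):
    """Single-step relationship matrix inverse used in HBLUP/ssGBLUP:
        H^-1 = A^-1 + [[0, 0], [0, Gw^-1 - A22^-1]]
    where A is the pedigree relationship matrix, A22 its block for the
    genotyped individuals, G the genomic relationship matrix, and
    Gw = w*G + (1-w)*A22 a blended G that guarantees invertibility."""
    Gw_inv = np.linalg.inv(w * G + (1 - w) * A22)
    H_inv = A_inv.copy()
    ix = np.ix_(genotyped, genotyped)
    H_inv[ix] += Gw_inv - A22_inv
    return H_inv
```

    Varying the share of families genotyped changes which rows of H carry genomic information, which is the genotyping-effort knob examined in the study.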

  5. Psychosocial work environment and myocardial infarction: improving risk estimation by combining two complementary job stress models in the SHEEP Study

    PubMed Central

    Peter, R; Siegrist, J; Hallqvist, J; Reuterwall, C; Theorell, T

    2002-01-01

    Objectives: Associations between two alternative formulations of job stress derived from the effort-reward imbalance and the job strain model and first non-fatal acute myocardial infarction were studied. Whereas the job strain model concentrates on situational (extrinsic) characteristics, the effort-reward imbalance model analyses distinct person (intrinsic) characteristics in addition to situational ones. In view of these conceptual differences, the hypothesis was tested that combining information from the two models improves the risk estimation of acute myocardial infarction. Methods: 951 male and female myocardial infarction cases and 1147 referents aged 45–64 years of The Stockholm Heart Epidemiology (SHEEP) case-control study underwent a clinical examination. Information on job stress and health adverse behaviours was derived from standardised questionnaires. Results: Multivariate analysis showed moderately increased odds ratios for either model. Yet, with respect to the effort-reward imbalance model, gender specific effects were found: in men the extrinsic component contributed to risk estimation, whereas this was the case with the intrinsic component in women. Controlling each job stress model for the other in order to test the independent effect of either approach did not show systematically increased odds ratios. An improved estimation of acute myocardial infarction risk resulted from combining information from the two models by defining groups characterised by simultaneous exposure to effort-reward imbalance and job strain (men: odds ratio 2.02 (95% confidence interval (CI) 1.34 to 3.07); women: odds ratio 2.19 (95% CI 1.11 to 4.28)). Conclusions: Findings show an improved risk estimation of acute myocardial infarction by combining information from the two job stress models under study. Moreover, gender specific effects of the two components of the effort-reward imbalance model were observed. PMID:11896138

  6. Expert judgment on markers to deter inadvertent human intrusion into the Waste Isolation Pilot Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trauth, K.M.; Hora, S.C.; Guzowski, R.V.

    1993-11-01

    The expert panel identified basic principles to guide current and future marker development efforts: (1) the site must be marked; (2) message(s) must be truthful and informative; (3) the marker system should include multiple components; (4) multiple means of communication should be used (e.g., language, pictographs, scientific diagrams); (5) individual messages on individual marker system elements should carry multiple levels of complexity; (6) materials with little recycle value should be used; and (7) an international effort should maintain knowledge of the locations and contents of nuclear waste repositories. The efficacy of the markers in deterring inadvertent human intrusion was estimated to decrease with time, with the probability function varying with the mode of intrusion (who is intruding and for what purpose) and the level of technological development of the society. The development of a permanent, passive marker system capable of surviving and remaining interpretable for 10,000 years will require further study prior to implementation.

  7. The MERMAID project

    NASA Technical Reports Server (NTRS)

    Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A

    1992-01-01

    The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988, and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European software industry; and (3) to facilitate the widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have been implemented in prototype tools. These prototypes are best considered as toolkits or workbenches.

  8. Using effort information with change-in-ratio data for population estimation

    USGS Publications Warehouse

    Udevitz, Mark S.; Pollock, Kenneth H.

    1995-01-01

    Most change-in-ratio (CIR) methods for estimating fish and wildlife population sizes have been based only on assumptions about how encounter probabilities vary among population subclasses. When information on sampling effort is available, it is also possible to derive CIR estimators based on assumptions about how encounter probabilities vary over time. This paper presents a generalization of previous CIR models that allows explicit consideration of a range of assumptions about the variation of encounter probabilities among subclasses and over time. Explicit estimators are derived under this model for specific sets of assumptions about the encounter probabilities. Numerical methods are presented for obtaining estimators under the full range of possible assumptions. Likelihood ratio tests for these assumptions are described. Emphasis is on obtaining estimators based on assumptions about variation of encounter probabilities over time.
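
    For reference, the classic two-sample change-in-ratio estimator that this model generalizes can be written in a few lines; the paper's generalization relaxes the equal-encounter-probability assumptions by bringing in effort data, which this sketch does not attempt.

```python
def cir_abundance(p1, p2, r_x, r_total):
    """Classic two-sample change-in-ratio estimator of pre-removal
    abundance (equal encounter probabilities across subclasses assumed):
        N = (R_x - R * p2) / (p1 - p2)
    p1, p2: subclass-x proportions in samples before/after the removals
    r_x, r_total: removals of subclass x and of all animals."""
    if p1 == p2:
        raise ValueError("ratios must change for CIR to be identifiable")
    return (r_x - r_total * p2) / (p1 - p2)

# e.g., an antlered proportion dropping from 0.40 to 0.25 after removing
# 300 antlered animals of 320 total implies N ~ 1467
print(cir_abundance(0.40, 0.25, 300, 320))
```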

  9. Combining satellite imagery and machine learning to predict poverty.

    PubMed

    Jean, Neal; Burke, Marshall; Xie, Michael; Davis, W Matthew; Lobell, David B; Ermon, Stefano

    2016-08-19

    Reliable data on economic livelihoods remain scarce in the developing world, hampering efforts to study these outcomes and to design policies that improve them. Here we demonstrate an accurate, inexpensive, and scalable method for estimating consumption expenditure and asset wealth from high-resolution satellite imagery. Using survey and satellite data from five African countries--Nigeria, Tanzania, Uganda, Malawi, and Rwanda--we show how a convolutional neural network can be trained to identify image features that can explain up to 75% of the variation in local-level economic outcomes. Our method, which requires only publicly available data, could transform efforts to track and target poverty in developing countries. It also demonstrates how powerful machine learning techniques can be applied in a setting with limited training data, suggesting broad potential application across many scientific domains. Copyright © 2016, American Association for the Advancement of Science.
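
    The final stage of such a pipeline, once image features have been extracted by a pretrained convolutional network, is typically a regularized linear model evaluated by cross-validation. The sketch below uses random stand-ins for the features and outcomes, so only the shape of the procedure is meaningful.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Hypothetical stand-ins: rows are survey clusters, columns are image
# features from a pretrained CNN; y is log consumption expenditure.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4096))
y = X[:, :10].sum(axis=1) + rng.normal(scale=2.0, size=500)

model = RidgeCV(alphas=np.logspace(-2, 4, 13))
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean())
```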

  10. AdaNET research plan

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1990-01-01

    The mission of the AdaNET research effort is to determine how to increase the availability of reusable Ada components and associated software engineering technology to both private and Federal sectors. The effort is structured to define the requirements for transfer of Federally developed software technology, study feasible approaches to meeting the requirements, and to gain experience in applying various technologies and practices. The overall approach to the development of the AdaNET System Specification is presented. A work breakdown structure is presented with each research activity described in detail. The deliverables for each work area are summarized. The overall organization and responsibilities for each research area are described. The schedule and necessary resources are presented for each research activity. The estimated cost is summarized for each activity. The project plan is fully described in the Super Project Expert data file contained on the floppy disk attached to the back cover of this plan.

  11. Evaluation of Progressive Failure Analysis and Modeling of Impact Damage in Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Sanchez, Christopher M.

    2011-01-01

    NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation on composites that have been impact damaged is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.

  12. Work More, Then Feel More: The Influence of Effort on Affective Predictions

    PubMed Central

    Jiga-Boy, Gabriela M.; Toma, Claudia; Corneille, Olivier

    2014-01-01

    Two studies examined how effort invested in a task shapes the affective predictions related to potential success in that task, and the mechanism underlying this relationship. In Study 1, PhD students awaiting an editorial decision about a submitted manuscript estimated the effort they had invested in preparing that manuscript for submission and how happy they would feel if it were accepted. Subjective estimates of effort were positively related to participants' anticipated happiness, an effect mediated by the higher perceived quality of one's work. In other words, the more effort one thought one had invested, the happier one expected to feel if the manuscript were accepted, because one expected it to be of higher quality. We replicated this effect and its underlying mediation in Study 2, this time using an experimental manipulation of effort in the context of creating an advertising slogan. Study 2 further showed that participants mistakenly thought their extra efforts invested in the task had improved the quality of their work, while independent judges found no objective differences in quality between the outcomes of the high- and low-effort groups. We discuss the implications of the relationship between effort and anticipated emotions and the conditions under which such a relationship might be functional. PMID:25028961

  13. Gaps in sampling and limitations to tree biomass estimation: a review of past sampling efforts over the past 50 years

    Treesearch

    Aaron Weiskittel; Jereme Frank; James Westfall; David Walker; Phil Radtke; David Affleck; David Macfarlane

    2015-01-01

    Tree biomass models are widely used but differ due to variation in the quality and quantity of data used in their development. We reviewed over 250 biomass studies and categorized them by species, location, sampled diameter distribution, and sample size. Overall, less than half of the tree species in Forest Inventory and Analysis database (FIADB) are without a...

  14. Nuclear electric propulsion mission engineering study development program and costs estimates, Phase 2 review

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of the second six-month performance period of the Nuclear Electric Propulsion Mission Engineering Study are presented. A brief overview of the program, identifying the study objectives and approach, and a discussion of the program status and schedule are given. The program results are reviewed and key conclusions to date are summarized. Planned effort for the remainder of the program is reviewed.

  15. Distributed Compression in Camera Sensor Networks

    DTIC Science & Technology

    2006-02-13

    complicated in this context. This effort will make use of the correlation structure of the data given by the plenoptic function in the case of multi-camera...systems. In many cases the structure of the plenoptic function can be estimated without requiring inter-sensor communications, but by using some a...priori global geometrical information. Once the structure of the plenoptic function has been predicted, it is possible to develop specific distributed

  16. Two Mathematical Models of Nonlinear Vibrations

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Bayard, David; Spanos, John; Breckenridge, William

    2007-01-01

    Two innovative mathematical models of nonlinear vibrations, and methods of applying them, have been conceived as byproducts of an effort to develop a Kalman filter for highly precise estimation of bending motions of a large truss structure deployed in outer space from a space-shuttle payload bay. These models are also applicable to modeling and analysis of vibrations in other engineering disciplines, on Earth as well as in outer space.

  17. Weapon Acquisition Program Outcomes and Efforts to Reform DOD’s Acquisition Process

    DTIC Science & Technology

    2016-05-09

    portfolio’s total estimated acquisition cost. 11. The equity prices of contractors delivering the ten costliest programs performed well relative to broad...cost growth. • In a constrained funding environment, unforeseen cost increases limit investment choices. The equity prices of the contractors ...remain profitable well into the future • Five publicly-traded defense contractors are developing and delivering the ten largest DOD programs in the 2015

  18. Rating curve uncertainty: A comparison of estimation methods

    USGS Publications Warehouse

    Mason, Jr., Robert R.; Kiang, Julie E.; Cohn, Timothy A.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The USGS is engaged in both internal development and collaborative efforts to evaluate existing methods for characterizing the uncertainty of streamflow measurements (gaugings), stage-discharge relations (ratings), and, ultimately, the streamflow records derived from them. This paper provides a brief overview of two candidate methods that may be used to characterize the uncertainty of ratings and illustrates the results of their application to the ratings of two USGS streamgages.

  19. Estimation of mortality for stage-structured zooplankton populations: What is to be done?

    NASA Astrophysics Data System (ADS)

    Ohman, Mark D.

    2012-05-01

    Estimation of zooplankton mortality rates in field populations is a challenging task that some contend is inherently intractable. This paper examines several of the objections that are commonly raised to efforts to estimate mortality. We find that there are circumstances in the field where it is possible to sequentially sample the same population and to resolve biologically caused mortality, albeit with error. Precision can be improved with sampling directed by knowledge of the physical structure of the water column, combined with adequate sample replication. Intercalibration of sampling methods can make it possible to sample across the life history in a quantitative manner. Rates of development can be constrained by laboratory-based estimates of stage durations from temperature- and food-dependent functions, mesocosm studies of molting rates, or approximation of development rates from growth rates, combined with the vertical distributions of organisms in relation to food and temperature gradients. Careful design of field studies guided by the assumptions of specific estimation models can lead to satisfactory mortality estimates, but model uncertainty also needs to be quantified. We highlight additional issues requiring attention to further advance the field, including the need for linked cooperative studies of the rates and causes of mortality of co-occurring holozooplankton and ichthyoplankton.

  20. Structure/activity relationships for biodegradability and their role in environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boethling, R.S.

    1994-12-31

    Assessment of biodegradability is an important part of the review process for both new and existing chemicals under the Toxic Substances Control Act. It is often necessary to estimate biodegradability because experimental data are unavailable. Structure/biodegradability relationships (SBR) are a means to this end. Quantitative SBR have been developed, but this approach has not been very useful because such relationships apply only to a few narrowly defined classes of chemicals. In response to the need for more widely applicable methods, multivariate analysis has been used to develop biodegradability classification models. For example, recent efforts have produced four new models. Two calculate the probability of rapid biodegradation and can be used for classification; the other two models allow semi-quantitative estimation of primary and ultimate biodegradation rates. All are based on multiple regressions against 36 preselected substructures plus molecular weight. Such efforts have been fairly successful by statistical criteria, but in general are hampered by a lack of large and consistent datasets. Knowledge-based expert systems may represent the next step in the evolution of SBR. In principle such systems need not be as severely limited by imperfect datasets. However, the codification of expert knowledge and reasoning is a critical prerequisite. Results of knowledge acquisition exercises and modeling based on them will also be described.

  1. Aircraft ground damage and the use of predictive models to estimate costs

    NASA Astrophysics Data System (ADS)

    Kromphardt, Benjamin D.

    Aircraft are frequently involved in ground damage incidents, and repair costs are often accepted as part of doing business. The Flight Safety Foundation (FSF) estimates ground damage to cost operators $5-10 billion annually. Incident reports, documents from manufacturers and regulatory agencies, and other resources were examined to better understand the problem of ground damage in aviation. Major contributing factors were explained, and two versions of a computer-based model were developed to project costs and show what is possible. One objective was to determine whether the models could match the FSF's estimate. Another objective was to better understand the cost savings that could be realized by efforts to further mitigate the occurrence of ground incidents. Model effectiveness was limited by access to official data, and assumptions were used where data were not available. However, the models were determined to estimate the costs of ground incidents sufficiently well.

  2. SUBJECTIVE ESTIMATION OF EFFORT, RESERVE, AND ISCHEMIC PAIN.

    DTIC Science & Technology

    Two studies were conducted to compare: (1) ratings of pain and effort induced by a muscle contraction maintained to the limit of endurance; and (2…sensations increased in intensity. The rate of growth of the sensations of pain and effort was influenced by the strength of the muscle contraction. The…

  3. Regression models for estimating salinity and selenium concentrations at selected sites in the Upper Colorado River Basin, Colorado, 2009-2012

    USGS Publications Warehouse

    Linard, Joshua I.; Schaffrath, Keelin R.

    2014-01-01

    Elevated concentrations of salinity and selenium in the tributaries and main-stem reaches of the Colorado River are a water-quality concern and have been the focus of remediation efforts for many years. Land-management practices with the objective of limiting the amount of salt and selenium that reaches the stream have focused on improving the methods by which irrigation water is conveyed and distributed. Federal land managers implement improvements in accordance with the Colorado River Basin Salinity Control Act of 1974, which directs Federal land managers to enhance and protect the quality of water available in the Colorado River. In an effort to assist in evaluating and mitigating the detrimental effects of salinity and selenium, the U.S. Geological Survey, in cooperation with the Bureau of Reclamation, the Colorado River Water Resources District, and the Bureau of Land Management, analyzed salinity and selenium data collected at sites to develop regression models. The study area and sites are on the Colorado River or in one of three small basins in western Colorado: the White River Basin, the Lower Gunnison River Basin, and the Dolores River Basin. Using data collected from water years 2009 through 2011, regression models capable of estimating concentrations were developed for salinity at six sites and selenium at six sites. At a minimum, data from discrete measurements of salinity or selenium concentration, streamflow, and specific conductance at each of the sites were needed for model development. Comparison of the adjusted R² and standard error statistics of the two salinity models developed at each site indicated that the models using specific conductance as the explanatory variable performed better than those using streamflow. The addition of multiple explanatory variables improved the ability to estimate selenium concentration at several sites compared with use of streamflow or specific conductance alone. The error associated with the log-transformed salinity and selenium estimates is consistent in log space; however, when the estimates are transformed into non-log values, the error increases as the estimates decrease. Continuous streamflow and specific conductance data collected at study sites provide the means to examine temporal variability in constituent concentration and load. The regression models can estimate continuous concentrations or loads on the basis of continuous specific conductance or streamflow data. Similar estimates are available for other sites at the USGS National Real-Time Water Quality Web page (http://nrtwq.usgs.gov) and provide water-resource managers with a means of improving their general understanding of how constituent concentration or load can change annually, seasonally, or in real time.
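
    A generic sketch of this kind of model, assuming a log-log regression on specific conductance with Duan's smearing estimator to correct retransformation bias (a standard choice in surrogate-regression work, though the report's site-specific models may differ):

```python
import numpy as np

def fit_concentration_model(spec_cond, conc):
    """Regress ln(concentration) on ln(specific conductance), then
    retransform with Duan's smearing factor to correct the bias that a
    plain exp() retransformation introduces."""
    X = np.column_stack([np.ones_like(spec_cond), np.log(spec_cond)])
    beta, *_ = np.linalg.lstsq(X, np.log(conc), rcond=None)
    resid = np.log(conc) - X @ beta
    smear = np.mean(np.exp(resid))              # Duan (1983) smearing factor
    def predict(sc):
        return smear * np.exp(beta[0] + beta[1] * np.log(sc))
    return predict

# Hypothetical calibration data (not the report's measurements)
rng = np.random.default_rng(5)
sc = rng.uniform(200, 2000, 120)
conc = np.exp(0.9 * np.log(sc) - 1.0 + rng.normal(0, 0.2, 120))
predict = fit_concentration_model(sc, conc)
print(predict(np.array([500.0, 1500.0])))
```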

  4. Detecting declines in the abundance of a bull trout (Salvelinus confluentus) population: Understanding the accuracy, precision, and costs of our efforts

    USGS Publications Warehouse

    Al-Chokhachy, R.; Budy, P.; Conner, M.

    2009-01-01

    Using empirical field data for bull trout (Salvelinus confluentus), we evaluated the trade-off between power and sampling effort-cost using Monte Carlo simulations of commonly collected mark-recapture-resight and count data, and we estimated the power to detect changes in abundance across different time intervals. We also evaluated the effects of monitoring different components of a population and stratification methods on the precision of each method. Our results illustrate substantial variability in the relative precision, cost, and information gained from each approach. While grouping estimates by age or stage class substantially increased the precision of estimates, spatial stratification of sampling units resulted in limited increases in precision. Although mark-resight methods allowed for estimates of abundance versus indices of abundance, our results suggest snorkel surveys may be a more affordable monitoring approach across large spatial scales. Detecting a 25% decline in abundance after 5 years was not possible, regardless of technique (power = 0.80), without high sampling effort (48% of study site). Detecting a 25% decline was possible after 15 years, but still required high sampling efforts. Our results suggest detecting moderate changes in abundance of freshwater salmonids requires considerable resource and temporal commitments and highlight the difficulties of using abundance measures for monitoring bull trout populations.
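
    The power question above can be framed compactly: simulate annual abundance estimates with sampling error around a declining trend, and ask how often a slope test flags the decline. The sketch below uses generic values (initial abundance, estimator CV), not the study's bull trout data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def power_decline(n_years, decline=0.25, cv=0.3, n_sims=2000, alpha=0.05):
    """Power to detect a total decline over the monitoring span from
    annual abundance estimates with lognormal observation error,
    testing the slope of log(abundance) against year."""
    years = np.arange(n_years)
    trend = np.log(1 - decline) / (n_years - 1)
    sigma = np.sqrt(np.log(1 + cv**2))          # lognormal sd from CV
    tcrit = stats.t.ppf(alpha, n_years - 2)     # one-sided, negative
    hits = 0
    for _ in range(n_sims):
        logN = np.log(1000) + trend * years + rng.normal(0, sigma, n_years)
        X = np.column_stack([np.ones(n_years), years])
        beta, *_ = np.linalg.lstsq(X, logN, rcond=None)
        resid = logN - X @ beta
        s2 = resid @ resid / (n_years - 2)
        se = np.sqrt(s2 / ((years - years.mean())**2).sum())
        hits += beta[1] / se < tcrit
    return hits / n_sims

for n in (5, 10, 15):
    print(n, "years:", power_decline(n))
```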

  5. Development of a variable structure-based fault detection and diagnosis strategy applied to an electromechanical system

    NASA Astrophysics Data System (ADS)

    Gadsden, S. Andrew; Kirubarajan, T.

    2017-05-01

    Signal processing techniques are prevalent in a wide range of fields: control, target tracking, telecommunications, robotics, fault detection and diagnosis, and even stock market analysis, to name a few. Although first introduced in the 1950s, the most popular method used for signal processing and state estimation remains the Kalman filter (KF). The KF offers an optimal solution to the estimation problem under strict assumptions. Since that time, a number of other estimation strategies and filters have been introduced to overcome robustness issues, such as the smooth variable structure filter (SVSF). In this paper, properties of the SVSF are explored in an effort to detect and diagnose faults in an electromechanical system. The results are compared with the KF method, and future work is discussed.

  6. Recent land-use/land-cover change in the Central California Valley

    USGS Publications Warehouse

    Soulard, Christopher E.; Wilson, Tamara S.

    2013-01-01

    Open access to Landsat satellite data has enabled annual analyses of modern land-use and land-cover change (LULCC) for the Central California Valley ecoregion between 2005 and 2010. Our annual LULCC estimates capture landscape-level responses to water policy changes, climate, and economic instability. From 2005 to 2010, agriculture in the region fluctuated along with regulatory-driven changes in water allocation as well as persistent drought conditions. Grasslands and shrublands declined, while developed lands increased in former agricultural and grassland/shrubland areas. Development rates stagnated in 2007, coinciding with the onset of the historic foreclosure crisis in California and the global economic downturn. We utilized annual LULCC estimates to generate interval-based LULCC estimates (2000–2005 and 2005–2010) and extend existing 27-year interval-based land change monitoring through 2010. The resulting change data provide insights into the drivers of landscape change in the Central California Valley ecoregion and represent the first continuous 37-year mapping effort of its kind.

  7. Terminal Area Productivity Airport Wind Analysis and Chicago O'Hare Model Description

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Shapiro, Gerald

    1998-01-01

    This paper describes two results from a continuing effort to provide accurate cost-benefit analyses of the NASA Terminal Area Productivity (TAP) program technologies. Previous tasks have developed airport capacity and delay models and completed preliminary cost benefit estimates for TAP technologies at 10 U.S. airports. This task covers two improvements to the capacity and delay models. The first improvement is the completion of a detailed model set for the Chicago O'Hare (ORD) airport. Previous analyses used a more general model to estimate the benefits for ORD. This paper contains a description of the model details with results corresponding to current conditions. The second improvement is the development of specific wind speed and direction criteria for use in the delay models to predict when the Aircraft Vortex Spacing System (AVOSS) will allow use of reduced landing separations. This paper includes a description of the criteria and an estimate of AVOSS utility for 10 airports based on analysis of 35 years of weather data.

  8. Oil Recovery Enhancement from Fractured, Low Permeability Reservoirs. [Carbonated Water

    DOE R&D Accomplishments Database

    Poston, S. W.

    1991-01-01

    The results of the investigative efforts for this jointly funded DOE-State of Texas research project achieved during the 1990-1991 year may be summarized as follows. Geological characterization: Detailed maps of the development and hierarchical nature of the fracture system exhibited by Austin Chalk outcrops were prepared. The results of these efforts were directly applied to the development of production decline type curves applicable to a dual-fracture-matrix flow system. Analysis of production records obtained from Austin Chalk operators illustrated the utility of these type curves to determine relative fracture/matrix contributions and extent. Well-log response in Austin Chalk wells has been shown to be a reliable indicator of organic maturity. Shear-wave splitting concepts were used to estimate fracture orientations from Vertical Seismic Profile (VSP) data. Several programs were written to facilitate analysis of the data. The results of these efforts indicated fractures could be detected with VSP seismic methods. Development of the EOR imbibition process: Laboratory displacement as well as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) imaging studies have shown that the carbonated water-imbibition displacement process significantly accelerates and increases recovery from oil-saturated, low-permeability rocks. Field tests: Two operators amenable to conducting a carbonated water flood test on an Austin Chalk well have been identified. Feasibility studies are presently underway.

  9. Accounting for Incomplete Species Detection in Fish Community Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta

    2013-01-01

    Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed, and the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g. stratifying based on patch size) and determining effort required (e.g. number of sites versus occasions).
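
    The backbone of such single-species models is the zero-inflated detection likelihood. A minimal sketch with constant occupancy (psi) and detection (p), fitted by maximum likelihood on simulated data (the covariates used in the study are omitted for brevity):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import comb

def neg_log_lik(par, y, K):
    """Single-season occupancy likelihood: a site with y > 0 detections
    out of K visits contributes psi * Binom(y | K, p); an all-zero site
    contributes psi * (1 - p)**K + (1 - psi)."""
    psi, p = 1 / (1 + np.exp(-par))            # logit-scale parameters
    lik = np.where(y > 0,
                   psi * comb(K, y) * p**y * (1 - p)**(K - y),
                   psi * (1 - p)**K + (1 - psi))
    return -np.sum(np.log(lik))

# Hypothetical data: 200 sites, 4 visits, true psi = 0.3, true p = 0.4
rng = np.random.default_rng(2)
z = rng.binomial(1, 0.3, 200)                  # latent occupancy
y = rng.binomial(4, 0.4 * z)                   # detections per site
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y, 4))
print("psi, p estimates:", 1 / (1 + np.exp(-fit.x)))
```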

  10. Yield Model Development (YMD) implementation plan for fiscal years 1981 and 1982

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A. (Principal Investigator)

    1981-01-01

    A plan is described for supporting USDA crop production forecasting and estimation by (1) testing, evaluating, and selecting crop yield models for application testing; (2) identifying areas of feasible research for improvement of models; and (3) conducting research to modify existing models and to develop new crop yield assessment methods. Tasks to be performed for each of these efforts are described as well as for project management and support. The responsibilities of USDA, USDC, USDI, and NASA are delineated as well as problem areas to be addressed.

  11. Classification accuracy of claims-based methods for identifying providers failing to meet performance targets.

    PubMed

    Hubbard, Rebecca A; Benjamin-Johnson, Rhondee; Onega, Tracy; Smith-Bindman, Rebecca; Zhu, Weiwei; Fenton, Joshua J

    2015-01-15

    Quality assessment is critical for healthcare reform, but data sources are lacking for measurement of many important healthcare outcomes. With over 49 million people covered by Medicare as of 2010, Medicare claims data offer a potentially valuable source that could be used in targeted health care quality improvement efforts. However, little is known about the operating characteristics of provider profiling methods using claims-based outcome measures that may estimate provider performance with error. Motivated by the example of screening mammography performance, we compared approaches to identifying providers failing to meet guideline targets using Medicare claims data. We used data from the Breast Cancer Surveillance Consortium and linked Medicare claims to compare claims-based and clinical estimates of cancer detection rate. We then demonstrated the performance of claims-based estimates across a broad range of operating characteristics using simulation studies. We found that identification of poor-performing providers was extremely sensitive to algorithm specificity, with no approach identifying more than 65% of poor-performing providers when claims-based measures had specificity of 0.995 or less. We conclude that claims have the potential to contribute important information on healthcare outcomes to quality improvement efforts. However, to achieve this potential, development of highly accurate claims-based outcome measures should remain a priority. Copyright © 2014 John Wiley & Sons, Ltd.
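
    The specificity finding is easy to reproduce in miniature: because screen-detected cancers are rare (a few per 1000 examinations), even a 0.5% false-positive rate in a claims-based measure overwhelms the true signal. A toy simulation under assumed rates, not the paper's data or model:

```python
import numpy as np

rng = np.random.default_rng(9)

def flagged_fraction(sens=0.9, spec=0.995, n_prov=500, exams=2000,
                     target=0.003):
    """Fraction of truly poor-performing providers (true detection rate
    below target) that a claims-based measure with the given sensitivity
    and specificity flags as below target."""
    true_rate = rng.uniform(0.002, 0.006, n_prov)   # per-exam detection
    truly_poor = true_rate < target
    cancers = rng.binomial(exams, true_rate)
    # claims measure misses some true cancers and adds false positives
    measured = (rng.binomial(cancers, sens)
                + rng.binomial(exams - cancers, 1 - spec))
    flagged = measured / exams < target
    return (flagged & truly_poor).sum() / max(truly_poor.sum(), 1)

for spec in (1.0, 0.999, 0.995):
    print("specificity", spec, "->", flagged_fraction(spec=spec))
```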

  12. Estimating abundance of adult striped bass in reservoirs using mobile hydroacoustics

    USGS Publications Warehouse

    Hightower, Joseph E.; Taylor, J. Christopher; Degan, Donald J.

    2013-01-01

    Hydroacoustic surveys have proven valuable for estimating reservoir forage fish abundance but are more challenging for adult predators such as striped bass Morone saxatilis. Difficulties in assessing striped bass in reservoirs include their low density and the inability to distinguish species with hydroacoustic data alone. Despite these difficulties, mobile hydroacoustic surveys have potential to provide useful data for management because of the large sample volume compared to traditional methods such as gill netting and the ability to target specific areas where striped bass are aggregated. Hydroacoustic estimates of reservoir striped bass have been made using mobile surveys, with data analysis using a threshold for target strength in order to focus on striped bass-sized targets, and auxiliary sampling with nets to obtain species composition. We provide recommendations regarding survey design, based in part on simulations that provide insight on the level of effort that would be required to achieve reasonable estimates of abundance. Future surveys may be able to incorporate telemetry or other sonar techniques such as side-scan or multibeam in order to focus survey efforts on productive habitats (within lake and vertically). However, species apportionment will likely remain the main source of error, and we see no hydroacoustic system on the horizon that will identify fish by species at the spatial and temporal scale required for most reservoir surveys. In situations where species composition can be reliably assessed using traditional gears, abundance estimates from hydroacoustic methods should be useful to fishery managers interested in developing harvest regulations, assessing survival of stocked juveniles, identifying seasonal aggregations, and examining predator–prey balance.

  13. Marine atmospheric effects on electro-optical systems performance

    NASA Astrophysics Data System (ADS)

    Richter, Juergen H.; Hughes, Herbert G.

    1990-09-01

    For the past twelve years, a coordinated tri-service effort has been underway in the United States Department of Defense to provide an atmospheric effects assessment capability for existing and planned electro-optical (EO) systems. This paper reviews the exploratory development effort in the US Navy. A key responsibility for the Navy was the development of marine aerosol models. An initial model, the Navy Aerosol Model (NAM), was developed, tested, and transitioned into LOWTRAN 6. A more comprehensive model, the Navy Oceanic Vertical Aerosol Model (NOVAM), has been formulated and is presently undergoing comprehensive evaluation and testing. Marine aerosols and their extinction properties are only one important factor in EO systems performance assessment. For many EO systems applications, an accurate knowledge of marine background radiances is required in addition to considering the effects of the intervening atmosphere. Accordingly, a capability was developed to estimate the apparent sea surface radiance for different sea states and meteorological conditions. Also, an empirical relationship was developed which directly relates apparent mean sea temperature to calculated mean sky temperature. In situ measurements of relevant environmental parameters are essential for real-time EO systems performance assessment. Direct measurement of slant path extinction would be most desirable. This motivated a careful investigation of lidar (light detection and ranging) techniques, including improvements to single-ended lidar profile inversion algorithms and development of new lidar techniques such as double-ended and dual-angle configurations. It was concluded that single-ended, single-frequency lidars cannot be used to infer slant path extinction with an accuracy necessary to make meaningful performance assessments. Other lidar configurations may find limited application in model validation and research efforts. No technique has emerged yet which could be considered ready for shipboard implementation. A shipboard real-time performance assessment system was developed and named PREOS (Performance and Range for EO Systems). PREOS has been incorporated into the Navy's Tactical Environmental Support System (TESS). The present version of PREOS is a first step in accomplishing the complex task of real-time systems performance assessment. Improved target and background models are under development and will be incorporated into TESS when tested and validated. A reliable assessment capability can be used to develop Tactical Decision Aids (TDAs). TDAs permit optimum selection or combination of sensors and estimation of a ship's own vulnerability against hostile systems.

  14. Enhanced chemical weapon warning via sensor fusion

    NASA Astrophysics Data System (ADS)

    Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James

    2011-05-01

    Torch Technologies, Inc., is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovation Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non-real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA, in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor-network-based data fusion concepts using CW and non-CW ancillary sensor data to improve CW warning and detection in tactical scenarios.

  15. Can we achieve Millennium Development Goal 4? New analysis of country trends and forecasts of under-5 mortality to 2015.

    PubMed

    Murray, Christopher J L; Laakso, Thomas; Shibuya, Kenji; Hill, Kenneth; Lopez, Alan D

    2007-09-22

    Global efforts have increased the accuracy and timeliness of estimates of under-5 mortality; however, these estimates fail to use all data available, do not use transparent and reproducible methods, do not distinguish predictions from measurements, and provide no indication of uncertainty around point estimates. We aimed to develop new reproducible methods and reanalyse existing data to elucidate detailed time trends. We merged available databases, added to them when possible, and then applied Loess regression to estimate past trends and forecast to 2015 for 172 countries. We developed uncertainty estimates based on different model specifications and estimated levels and trends in neonatal, post-neonatal, and childhood mortality. Global under-5 mortality has fallen from 110 (109-110) per 1000 in 1980 to 72 (70-74) per 1000 in 2005. Child deaths worldwide have decreased from 13.5 (13.4-13.6) million in 1980 to an estimated 9.7 (9.5-10.0) million in 2005. Global under-5 mortality is expected to decline by 27% from 1990 to 2015, substantially less than the target of Millennium Development Goal 4 (MDG4) of a 67% decrease. Several regions in Latin America, north Africa, the Middle East, Europe, and southeast Asia have had consistent annual rates of decline in excess of 4% over 35 years. Global progress on MDG4 is dominated by slow reductions in sub-Saharan Africa, which also has the slowest rates of decline in fertility. Globally, we are not doing a better job of reducing child mortality now than we were three decades ago. Further improvements in the quality and timeliness of child-mortality measurements should be possible by more fully using existing datasets and applying standard analytical strategies.
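
    In the spirit of the trend analysis described above, the sketch below fits a weighted local-linear (loess-style) trend to a country's log mortality series and extrapolates it to 2015. The series, window, and weights are hypothetical; the paper's specification and its uncertainty propagation are more involved.

```python
import numpy as np

def local_linear_forecast(years, log_u5mr, target, span=0.5):
    """Tricube-weighted local-linear fit anchored at the most recent
    data, extrapolated to a target year (a toy trend forecaster)."""
    years = np.asarray(years, float)
    d = np.abs(years - years.max())
    h = span * (years.max() - years.min())
    w = np.clip(1 - (d / h)**3, 0, None)**3     # tricube weights
    W = np.diag(w)
    X = np.column_stack([np.ones_like(years), years])
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.asarray(log_u5mr))
    return np.exp(beta[0] + beta[1] * target)

# Hypothetical country series: under-5 deaths per 1000 live births
yrs = np.arange(1980, 2006)
u5 = 110 * np.exp(-0.017 * (yrs - 1980))
print("forecast for 2015:", local_linear_forecast(yrs, np.log(u5), 2015))
```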

  16. A synthetic phylogeny of freshwater crayfish: insights for conservation.

    PubMed

    Owen, Christopher L; Bracken-Grissom, Heather; Stern, David; Crandall, Keith A

    2015-02-19

    Phylogenetic systematics is heading for a renaissance where we shift from considering our phylogenetic estimates as a static image in a published paper and taxonomies as a hardcopy checklist to treating both the phylogenetic estimate and dynamic taxonomies as metadata for further analyses. The Open Tree of Life project (opentreeoflife.org) is developing synthesis tools for harnessing the power of phylogenetic inference and robust taxonomy to develop a synthetic tree of life. We capitalize on this approach to estimate a synthesis tree for the freshwater crayfish. The crayfish make an exceptional group to demonstrate the utility of the synthesis approach, as there recently have been a number of phylogenetic studies on the crayfishes along with a robust underlying taxonomic framework. Importantly, the crayfish have also been extensively assessed by an IUCN Red List team and therefore have accurate and up-to-date area and conservation status data available for analysis within a phylogenetic context. Here, we develop a synthesis phylogeny for the world's freshwater crayfish and examine the phylogenetic distribution of threat. We also estimate a molecular phylogeny based on all available GenBank crayfish sequences and use this tree to estimate divergence times and test for divergence rate variation. Finally, we conduct EDGE and HEDGE analyses and identify a number of species of freshwater crayfish of highest priority in conservation efforts. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  17. A synthetic phylogeny of freshwater crayfish: insights for conservation

    PubMed Central

    Owen, Christopher L.; Bracken-Grissom, Heather; Stern, David; Crandall, Keith A.

    2015-01-01

    Phylogenetic systematics is heading for a renaissance where we shift from considering our phylogenetic estimates as a static image in a published paper and taxonomies as a hardcopy checklist to treating both the phylogenetic estimate and dynamic taxonomies as metadata for further analyses. The Open Tree of Life project (opentreeoflife.org) is developing synthesis tools for harnessing the power of phylogenetic inference and robust taxonomy to develop a synthetic tree of life. We capitalize on this approach to estimate a synthesis tree for the freshwater crayfish. The crayfish make an exceptional group to demonstrate the utility of the synthesis approach, as there recently have been a number of phylogenetic studies on the crayfishes along with a robust underlying taxonomic framework. Importantly, the crayfish have also been extensively assessed by an IUCN Red List team and therefore have accurate and up-to-date area and conservation status data available for analysis within a phylogenetic context. Here, we develop a synthesis phylogeny for the world's freshwater crayfish and examine the phylogenetic distribution of threat. We also estimate a molecular phylogeny based on all available GenBank crayfish sequences and use this tree to estimate divergence times and test for divergence rate variation. Finally, we conduct EDGE and HEDGE analyses and identify a number of species of freshwater crayfish of highest priority in conservation efforts. PMID:25561670

  18. Effort estimation for enterprise resource planning implementation projects using social choice - a comparative study

    NASA Astrophysics Data System (ADS)

    Koch, Stefan; Mitlöhner, Johann

    2010-08-01

    ERP implementation projects have received enormous attention in recent years, owing to their importance for organisations as well as the costs and risks involved. Estimating the effort and costs associated with new projects is therefore an important topic. Unfortunately, there is still a lack of models that can cope with the special characteristics of these projects. Because the main focus lies in adapting and customising a complex system, and even changing the organisation, traditional models like COCOMO cannot easily be applied. In this article, we apply effort estimation based on social choice in this context. Social choice deals with aggregating the preferences of a number of voters into a collective preference, and we apply this idea by substituting project attributes for the voters. Instead of supplying numeric values for various project attributes, a new project therefore only needs to be placed into a ranking per attribute, requiring only ordinal values, and the resulting aggregate ranking can be used to derive an estimate. We describe the estimation process using a data set of 39 projects and compare the results to other approaches proposed in the literature.
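
    As a rough sketch of the rank-aggregation idea (not the authors' exact procedure; the project names, attribute rankings, and effort values below are invented for illustration), a Borda-style aggregation with interpolation between bracketing neighbours might look like this:

        def borda_scores(rankings):
            # Each ranking orders project ids from least to most demanding;
            # a project earns (n - position) points per attribute ranking.
            scores = {}
            for ranking in rankings:
                n = len(ranking)
                for position, project in enumerate(ranking):
                    scores[project] = scores.get(project, 0) + (n - position)
            return scores

        # Three attribute rankings over four historical projects and one new project:
        rankings = [["P1", "P2", "Pnew", "P3", "P4"],
                    ["P2", "P1", "Pnew", "P4", "P3"],
                    ["P1", "Pnew", "P2", "P3", "P4"]]
        effort = {"P1": 400, "P2": 650, "P3": 1200, "P4": 1500}  # person-days

        scores = borda_scores(rankings)
        s_new = scores["Pnew"]
        # A higher aggregate score means "ranked as less demanding", so the new
        # project's effort is interpolated between its bracketing neighbours.
        above = [(s, p) for p, s in scores.items() if p in effort and s >= s_new]
        below = [(s, p) for p, s in scores.items() if p in effort and s <= s_new]
        s_hi, p_hi = min(above) if above else max(below)
        s_lo, p_lo = max(below) if below else min(above)
        if s_hi == s_lo:
            estimate = effort[p_hi]
        else:
            t = (s_new - s_lo) / (s_hi - s_lo)
            estimate = effort[p_lo] + t * (effort[p_hi] - effort[p_lo])
        print(round(estimate), "person-days")   # ~807 for these made-up inputs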

  19. Summary of the DREAM8 Parameter Estimation Challenge: Toward Parameter Identification for Whole-Cell Models.

    PubMed

    Karr, Jonathan R; Williams, Alex H; Zucker, Jeremy D; Raue, Andreas; Steiert, Bernhard; Timmer, Jens; Kreutz, Clemens; Wilkinson, Simon; Allgood, Brandon A; Bot, Brian M; Hoff, Bruce R; Kellen, Michael R; Covert, Markus W; Stolovitzky, Gustavo A; Meyer, Pablo

    2015-05-01

    Whole-cell models that explicitly represent all cellular components at the molecular level have the potential to predict phenotype from genotype. However, even for simple bacteria, whole-cell models will contain thousands of parameters, many of which are poorly characterized or unknown. New algorithms are needed to estimate these parameters and enable researchers to build increasingly comprehensive models. We organized the Dialogue for Reverse Engineering Assessments and Methods (DREAM) 8 Whole-Cell Parameter Estimation Challenge to develop new parameter estimation algorithms for whole-cell models. We asked participants to identify a subset of parameters of a whole-cell model given the model's structure and in silico "experimental" data. Here we describe the challenge, the best performing methods, and new insights into the identifiability of whole-cell models. We also describe several valuable lessons we learned toward improving future challenges. Going forward, we believe that collaborative efforts supported by inexpensive cloud computing have the potential to solve whole-cell model parameter estimation.

  20. Summary of the DREAM8 Parameter Estimation Challenge: Toward Parameter Identification for Whole-Cell Models

    PubMed Central

    Karr, Jonathan R.; Williams, Alex H.; Zucker, Jeremy D.; Raue, Andreas; Steiert, Bernhard; Timmer, Jens; Kreutz, Clemens; Wilkinson, Simon; Allgood, Brandon A.; Bot, Brian M.; Hoff, Bruce R.; Kellen, Michael R.; Covert, Markus W.; Stolovitzky, Gustavo A.; Meyer, Pablo

    2015-01-01

    Whole-cell models that explicitly represent all cellular components at the molecular level have the potential to predict phenotype from genotype. However, even for simple bacteria, whole-cell models will contain thousands of parameters, many of which are poorly characterized or unknown. New algorithms are needed to estimate these parameters and enable researchers to build increasingly comprehensive models. We organized the Dialogue for Reverse Engineering Assessments and Methods (DREAM) 8 Whole-Cell Parameter Estimation Challenge to develop new parameter estimation algorithms for whole-cell models. We asked participants to identify a subset of parameters of a whole-cell model given the model’s structure and in silico “experimental” data. Here we describe the challenge, the best performing methods, and new insights into the identifiability of whole-cell models. We also describe several valuable lessons we learned toward improving future challenges. Going forward, we believe that collaborative efforts supported by inexpensive cloud computing have the potential to solve whole-cell model parameter estimation. PMID:26020786

  1. A novel approach for analyzing data on recurrent events with duration to estimate the combined cumulative rate of both variables over time.

    PubMed

    Bhattacharya, Sudipta

    2018-06-01

    Recurrent adverse events, once they occur, often continue for some duration of time in clinical trials, and the number of events together with their durations is clinically considered a measure of the severity of the disease under study. While methods are available for analysing recurrent events, durations, or both side by side, no effort has been made so far to combine them into a single measure. Such a single-valued combined measure may help clinicians assess the overall effect of a recurrent condition comprising both events and durations. A non-parametric approach is adopted here to develop an estimator of the combined rate of both event recurrence and event continuation, that is, the duration per event. The proposed estimator produces a single numerical value, whose interpretation and meaningfulness are discussed through the analysis of a real-life clinical dataset. The algebraic expression of the variance is derived, the asymptotic normality of the estimator is noted, and a demonstration is provided of how the estimator can be used in statistical hypothesis testing. Further possible development of the estimator is also noted, to adjust for the dependence of event occurrences on the history of the process generating recurrent events through covariates, and for the case of dependent censoring.
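
    The paper's estimator is not reproduced here, but the underlying idea of folding event counts and durations into one cumulative quantity can be illustrated with a deliberately simple toy calculation (the subject data and the particular combination rule below are invented):

        # Each subject: list of (event_start_day, duration_in_days) pairs.
        subjects = [[(10, 3), (40, 5)],
                    [(25, 2)],
                    [(5, 1), (30, 4), (60, 2)]]

        def combined_cumulative(t):
            # Toy combined measure: each event started by day t contributes
            # 1 (the occurrence) plus its duration in days, averaged over subjects.
            totals = [sum(1 + d for start, d in events if start <= t)
                      for events in subjects]
            return sum(totals) / len(subjects)

        for t in (20, 45, 70):
            print(t, combined_cumulative(t))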

  2. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. One such method is to elicit expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs was funded by NASA under the grant "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The present project was funded separately but supported the existing Rutgers program.

  3. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model; in fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single-variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data, applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; that technology development as a function of time reduces cost at the rate of 50% per 17 years; that it costs less per square meter of collecting aperture to build a large telescope than a small one; and that increasing mass reduces cost.
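
    As a hedged illustration of what a single-variable CER of this shape looks like (the data points and fitted coefficients below are invented, not the paper's; only the 50%-per-17-years discount is taken from the abstract), one can fit a power law in aperture with an exponential technology-year adjustment:

        import numpy as np

        # Hypothetical telescopes: aperture D (m), development year, cost ($M)
        D    = np.array([0.9, 2.4, 3.5, 6.5])
        year = np.array([1983, 1990, 2009, 2021])
        cost = np.array([150.0, 1200.0, 900.0, 4000.0])

        # Model: cost = a * D**b * 0.5**((year - 1990) / 17)
        # Linearize: log(cost) - log-discount = log(a) + b * log(D)
        log_adj = np.log(cost) - (year - 1990) / 17.0 * np.log(0.5)
        A = np.vstack([np.ones_like(D), np.log(D)]).T
        (loga, b), *_ = np.linalg.lstsq(A, log_adj, rcond=None)
        print(f"cost ~ {np.exp(loga):.0f} * D^{b:.2f} * 0.5^((year-1990)/17)  [$M]")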

  4. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.

    1995-01-01

    The general goal of this project is to establish design protocols that enable the engineer to analyze and predict certain types of behavior in ceramic composites. Sections of the final report address the following: the problem that motivated the technology development; the new technology that was developed; unique and novel features of the technology and results/benefits of application (year-by-year accomplishments); and utilization of the new technology in non-aerospace applications. Activities for this reporting period included the development of a design analysis as part of a cooperative agreement with General Electric Aircraft Engines. The effort focused on modifying the Toughened Ceramics Analysis and Reliability Evaluation of Structures (TCARES) algorithm for use in the design of engine components fabricated from NiAl. Other activities related to the development of an ASTM standard practice for estimating Weibull parameters. The standard focuses on the evaluation and reporting of uniaxial strength data and on the estimation of probability distribution parameters for ceramics that fail in a brittle fashion.
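
    For the Weibull-parameter-estimation piece, a minimal sketch of a two-parameter maximum-likelihood fit (the strength data below are hypothetical; the ASTM practice adds reporting and bias-correction requirements beyond this) is:

        import numpy as np
        from scipy import stats

        # Hypothetical uniaxial strength data for a brittle ceramic (MPa):
        strengths = np.array([312, 348, 355, 367, 381, 392, 404, 415, 427, 450])

        # Two-parameter Weibull fit (location fixed at zero) via maximum likelihood:
        m, _, sigma0 = stats.weibull_min.fit(strengths, floc=0)
        print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")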

  5. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  6. New Methodology for Estimating Fuel Economy by Vehicle Class

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    This effort was undertaken for the Federal Highway Administration's Office of Highway Policy Information to develop a new methodology for generating annual estimates of average fuel efficiency and the number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology takes a two-step approach. First, preliminary fuel-efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel-consumption rates from the vehicle stock models so that they match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model uses a systematic approach that produces documentable and reproducible results. The basic framework uses a mathematical programming formulation to minimize the deviations between the fuel-economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current-year Highway Statistics. The results generated from this new approach provide a smoother time series for fuel economies by vehicle class. The approach also uses the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
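
    A minimal sketch of such a reconciliation step (vehicle classes, VMT, and fuel totals below are invented; the published model is more elaborate) can be written as constrained least squares:

        import numpy as np
        from scipy.optimize import minimize

        mpg_prior  = np.array([22.0, 17.0, 7.0])         # prior MPG by class
        vmt        = np.array([1.1e12, 6.0e11, 1.8e11])  # annual miles by class
        total_fuel = 1.15e11                             # gallons, reported total

        # Adjust the prior MPGs as little as possible while forcing the class
        # fuel consumptions (VMT / MPG) to sum to the reported total:
        res = minimize(lambda mpg: np.sum((mpg - mpg_prior) ** 2),
                       mpg_prior,
                       constraints=[{"type": "eq",
                                     "fun": lambda mpg: np.sum(vmt / mpg) - total_fuel}])
        print(res.x)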

  7. Implementation and testing of a sensor-netting algorithm for early warning and high confidence C/B threat detection

    NASA Astrophysics Data System (ADS)

    Gruber, Thomas; Grim, Larry; Fauth, Ryan; Tercha, Brian; Powell, Chris; Steinhardt, Kristin

    2011-05-01

    Large networks of disparate chemical/biological (C/B) sensors, MET sensors, and intelligence, surveillance, and reconnaissance (ISR) sensors reporting to various command/display locations can lead to conflicting threat information, questions of alarm confidence, and confused situational awareness. Sensor-netting algorithms (SNA) are being developed to resolve these conflicts and to report high-confidence consensus threat-map data products on a common operating picture (COP) display. A data fusion algorithm design was completed in a Phase I SBIR effort, and development continues in the Phase II SBIR effort. Initial implementation and testing of the algorithm have produced some performance results. The algorithm accepts point and/or standoff sensor data, and event-detection data (e.g., the location of an explosion) from various ISR sensors (e.g., acoustic sensors, infrared cameras). These input data are preprocessed to assign an estimated uncertainty to each incoming piece of data. The data are then sent to a weighted tomography process to obtain a consensus threat map, including estimated uncertainty in the threat concentration levels. The threat map is then tested for consistency, and the overall confidence for the map result is estimated. The map and confidence results are displayed on a COP. The benefits of a modular implementation of the algorithm and comparisons of fused versus unfused data results will be presented. The metrics for judging sensor-netting algorithm performance are warning time, threat-map accuracy (as compared to ground truth), false alarm rate, and false alarm rate versus reported threat confidence level.
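
    The abstract outlines the pipeline without the fusion mathematics; the flavor of uncertainty-weighted consensus with a consistency check can be sketched for a single map cell (all numbers invented, and this is not the Phase II algorithm itself):

        import numpy as np

        readings = np.array([3.2, 2.6, 4.1])  # concentration estimates, mg/m^3
        sigma    = np.array([0.5, 1.2, 0.8])  # assigned 1-sigma uncertainties

        w = 1.0 / sigma**2
        consensus = np.sum(w * readings) / np.sum(w)
        consensus_sigma = 1.0 / np.sqrt(np.sum(w))

        # Consistency test: chi-square of sensor residuals about the consensus;
        # a large value flags conflicting sensors and lowers map confidence.
        chi2 = np.sum(((readings - consensus) / sigma) ** 2)
        print(f"{consensus:.2f} +/- {consensus_sigma:.2f} mg/m^3, chi2 = {chi2:.2f}")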

  8. Measuring Coverage in MNCH: Tracking Progress in Health for Women and Children Using DHS and MICS Household Surveys

    PubMed Central

    Hancioglu, Attila; Arnold, Fred

    2013-01-01

    Household surveys are the primary data source of coverage indicators for children and women for most developing countries. Most of this information is generated by two global household survey programmes—the USAID-supported Demographic and Health Surveys (DHS) and the UNICEF-supported Multiple Indicator Cluster Surveys (MICS). In this review, we provide an overview of these two programmes, which cover a wide range of child and maternal health topics and provide estimates of many Millennium Development Goal indicators, as well as estimates of the indicators for the Countdown to 2015 initiative and the Commission on Information and Accountability for Women's and Children's Health. MICS and DHS collaborate closely and work through interagency processes to ensure that survey tools are harmonized and comparable as far as possible, but we highlight differences between DHS and MICS in the population covered and the reference periods used to measure coverage. These differences need to be considered when comparing estimates of reproductive, maternal, newborn, and child health indicators across countries and over time and we discuss the implications of these differences for coverage measurement. Finally, we discuss the need for survey planners and consumers of survey results to understand the strengths, limitations, and constraints of coverage measurements generated through household surveys, and address some technical issues surrounding sampling and quality control. We conclude that, although much effort has been made to improve coverage measurement in household surveys, continuing efforts are needed, including further research to improve and refine survey methods and analytical techniques. PMID:23667333

  9. Producing HIV estimates: from global advocacy to country planning and impact measurement

    PubMed Central

    Mahy, Mary; Brown, Tim; Stover, John; Walker, Neff; Stanecki, Karen; Kirungi, Wilford; Garcia-Calleja, Txema; Ghys, Peter D.

    2017-01-01

    Background: The development of global HIV estimates has been critical for understanding, advocating for, and funding the HIV response. The process of generating HIV estimates has been cited as the gold standard for public health estimates. Objective: This paper provides important lessons from an international scientific collaboration and a useful model for those producing public health estimates in other fields. Design: Through the compilation and review of published journal articles, United Nations reports, other documents and personal experience, we compiled historical information about the estimates and identified potential lessons for other public health estimation efforts. Results: Through the development of core partnerships with country teams, implementers, demographers, mathematicians, epidemiologists and international organizations, UNAIDS has led a process to develop the capacity of country teams to produce internationally comparable HIV estimates. The guidance provided by these experts has led to refinements in the estimated numbers of people living with HIV, new HIV infections and AIDS-related deaths over the past 20 years. A number of important updates to the methods since 1997 resulted in fluctuations in the estimated levels, trends and impact of HIV. The largest correction occurred between the 2005 and 2007 rounds with the addition of household survey data to the models. In 2001, the UNAIDS models of the time estimated that there were 40 million people living with HIV; in 2016, improved models estimated that 30 million (27.6–32.7 million) people had been living with HIV in 2001. Conclusions: Country ownership of the estimation tools has allowed the results to be put to more uses than if they had been produced by researchers or a team in Geneva. Guidance from a reference group and input from country teams have led to critical improvements in the models over time. Those changes have improved countries’ and stakeholders’ understanding of the HIV epidemic. PMID:28532304

  10. Effort testing in children: can cognitive and symptom validity measures differentiate malingered performances?

    PubMed

    Rambo, Philip L; Callahan, Jennifer L; Hogan, Lindsey R; Hullmann, Stephanie; Wrape, Elizabeth

    2015-01-01

    Recent efforts have contributed to significant advances in the detection of malingered performances by adults during cognitive assessment. However, children's ability to purposefully underperform has received relatively little attention. The purpose of the present investigation was to examine children's performances on common intellectual measures, as well as two symptom validity measures: the Test of Memory Malingering and the Dot-Counting Test. This was accomplished through the administration of measures to children aged 6 to 12 years in randomly assigned full-effort (control) and poor-effort (treatment) conditions. Prior to randomization, children's general intellectual functioning (i.e., IQ) was estimated via administration of the Kaufman Brief Intelligence Test, Second Edition (KBIT-2). Multivariate analyses revealed that the conditions significantly differed on some but not all administered measures. Specifically, children's estimated IQ in the treatment condition differed significantly from the full-effort IQ initially obtained from the same children on the KBIT-2, as well as from the IQs obtained in the full-effort control condition. These findings suggest that children are fully capable of willfully underperforming during cognitive testing; however, consistent with prior investigations, some measures show greater sensitivity than others in evaluating effort.

  11. An Empirical Comparison between Two Recursive Filters for Attitude and Rate Estimation of Spinning Spacecraft

    NASA Technical Reports Server (NTRS)

    Harman, Richard R.

    2006-01-01

    The advantages of inducing a constant spin rate on a spacecraft are well known. A variety of science missions have used this technique as a relatively low cost method for conducting science. Starting in the late 1970s, NASA focused on building spacecraft using 3-axis control as opposed to the single-axis control mentioned above. Considerable effort was expended toward sensor and control system development, as well as the development of ground systems to independently process the data. As a result, spinning spacecraft development and their resulting ground system development stagnated. In the 1990s, shrinking budgets made spinning spacecraft an attractive option for science. The attitude requirements for recent spinning spacecraft are more stringent and the ground systems must be enhanced in order to provide the necessary attitude estimation accuracy. Since spinning spacecraft (SC) typically have no gyroscopes for measuring attitude rate, any new estimator would need to rely on the spacecraft dynamics equations. One estimation technique that utilized the SC dynamics and has been used successfully in 3-axis gyro-less spacecraft ground systems is the pseudo-linear Kalman filter algorithm. Consequently, a pseudo-linear Kalman filter has been developed which directly estimates the spacecraft attitude quaternion and rate for a spinning SC. Recently, a filter using Markley variables was developed specifically for spinning spacecraft. The pseudo-linear Kalman filter has the advantage of being easier to implement but estimates the quaternion which, due to the relatively high spinning rate, changes rapidly for a spinning spacecraft. The Markley variable filter is more complicated to implement but, being based on the SC angular momentum, estimates parameters which vary slowly. This paper presents a comparison of the performance of these two filters. Monte-Carlo simulation runs will be presented which demonstrate the advantages and disadvantages of both filters.
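
    The abstract does not give either filter's equations. As a hedged generic sketch of the pseudo-linear idea, the nonlinear dynamics are written as x_{k+1} = F(x_k) x_k and an ordinary Kalman cycle is run with F re-evaluated at the current estimate; for a spinning spacecraft the state vector would hold the attitude quaternion and body rates (the function and argument names below are illustrative):

        import numpy as np

        def pseudo_linear_kf_step(x, P, z, F_of_x, H, Q, R):
            # Predict with the state-dependent transition matrix, then apply
            # the standard linear Kalman measurement update.
            F = F_of_x(x)
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new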

  12. Formation Flying Satellite Control Around the L2 Sun-Earth Libration Point

    NASA Technical Reports Server (NTRS)

    Hamilton, Nicholas H.; Folta, David; Carpenter, Russell; Bauer, Frank (Technical Monitor)

    2002-01-01

    This paper discusses the development of a linear control algorithm for formations in the vicinity of the L2 sun-Earth libration point. The development of a simplified extended Kalman filter is included as well. Simulations are created for the analysis of the stationkeeping and various formation maneuvers of the Stellar Imager mission. The simulations provide tracking error, estimation error, and control effort results. For formation maneuvering, the formation spacecraft track to within 4 meters of their desired position and within 1.5 millimeters per second of their desired zero velocity. The filter, with few exceptions, keeps the estimation errors within their three-sigma values. Without noise, the controller performs extremely well, with the formation spacecraft tracking to within several micrometers. Each spacecraft uses around 1 to 2 grams of propellant per maneuver, depending on the circumstances.

  13. Utilizing Ion-Mobility Data to Estimate Molecular Masses

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Kanik, Isik

    2008-01-01

    A method is being developed for utilizing readings of an ion-mobility spectrometer (IMS) to estimate the molecular masses of ions that have passed through the spectrometer. The method involves the use of (1) feature-based descriptors of the structures of molecules of interest and (2) reduced ion mobilities calculated from IMS readings as inputs to (3) a neural network. This development is part of a larger effort to enable the use of IMSs as relatively inexpensive, robust, lightweight instruments to identify, via molecular masses, individual compounds or groups of compounds (especially organic compounds) that may be present in specific environments or samples. Potential applications include detection of organic molecules as signs of life on remote planets, modeling and detection of biochemicals of interest in the pharmaceutical and agricultural industries, and detection of chemical and biological hazards in industrial and homeland-security settings.
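
    A toy stand-in for the descriptor-plus-mobility regression (synthetic data; the actual descriptors and network architecture are not specified in the abstract) might use a small multilayer perceptron:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Columns: three made-up structural descriptors plus reduced mobility K0.
        X = rng.normal(size=(200, 4))
        # Synthetic "molecular mass" depending nonlinearly on the inputs:
        mass = 150 + 40 * X[:, 0] - 25 * X[:, 3] + 10 * X[:, 1] * X[:, 2]

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                           random_state=0).fit(X[:150], mass[:150])
        print("held-out R^2:", round(net.score(X[150:], mass[150:]), 2))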

  14. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    PubMed

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.

  15. ExMC Work Prioritization Process

    NASA Technical Reports Server (NTRS)

    Simon, Matthew

    2015-01-01

    Last year, NASA's Human Research Program (HRP) introduced the concept of a "Path to Risk Reduction" (PRR), which will provide a roadmap that shows how the work being done within each HRP element can be mapped to reducing or closing exploration risks. Efforts are currently underway within the Exploration Medical Capability (ExMC) Element to develop a structured, repeatable process for prioritizing work utilizing decision analysis techniques and risk estimation tools. The goal of this effort is to ensure that the work done within the element maximizes risk reduction for future exploration missions in a quantifiable way and better aligns with the intent and content of the Path to Risk Reduction. The Integrated Medical Model (IMM) will be used to identify those conditions that are major contributors of medical risk for a given design reference mission. For each of these conditions, potential prevention, screening, diagnosis, and treatment methods will be identified. ExMC will then aim to prioritize its potential investments in these mitigation methods based upon their potential for risk reduction and other factors such as vehicle performance impacts, near term schedule needs, duplication with external efforts, and cost. This presentation will describe the process developed to perform this prioritization and inform investment discussions in future element planning efforts. It will also provide an overview of the required input information, types of process participants, figures of merit, and the expected outputs of the process.

  16. Precision and relative effectiveness of a purse seine for sampling age-0 river herring in lakes

    USGS Publications Warehouse

    Devine, Matthew T.; Roy, Allison; Whiteley, Andrew R.; Gahagan, Benjamin I.; Armstrong, Michael P.; Jordaan, Adrian

    2018-01-01

    Stock assessments for anadromous river herring, collectively Alewife Alosa pseudoharengus and Blueback Herring A. aestivalis, lack adequate demographic information, particularly with respect to early life stages. Although sampling adult river herring is increasingly common throughout their range, currently no standardized, field‐based, analytical methods exist for estimating juvenile abundance in freshwater lakes. The objective of this research was to evaluate the relative effectiveness and sampling precision of a purse seine for estimating densities of age‐0 river herring in freshwater lakes. We used a purse seine to sample age‐0 river herring in June–September 2015 and June–July 2016 in 16 coastal freshwater lakes in the northeastern USA. Sampling effort varied from two seine hauls to more than 50 seine hauls per lake. Catch rates were highest in June and July, and sampling precision was maximized in July. Sampling at night (versus day) in open water (versus littoral areas) was most effective for capturing newly hatched larvae and juveniles up to ca. 100 mm TL. Bootstrap simulation results indicated that sampling precision of CPUE estimates increased with sampling effort, and there was a clear threshold beyond which increased effort resulted in negligible increases in precision. The effort required to produce precise CPUE estimates, as determined by the CV, was dependent on lake size; river herring densities could be estimated with up to 10 purse‐seine hauls (one‐two nights) in a small lake (<50 ha) and 15–20 hauls (two‐three nights) in a large lake (>50 ha). Fish collection techniques using a purse seine as described in this paper are likely to be effective for estimating recruit abundance of river herring in freshwater lakes across their range.
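
    The precision-versus-effort analysis can be sketched with a simple bootstrap over per-haul catches (the counts below are invented, not the study's data):

        import numpy as np

        rng = np.random.default_rng(1)
        catches = np.array([0, 12, 3, 45, 7, 0, 22, 9, 15, 2, 31, 6])  # fish per haul

        def cpue_cv(n_hauls, n_boot=2000):
            # Resample n_hauls hauls with replacement; CV of the mean CPUE.
            means = np.array([rng.choice(catches, n_hauls, replace=True).mean()
                              for _ in range(n_boot)])
            return means.std(ddof=1) / means.mean()

        for n in (2, 5, 10, 15, 20):
            print(n, round(cpue_cv(n), 3))   # CV shrinks with diminishing returns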

  17. Estimations of relative effort during sit-to-stand increase when accounting for variations in maximum voluntary torque with joint angle and angular velocity.

    PubMed

    Bieryla, Kathleen A; Anderson, Dennis E; Madigan, Michael L

    2009-02-01

    The main purpose of this study was to compare three methods of determining relative effort during sit-to-stand (STS). Fourteen young (mean ± SD: 19.6 ± 1.2 years) and 17 older (61.7 ± 5.5 years) adults completed six STS trials at three speeds: slow, normal, and fast. Sagittal-plane joint torques at the hip, knee, and ankle were calculated through inverse dynamics. Isometric and isokinetic maximum voluntary contractions (MVC) for the hip, knee, and ankle were collected and used as model parameters to predict participant-specific maximum voluntary joint torque. Three different measures of relative effort were determined by normalizing STS joint torques to three different estimates of maximum voluntary torque. Relative efforts at the hip, knee, and ankle were higher when accounting for variations in maximum voluntary torque with joint angle and angular velocity (hip = 26.3 ± 13.5%, knee = 78.4 ± 32.2%, ankle = 27.9 ± 14.1%) than with methods that do not account for these variations (hip = 23.5 ± 11.7%, knee = 51.7 ± 15.0%, ankle = 20.7 ± 10.4%). At higher velocities, the difference between calculating relative effort with respect to isometric MVC and incorporating joint angle and angular velocity became more evident. Estimates of relative effort that account for the variations in maximum voluntary torque with joint angle and angular velocity may provide greater accuracy than methods based on measurements of maximal isometric torque.
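
    In sketch form, the comparison reduces to which denominator normalizes the inverse-dynamics torque. The MVC surface below is a made-up placeholder with invented coefficients, not the study's participant-specific model:

        def relative_effort(tau_sts, tau_max):
            # Percent of available strength used at this instant of the STS.
            return 100.0 * tau_sts / tau_max

        # Placeholder knee MVC model: isometric peak scaled by crude
        # torque-angle and torque-velocity factors.
        def knee_mvc(angle_rad, omega_rad_s, tau_iso=150.0):
            angle_factor = max(0.4, 1.0 - 0.5 * abs(angle_rad - 1.2))
            velocity_factor = max(0.3, 1.0 - 0.25 * max(omega_rad_s, 0.0))
            return tau_iso * angle_factor * velocity_factor

        tau_sts = 60.0  # N*m from inverse dynamics
        print(relative_effort(tau_sts, 150.0))               # isometric-only: 40%
        print(relative_effort(tau_sts, knee_mvc(0.8, 1.5)))  # angle+velocity aware: ~80%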

  18. Antecedent causes of a measles resurgence in the Democratic Republic of the Congo

    PubMed Central

    Scobie, Heather Melissa; Ilunga, Benoît Kebela; Mulumba, Audry; Shidi, Calixte; Coulibaly, Tiekoura; Obama, Ricardo; Tamfum, Jean-Jacques Muyembe; Simbu, Elisabeth Pukuta; Smit, Sheilagh Brigitte; Masresha, Balcha; Perry, Robert Tyrrell; Alleman, Mary Margaret; Kretsinger, Katrina; Goodson, James

    2015-01-01

    Introduction: Despite accelerated measles control efforts, a massive measles resurgence occurred in the Democratic Republic of the Congo (DRC) starting in mid-2010, prompting an investigation into likely causes. Methods: We conducted a descriptive epidemiological analysis using measles immunization and surveillance data to understand the causes of the measles resurgence and to develop recommendations for elimination efforts in DRC. Results: During 2004-2012, performance indicator targets for case-based surveillance and routine measles vaccination were not met. Estimated coverage with the routine first dose of measles-containing vaccine (MCV1) increased from 57% to 73%. Phased supplementary immunization activities (SIAs) were conducted starting in 2002, in some cases with sub-optimal coverage (≤95%). In 2010, SIAs in five of 11 provinces were not implemented as planned, resulting in a prolonged interval between SIAs, and a missed birth cohort in one province. During July 1, 2010-December 30, 2012, high measles attack rates (>100 cases per 100,000 population) occurred in provinces that had estimated MCV1 coverage lower than the national estimate and did not implement planned 2010 SIAs. The majority of confirmed case-patients were aged <10 years (87%) and unvaccinated or with unknown vaccination status (75%). Surveillance detected two genotype B3 and one genotype B2 measles virus strains that were previously identified in the region. Conclusion: The resurgence was likely caused by an accumulation of unvaccinated, measles-susceptible children due to low MCV1 coverage and suboptimal SIA implementation. To achieve the regional goal of measles elimination by 2020, efforts are needed in DRC to improve case-based surveillance and increase two-dose measles vaccination coverage through routine services and SIAs. PMID:26401224

  19. Antecedent causes of a measles resurgence in the Democratic Republic of the Congo.

    PubMed

    Scobie, Heather Melissa; Ilunga, Benoît Kebela; Mulumba, Audry; Shidi, Calixte; Coulibaly, Tiekoura; Obama, Ricardo; Tamfum, Jean-Jacques Muyembe; Simbu, Elisabeth Pukuta; Smit, Sheilagh Brigitte; Masresha, Balcha; Perry, Robert Tyrrell; Alleman, Mary Margaret; Kretsinger, Katrina; Goodson, James

    2015-01-01

    Despite accelerated measles control efforts, a massive measles resurgence occurred in the Democratic Republic of the Congo (DRC) starting in mid-2010, prompting an investigation into likely causes. We conducted a descriptive epidemiological analysis using measles immunization and surveillance data to understand the causes of the measles resurgence and to develop recommendations for elimination efforts in DRC. During 2004-2012, performance indicator targets for case-based surveillance and routine measles vaccination were not met. Estimated coverage with the routine first dose of measles-containing vaccine (MCV1) increased from 57% to 73%. Phased supplementary immunization activities (SIAs) were conducted starting in 2002, in some cases with sub-optimal coverage (≤95%). In 2010, SIAs in five of 11 provinces were not implemented as planned, resulting in a prolonged interval between SIAs, and a missed birth cohort in one province. During July 1, 2010-December 30, 2012, high measles attack rates (>100 cases per 100,000 population) occurred in provinces that had estimated MCV1 coverage lower than the national estimate and did not implement planned 2010 SIAs. The majority of confirmed case-patients were aged <10 years (87%) and unvaccinated or with unknown vaccination status (75%). Surveillance detected two genotype B3 and one genotype B2 measles virus strains that were previously identified in the region. The resurgence was likely caused by an accumulation of unvaccinated, measles-susceptible children due to low MCV1 coverage and suboptimal SIA implementation. To achieve the regional goal of measles elimination by 2020, efforts are needed in DRC to improve case-based surveillance and increase two-dose measles vaccination coverage through routine services and SIAs.

  20. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    NASA Technical Reports Server (NTRS)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated, and statistical procedures are utilized and/or developed to control them wherever possible.
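
    For the third source, the sampling error of a Monte Carlo risk estimate is itself estimable, and it shrinks predictably with the number of replications. A minimal illustration (the 2% failure probability is a toy stand-in, not the study's model):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 10_000
        # Stand-in for n independent runs of the risk model, each flagging
        # whether a damaging carbon-fiber exposure occurred:
        outcomes = rng.random(n) < 0.02
        p_hat = outcomes.mean()
        se = np.sqrt(p_hat * (1.0 - p_hat) / n)   # Monte Carlo sampling error
        print(f"p = {p_hat:.4f} +/- {1.96 * se:.4f} (95% CI); quadruple n to halve se")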

  1. Behavioral Determinants of Switching to Arsenic-Safe Water Wells: An Analysis of a Randomized Controlled Trial of Health Education Interventions Coupled With Water Arsenic Testing

    ERIC Educational Resources Information Center

    George, Christine Marie; Inauen, Jennifer; Perin, Jamie; Tighe, Jennifer; Hasan, Khaled; Zheng, Yan

    2017-01-01

    More than 100 million people globally are estimated to be exposed to arsenic in drinking water that exceeds the World Health Organization guideline of 10 µg/L. In an effort to develop and test a low-cost sustainable approach for water arsenic testing in Bangladesh, we conducted a randomized controlled trial which found arsenic educational…

  2. Multi-Family Group Intervention for OEF/OIF Traumatic Brain Injury Survivors and Their Families

    DTIC Science & Technology

    2010-10-01

    ...considerable layer of complexity to recruitment, especially as the PI and study clinicians were based in psychiatry. It has taken many months to develop...coordination or recruitment efforts by psychiatry with the services diagnosing and treating the vets is complex and time-consuming. In New Jersey...

  3. Improved relocatable over-the-horizon radar detection and tracking using the maximum likelihood adaptive neural system algorithm

    NASA Astrophysics Data System (ADS)

    Perlovsky, Leonid I.; Webb, Virgil H.; Bradley, Scott R.; Hansen, Christopher A.

    1998-07-01

    An advanced detection and tracking system is being developed for the U.S. Navy's Relocatable Over-the-Horizon Radar (ROTHR) to provide improved tracking performance against small aircraft typically used in drug-smuggling activities. The development is based on the Maximum Likelihood Adaptive Neural System (MLANS), a model-based neural network that combines advantages of neural network and model-based algorithmic approaches. The objective of the MLANS tracker development effort is to address user requirements for increased detection and tracking capability in clutter and improved track position, heading, and speed accuracy. The MLANS tracker is expected to outperform other approaches to detection and tracking for the following reasons. It incorporates adaptive internal models of target return signals, target tracks and maneuvers, and clutter signals, which leads to concurrent clutter suppression, detection, and tracking (track-before-detect). It is not combinatorial and thus does not require any thresholding or peak picking and can track in low signal-to-noise conditions. It incorporates superresolution spectrum estimation techniques exceeding the performance of conventional maximum likelihood and maximum entropy methods. The unique spectrum estimation method is based on the Einsteinian interpretation of the ROTHR received energy spectrum as a probability density of signal frequency. The MLANS neural architecture and learning mechanism are founded on spectrum models and maximization of the "Einsteinian" likelihood, allowing knowledge of the physical behavior of both targets and clutter to be injected into the tracker algorithms. The paper describes the addressed requirements and expected improvements, theoretical foundations, engineering methodology, and results of the development effort to date.

  4. A Method to Accurately Estimate the Muscular Torques of Human Wearing Exoskeletons by Torque Sensors

    PubMed Central

    Hwang, Beomsoo; Jeon, Doyoung

    2015-01-01

    In exoskeletal robots, quantification of the user's muscular effort is important for recognizing the user's motion intentions and evaluating motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, whose measurements contain the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated, and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower-limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated with 10 healthy participants during body-weight-supported gait training. The experimental results show that the torque sensors can estimate the muscular torque accurately under both relaxed and activated muscle conditions. PMID:25860074
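
    The core of the method is the subtraction the abstract describes: tau_muscle = tau_sensor - (M(q)·q̈ + C(q,q̇)·q̇ + G(q)), where M, C, and G come from the identified user-specific limb model. A schematic one-joint version (all parameter values invented):

        import numpy as np

        def muscular_torque(tau_sensor, q, dq, ddq, M, C, G):
            # Subtract the limb's inertial, Coriolis/viscous, and gravitational
            # torques from the joint-sensor reading to isolate muscular torque.
            return tau_sensor - (M(q) @ ddq + C(q, dq) @ dq + G(q))

        # One-joint toy limb (invented parameters):
        I, m, l, g = 0.2, 3.0, 0.25, 9.81
        M = lambda q: np.array([[I]])
        C = lambda q, dq: np.array([[0.01]])            # small viscous term
        G = lambda q: np.array([m * g * l * np.sin(q[0])])
        print(muscular_torque(np.array([12.0]), np.array([0.5]),
                              np.array([1.0]), np.array([2.0]), M, C, G))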

  5. Advances in Scientific Balloon Thermal Modeling

    NASA Technical Reports Server (NTRS)

    Bohaboj, T.; Cathey, H. M., Jr.

    2004-01-01

    The National Aeronautics and Space Administration's Balloon Program Office has long acknowledged that accurate modeling of balloon performance and flight prediction depends on how well the balloon is thermally modeled. This ongoing effort is focused on developing accurate balloon thermal models that can be used to quickly predict balloon temperatures and balloon performance. The ability to model parametric changes is also a driver for this effort. This paper presents the most recent advances made in this area. The research effort continues to use the "Thermal Desktop" add-on to AutoCAD for the modeling, and recent advances have been made using this analytical tool. A number of analyses have been completed to test the applicability of this tool to the problem, with very positive results. Progressively detailed models have been developed to explore the capabilities of the tool as well as to provide guidance in model formulation. A number of parametric studies have been completed. These studies have varied the shape of the structure, material properties, environmental inputs, and model geometry. They have concentrated on spherical "proxy models" for the initial development stages, with a transition to the natural-shaped zero-pressure and super-pressure balloons. An assessment of required model resolution has also been made. Model solutions have been cross-checked with known solutions via hand calculations, and the comparison of these cases will also be presented. One goal is to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. This paper presents the step-by-step advances made as part of this effort, the capabilities and limitations of the approach, and the lessons learned. Also presented are the plans for further thermal modeling work.

  6. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented as a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler or an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method: using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses, or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires simplifying assumptions. These assumptions do not necessarily bring the analytical results into question; in fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital-cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed component-by-component analysis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which would benefit from research and development to decrease the absolute cost.
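
    The negligible-error claim is easy to check for the common case of summed cost components. A sketch comparing the two approaches on invented numbers:

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical cost components: (mean, 1-sigma), $M
        components = [(3000.0, 450.0), (800.0, 120.0), (250.0, 60.0)]

        # Propagation of error for an independent sum:
        mean = sum(m for m, s in components)
        sigma = np.sqrt(sum(s**2 for m, s in components))

        # Monte Carlo cross-check:
        draws = sum(rng.normal(m, s, 100_000) for m, s in components)
        print(f"propagation: {mean:.0f} +/- {sigma:.0f}")
        print(f"Monte Carlo: {draws.mean():.0f} +/- {draws.std():.0f}")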

  7. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the economic payback for comparative airport ground support equipment (GSE) that are propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user’s manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
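
    The economics reduce to a payback comparison. A deliberately simple sketch with invented inputs (the report's actual cost data, duty cycles, and any discounting are richer than this):

        # Invented inputs for one electric vs. diesel baggage tractor:
        price_premium     = 25_000.0   # electric minus ICE purchase price, $
        diesel_per_year   = 6_500.0    # fuel cost, $/yr
        charging_per_year = 1_800.0    # electricity cost, $/yr
        maint_saving      = 1_200.0    # electric maintenance saving, $/yr

        annual_saving = (diesel_per_year - charging_per_year) + maint_saving
        print(f"simple payback: {price_premium / annual_saving:.1f} years")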

  8. System analysis study of space platform and station accommodations for life sciences research facilities. Volume 1: Executive summary. Phase A: Conceptual design and programmatics

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The study was conducted in three parts over a three-year period. The study schedule and the documentation associated with each study part are given. This document summarizes selected study results from the conceptual design and programmatics segment of the effort. The objectives were: (1) to update requirements and tradeoffs and develop a detailed design and mission requirements document; (2) to develop conceptual designs and mission descriptions; and (3) to develop programmatics, i.e., a work breakdown structure and work breakdown structure dictionary, estimated costs, and implementing plans and schedules.

  9. Continuous Space Estimation: Increasing WiFi-Based Indoor Localization Resolution without Increasing the Site-Survey Effort.

    PubMed

    Hernández, Noelia; Ocaña, Manuel; Alonso, Jose M; Kim, Euntai

    2017-01-13

    Although much research has taken place in WiFi indoor localization systems, their accuracy can still be improved. When designing this kind of system, fingerprint-based methods are a common choice. The problem with fingerprint-based methods comes with the need of site surveying the environment, which is effort consuming. In this work, we propose an approach, based on support vector regression, to estimate the received signal strength at non-site-surveyed positions of the environment. Experiments, performed in a real environment, show that the proposed method could be used to improve the resolution of fingerprint-based indoor WiFi localization systems without increasing the site survey effort.
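
    A compact sketch of the idea (synthetic fingerprints generated from a log-distance path-loss model; the paper's data and kernel tuning differ):

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(3)
        # Site-surveyed fingerprints: (x, y) positions and RSS from one AP (dBm).
        xy = rng.uniform(0, 20, size=(60, 2))
        dist = np.linalg.norm(xy - np.array([5.0, 10.0]), axis=1)
        rss = -40 - 20 * np.log10(dist + 1) + rng.normal(0, 2, size=60)

        model = SVR(kernel="rbf", C=100, epsilon=0.5).fit(xy, rss)
        # Estimated RSS at never-surveyed positions densifies the radio map:
        print(model.predict(np.array([[12.0, 4.0], [3.0, 15.0]])))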

  10. Continuous Space Estimation: Increasing WiFi-Based Indoor Localization Resolution without Increasing the Site-Survey Effort †

    PubMed Central

    Hernández, Noelia; Ocaña, Manuel; Alonso, Jose M.; Kim, Euntai

    2017-01-01

    Although much research has taken place in WiFi indoor localization systems, their accuracy can still be improved. When designing this kind of system, fingerprint-based methods are a common choice. The problem with fingerprint-based methods comes with the need of site surveying the environment, which is effort consuming. In this work, we propose an approach, based on support vector regression, to estimate the received signal strength at non-site-surveyed positions of the environment. Experiments, performed in a real environment, show that the proposed method could be used to improve the resolution of fingerprint-based indoor WiFi localization systems without increasing the site survey effort. PMID:28098773

  11. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute's World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.

  12. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.
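
    As one concrete instance of a measurement-error alternative to ordinary least squares, a Deming regression sketch (this is a standard errors-in-variables method, not necessarily the NCC 9-9 estimator; the data and true slope of 2.0 are invented, and delta is the assumed ratio of error variances):

        import numpy as np

        def deming_fit(x, y, delta=1.0):
            # Slope of the errors-in-variables (Deming) line; reduces to
            # orthogonal regression when delta == 1.
            sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
            sxy = np.cov(x, y, ddof=1)[0, 1]
            b = ((syy - delta * sxx) +
                 np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
            return b, y.mean() - b * x.mean()

        rng = np.random.default_rng(5)
        truth = rng.uniform(0, 10, 50)
        x = truth + rng.normal(0, 0.8, 50)        # noisy remotely-sensed measure
        y = 2.0 * truth + rng.normal(0, 0.8, 50)  # noisy ground reference
        print(deming_fit(x, y))   # slope near 2 despite the noisy regressor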

  13. Respiratory effort energy estimation using Doppler radar.

    PubMed

    Shahhaidar, Ehsaneh; Yavari, Ehsan; Young, Jared; Boric-Lubecke, Olga; Stickley, Cris

    2012-01-01

    Human respiratory effort can be harvested to power wearable biosensors and mobile electronic devices. The very first step toward designing a harvester is to estimate the available energy and power. This paper describes an estimation of the power and energy available from the movements of the torso during breathing, using Doppler radar to detect the breathing rate, torso displacement, and torso velocity and acceleration along the sagittal direction. The accuracy of the detected variables is verified by two reference methods. The experimental results, obtained from a healthy female human subject, show that the available power from circumferential movement can be higher than the power from the sagittal movement.
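
    An order-of-magnitude version of this kind of estimate can be sketched with a sinusoidal torso-motion model (the amplitude and effective mass below are invented, not the subject's measured values, and the result is only a crude upper bound):

        import numpy as np

        f, A, m_eff = 0.25, 0.005, 2.0   # breaths/s, displacement (m), moving mass (kg)

        omega = 2 * np.pi * f
        v_peak = A * omega                    # peak torso velocity
        E_cycle = 0.5 * m_eff * v_peak ** 2   # peak kinetic energy per breath
        P_avg = E_cycle * f                   # crude bound on harvestable power
        print(f"v_peak = {v_peak * 1000:.1f} mm/s, P <= {P_avg * 1e6:.1f} microwatts")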

  14. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tool by stakeholders. PMID:28804490

  15. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production.

    PubMed

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D; Pittelkow, Cameron M

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field and regional scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such a tool by stakeholders.

  16. 77 FR 42189 - Marine Recreational Fisheries of the United States; National Saltwater Angler Registry and State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-18

    ... improvements to the design and management of survey and estimation methods used to produce marine recreational... to these effort surveys provides for continued pilot testing of effort sampling designs that use both... NMFS determine what specific sampling design to use in MRIP effort surveys on the Atlantic and Gulf...

  17. Estimating the size of key populations at higher risk of HIV infection: a summary of experiences and lessons presented during a technical meeting on size estimation among key populations in Asian countries

    PubMed Central

    Calleja, Jesus Maria Garcia; Zhao, Jinkou; Reddy, Amala; Seguy, Nicole

    2014-01-01

    Problem: Size estimates of key populations at higher risk of HIV exposure are recognized as critical for understanding the trajectory of the HIV epidemic and planning and monitoring an effective response, especially for countries with concentrated and low epidemics such as those in Asia. Context: To help countries estimate population sizes of key populations, global guidelines were updated in 2011 to reflect new technical developments and recent field experiences in applying these methods. Action: In September 2013, a meeting of programme managers and experts experienced with population size estimates (PSE) for key populations was held for 13 Asian countries. This article summarizes the key results presented, shares practical lessons learnt and reviews the methodological approaches from implementing PSE in 13 countries. Lessons learnt: It is important to build capacity to collect, analyse and use PSE data; establish a technical review group; and implement a transparent, well documented process. Countries should adapt global PSE guidelines and maintain operational definitions that are more relevant and useable for country programmes. Development of methods for non-venue-based key populations requires more investment and collaborative efforts between countries and among partners. PMID:25320676

  18. Diagnostic accuracy of a bayesian latent group analysis for the detection of malingering-related poor effort.

    PubMed

    Ortega, Alonso; Labrenz, Stephan; Markowitsch, Hans J; Piefke, Martina

    2013-01-01

    In the last decade, different statistical techniques have been introduced to improve assessment of malingering-related poor effort. In this context, we have recently shown preliminary evidence that a Bayesian latent group model may help to optimize classification accuracy using a simulation research design. In the present study, we conducted two analyses. Firstly, we evaluated how accurately this Bayesian approach can distinguish between participants answering in an honest way (honest response group) and participants feigning cognitive impairment (experimental malingering group). Secondly, we tested the accuracy of our model in the differentiation between patients who had real cognitive deficits (cognitively impaired group) and participants who belonged to the experimental malingering group. All Bayesian analyses were conducted using the raw scores of a visual recognition forced-choice task (2AFC), the Test of Memory Malingering (TOMM, Trial 2), and the Word Memory Test (WMT, primary effort subtests). The first analysis showed 100% accuracy for the Bayesian model in distinguishing participants of both groups with all effort measures. The second analysis showed outstanding overall accuracy of the Bayesian model when estimates were obtained from the 2AFC and the TOMM raw scores. Although diagnostic accuracy of the Bayesian model diminished when using the WMT total raw scores, overall diagnostic accuracy can still be considered excellent. The most plausible explanation for this decrement is the low performance in verbal recognition and fluency tasks of some patients of the cognitively impaired group. Additionally, the Bayesian model provides individual estimates, p(z_i | D), of examinees' effort levels. In conclusion, both high classification accuracy levels and Bayesian individual estimates of effort may be very useful for clinicians when assessing for effort in medico-legal settings.
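
    The gist of such a latent group classification can be conveyed with a two-group binomial sketch (Python). This is not the authors' exact Bayesian latent group model: the group-specific accuracies, the 50/50 prior, and the test length are illustrative assumptions.

        from scipy.stats import binom

        # Posterior probability of belonging to the malingering group given a
        # 2AFC score, under a simple two-group binomial mixture.
        n_items = 50
        score = 18                    # correct responses; below chance is suspicious

        acc_honest = 0.95             # assumed accuracy when responding honestly
        acc_malinger = 0.35           # assumed below-chance accuracy when feigning
        prior_malinger = 0.5          # assumed prior group membership

        like_honest = binom.pmf(score, n_items, acc_honest)
        like_malinger = binom.pmf(score, n_items, acc_malinger)

        post_malinger = (like_malinger * prior_malinger) / (
            like_malinger * prior_malinger + like_honest * (1 - prior_malinger)
        )
        print(f"P(malingering | score) = {post_malinger:.4f}")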

  19. Theoretical basis to measure the impact of short-lasting control of an infectious disease on the epidemic peak

    PubMed Central

    2011-01-01

    Background: While many pandemic preparedness plans have promoted disease control efforts to lower and delay an epidemic peak, analytical methods for determining the required control effort and making statistical inferences have yet to be developed. As a first step to address this issue, we present a theoretical basis on which to assess the impact of an early intervention on the epidemic peak, employing a simple epidemic model. Methods: We focus on estimating the impact of an early control effort (e.g. unsuccessful containment), assuming that the transmission rate abruptly increases when control is discontinued. We provide analytical expressions for the magnitude and time of the epidemic peak, employing approximate logistic and logarithmic-form solutions for the latter. Empirical influenza data (H1N1-2009) in Japan are analyzed to estimate the effect of the summer holiday period in lowering and delaying the peak in 2009. Results: Our model estimates that the epidemic peak of the 2009 pandemic was delayed by 21 days due to the summer holiday. The decline in the peak appears to be a nonlinear function of the control-associated reduction in the reproduction number. Peak delay is shown to depend critically on the fraction of initially immune individuals. Conclusions: The proposed modeling approaches offer methodological avenues to assess empirical data and to objectively estimate the required control effort to lower and delay an epidemic peak. Analytical findings support a critical need to conduct population-wide serological surveys as a prior requirement for estimating the time of peak. PMID:21269441
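
    The mechanism can be illustrated with a minimal SIR simulation (Python; the parameters are illustrative, not the paper's fitted values): transmission is reduced during an early control window, restored abruptly afterwards, and the resulting peak is compared with the uncontrolled epidemic.

        # Compare epidemic peak timing/height with and without an early,
        # short-lasting reduction in the transmission rate (simple SIR model).
        def sir_peak(beta_fn, gamma=1/3, i0=1e-5, days=365, dt=0.1):
            s, i = 1.0 - i0, i0
            peak_i, peak_t = i, 0.0
            for step in range(int(days / dt)):
                t = step * dt
                beta = beta_fn(t)
                ds = -beta * s * i
                di = beta * s * i - gamma * i
                s, i = s + ds * dt, i + di * dt
                if i > peak_i:
                    peak_i, peak_t = i, t
            return peak_t, peak_i

        beta0 = 0.5                                    # baseline (R0 = 1.5)
        no_control = lambda t: beta0
        with_control = lambda t: 0.6 * beta0 if t < 60 else beta0  # 40% cut, 60 days

        t1, i1 = sir_peak(no_control)
        t2, i2 = sir_peak(with_control)
        print(f"peak delayed by {t2 - t1:.0f} days; height ratio {i2 / i1:.2f}")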

  20. Understanding and Predicting the Process of Software Maintenance Releases

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  1. Modelling HIV/AIDS epidemics in sub-Saharan Africa using seroprevalence data from antenatal clinics.

    PubMed Central

    Salomon, J. A.; Murray, C. J.

    2001-01-01

    OBJECTIVE: To improve the methodological basis for modelling the HIV/AIDS epidemics in adults in sub-Saharan Africa, with examples from Botswana, Central African Republic, Ethiopia, and Zimbabwe. Understanding the magnitude and trajectory of the HIV/AIDS epidemic is essential for planning and evaluating control strategies. METHODS: Previous mathematical models were developed to estimate epidemic trends based on sentinel surveillance data from pregnant women. In this project, we have extended these models in order to take full advantage of the available data. We developed a maximum likelihood approach for the estimation of model parameters and used numerical simulation methods to compute uncertainty intervals around the estimates. FINDINGS: In the four countries analysed, there were an estimated half a million new adult HIV infections in 1999 (range: 260 to 960 thousand), 4.7 million prevalent infections (range: 3.0 to 6.6 million), and 370 thousand adult deaths from AIDS (range: 266 to 492 thousand). CONCLUSION: While this project addresses some of the limitations of previous modelling efforts, an important research agenda remains, including the need to clarify the relationship between sentinel data from pregnant women and the epidemiology of HIV and AIDS in the general population. PMID:11477962
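
    The general approach, maximum likelihood estimation plus simulation-based uncertainty intervals, can be sketched as follows (Python). The logistic prevalence curve, the synthetic sentinel data, and all parameter values are illustrative assumptions, not the authors' model.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom

        rng = np.random.default_rng(1)

        # Synthetic antenatal sentinel data: women tested per round and the
        # number HIV-positive, drawn from an assumed logistic prevalence curve.
        years = np.arange(1988, 2000)
        tested = np.full(years.size, 400)
        true_p = 0.25 / (1 + np.exp(-0.6 * (years - 1994)))
        positive = rng.binomial(tested, true_p)

        def nll(theta, y=positive):
            k, r, t0 = theta
            p = np.clip(k / (1 + np.exp(-r * (years - t0))), 1e-9, 1 - 1e-9)
            return -binom.logpmf(y, tested, p).sum()

        fit = minimize(nll, x0=[0.2, 0.5, 1993.0], method="Nelder-Mead")
        k_hat, r_hat, t0_hat = fit.x

        # Numerical simulation for uncertainty: resimulate data under the
        # fitted model, refit, and read off percentile intervals.
        p_hat = np.clip(k_hat / (1 + np.exp(-r_hat * (years - t0_hat))), 1e-9, 1 - 1e-9)
        boot = [minimize(nll, fit.x, args=(rng.binomial(tested, p_hat),),
                         method="Nelder-Mead").x[0] for _ in range(200)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"asymptotic prevalence: {k_hat:.3f} (95% interval {lo:.3f}-{hi:.3f})")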

  2. The Canadian Urban Environmental Health Research Consortium - a protocol for building a national environmental exposure data platform for integrated analyses of urban form and health.

    PubMed

    Brook, Jeffrey R; Setton, Eleanor M; Seed, Evan; Shooshtari, Mahdi; Doiron, Dany

    2018-01-08

    Multiple external environmental exposures related to residential location and urban form, including air pollutants, noise, greenness, and walkability, have been linked to health impacts or benefits. The Canadian Urban Environmental Health Research Consortium (CANUE) was established to facilitate the linkage of extensive geospatial exposure data to existing Canadian cohorts and administrative health data holdings. We hypothesize that this linkage will enable investigators to test a variety of their own hypotheses related to the interdependent associations of built environment features with diverse health outcomes encompassed by the cohorts and administrative data. We developed a protocol for compiling measures of built environment features that quantify exposure; vary spatially on the urban and suburban scale; and can be modified through changes in policy or individual behaviour to benefit health. These measures fall into six domains: air quality, noise, greenness, weather/climate, transportation, and neighbourhood factors; they will be indexed to six-digit postal codes to facilitate merging with health databases. Initial efforts focus on existing data and include estimates of air pollutants, greenness, temperature extremes, and neighbourhood walkability and socioeconomic characteristics. Key gaps will be addressed for noise exposure, with a new national model being developed, and for transportation-related exposures, with detailed estimates of truck volumes and diesel emissions now underway in selected cities. Improvements to existing exposure estimates are planned, primarily by increasing temporal and/or spatial resolution given new satellite-based sensors and more detailed national air quality modelling. Novel metrics are also planned for walkability and food environments, green space access and function, and lifelong climate-related exposures based on local climate zones. Critical challenges exist; for example, the quantity and quality of input data to many of the models and metrics have changed over time, making it difficult to develop and validate historical exposures. CANUE represents a unique effort to coordinate and leverage substantial research investments and will enable a more focused effort on filling gaps in exposure information, improving the range of exposures quantified, their precision, and their mechanistic relevance to health. Epidemiological studies may be better able to explore the common theme of urban form and health in an integrated manner, ultimately contributing new knowledge informing policies that enhance healthy urban living.

  3. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort

    PubMed Central

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J.; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators’ work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices’ translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected. PMID:28824482
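
    For reference, HTER, the product effort indicator used above, is essentially a word-level edit distance between the machine translation and its post-edited version, normalised by the post-edited length. A simplified sketch (Python; full HTER also counts block shifts, which are omitted here):

        # Word-level Levenshtein distance between MT output and its post-edited
        # version, divided by the number of post-edited words.
        def hter(mt_output: str, post_edited: str) -> float:
            hyp, ref = mt_output.split(), post_edited.split()
            d = [[0] * (len(ref) + 1) for _ in range(len(hyp) + 1)]
            for i in range(len(hyp) + 1):
                d[i][0] = i
            for j in range(len(ref) + 1):
                d[0][j] = j
            for i in range(1, len(hyp) + 1):
                for j in range(1, len(ref) + 1):
                    cost = 0 if hyp[i - 1] == ref[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,         # deletion
                                  d[i][j - 1] + 1,         # insertion
                                  d[i - 1][j - 1] + cost)  # substitution
            return d[-1][-1] / max(len(ref), 1)

        print(hter("the cat sat of the mat", "the cat sat on the mat"))  # 1 edit / 6 words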

  4. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort.

    PubMed

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators' work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices' translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected.

  5. Modeling avian abundance from replicated counts using binomial mixture models

    USGS Publications Warehouse

    Kery, Marc; Royle, J. Andrew; Schmid, Hans

    2005-01-01

    Abundance estimation in ecology is usually accomplished by capture–recapture, removal, or distance sampling methods. These may be hard to implement at large spatial scales. In contrast, binomial mixture models enable abundance estimation without individual identification, based simply on temporally and spatially replicated counts. Here, we evaluate mixture models using data from the national breeding bird monitoring program in Switzerland, where some 250 1-km2 quadrats are surveyed using the territory mapping method three times during each breeding season. We chose eight species with contrasting distribution (wide–narrow), abundance (high–low), and detectability (easy–difficult). Abundance was modeled as a random effect with a Poisson or negative binomial distribution, with mean affected by forest cover, elevation, and route length. Detectability was a logit-linear function of survey date, survey date-by-elevation, and sampling effort (time per transect unit). Resulting covariate effects and parameter estimates were consistent with expectations. Detectability per territory (for three surveys) ranged from 0.66 to 0.94 (mean 0.84) for easy species, and from 0.16 to 0.83 (mean 0.53) for difficult species, depended on survey effort for two easy and all four difficult species, and changed seasonally for three easy and three difficult species. Abundance was positively related to route length in three high-abundance and one low-abundance (one easy and three difficult) species, and increased with forest cover in five forest species, decreased for two nonforest species, and was unaffected for a generalist species. Abundance estimates under the most parsimonious mixture models were between 1.1 and 8.9 (median 1.8) times greater than estimates based on territory mapping; hence, three surveys were insufficient to detect all territories for each species. We conclude that binomial mixture models are an important new approach for estimating abundance corrected for detectability when only repeated-count data are available. Future developments envisioned include estimation of trend, occupancy, and total regional abundance.
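
    The core of a binomial (N-)mixture model is a likelihood that marginalises over the unobserved site abundances. A minimal constant-parameter sketch (Python; simulated data, no covariates, all values illustrative):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom, poisson

        rng = np.random.default_rng(7)

        # Simulate T replicated counts at each site: N_i ~ Poisson(lambda),
        # y_it ~ Binomial(N_i, p), without individual identification.
        n_sites, n_visits = 250, 3
        N = rng.poisson(5.0, n_sites)                            # true lambda = 5
        y = rng.binomial(N[:, None], 0.5, (n_sites, n_visits))   # true p = 0.5

        def nll(theta, n_max=60):
            lam = np.exp(theta[0])                   # keep lambda positive
            p = 1 / (1 + np.exp(-theta[1]))          # keep p in (0, 1)
            ns = np.arange(n_max + 1)
            log_prior = poisson.logpmf(ns, lam)
            total = 0.0
            for yi in y:                             # marginalise over latent N
                log_lik = binom.logpmf(yi[:, None], ns[None, :], p).sum(axis=0)
                total += np.logaddexp.reduce(log_prior + log_lik)
            return -total

        fit = minimize(nll, x0=[np.log(3.0), 0.0], method="Nelder-Mead")
        print("lambda:", np.exp(fit.x[0]), " p:", 1 / (1 + np.exp(-fit.x[1])))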

  6. Precipitation and Diabatic Heating Distributions from TRMM/GPM

    NASA Astrophysics Data System (ADS)

    Olson, W. S.; Grecu, M.; Wu, D.; Tao, W. K.; L'Ecuyer, T.; Jiang, X.

    2016-12-01

    The initial focus of our research effort was the development of a physically-based methodology for estimating 3D precipitation distributions from a combination of spaceborne radar and passive microwave radiometer observations. This estimation methodology was originally developed for applications to Global Precipitation Measurement (GPM) mission sensor data, but it has recently been adapted to Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar and Microwave Imager observations. Precipitation distributions derived from the TRMM sensors are interpreted using cloud-system resolving model simulations to infer atmospheric latent+eddy heating (Q1-QR) distributions in the tropics and subtropics. Further, the estimates of Q1-QR are combined with estimates of radiative heating (QR), derived from TRMM Microwave Imager and Visible and Infrared Scanner data as well as environmental properties from NCEP reanalyses, to yield estimates of the large-scale total diabatic heating (Q1). A thirteen-year database of precipitation and diabatic heating is constructed using TRMM observations from 1998-2010 as part of NASA's Energy and Water cycle Study program. State-dependent errors in precipitation and heating products are evaluated by propagating the potential errors of a priori modeling assumptions through the estimation method framework. Knowledge of these errors is critical for determining the "closure" of global water and energy budgets. Applications of the precipitation/heating products to climate studies will be presented at the conference.

  7. A spatial method to calculate small-scale fisheries effort in data poor scenarios.

    PubMed

    Johnson, Andrew Frederick; Moreno-Báez, Marcia; Giron-Nava, Alfredo; Corominas, Julia; Erisman, Brad; Ezcurra, Exequiel; Aburto-Oropeza, Octavio

    2017-01-01

    To gauge the collateral impacts of fishing we must know where fishing boats operate and how much they fish. Although small-scale fisheries land approximately the same amount of fish for human consumption as industrial fleets globally, methods of estimating their fishing effort are comparatively poor. We present an accessible, spatial method of calculating the effort of small-scale fisheries based on two simple measures that are available, or at least easily estimated, in even the most data-poor fisheries: the number of boats and the local coastal human population. We illustrate the method using a small-scale fisheries case study from the Gulf of California, Mexico, and show that our measure of Predicted Fishing Effort (PFE), measured as the number of boats operating in a given area per day adjusted by the number of people in local coastal populations, can accurately predict fisheries landings in the Gulf. Comparing our values of PFE to commercial fishery landings throughout the Gulf also indicates that the current number of small-scale fishing boats in the Gulf is approximately double what is required to land theoretical maximum fish biomass. Our method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This new method provides an important first step towards estimating the fishing effort of small-scale fleets globally.
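
    The abstract does not spell out the exact adjustment formula, so the sketch below (Python) is purely hypothetical: boat-days per zone weighted by each zone's share of the total coastal population.

        # Hypothetical PFE-style calculation; the zone names and all numbers
        # are invented for illustration.
        zones = {
            # zone: (boats, active days per year, coastal population)
            "zone_a": (120, 180, 15_000),
            "zone_b": (45, 200, 4_000),
            "zone_c": (300, 150, 60_000),
        }
        total_pop = sum(pop for _, _, pop in zones.values())

        for name, (boats, days, pop) in zones.items():
            pfe = boats * days * (pop / total_pop)   # hypothetical weighting
            print(f"{name}: PFE = {pfe:,.0f} population-weighted boat-days")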

  8. Methodologies for estimating advisory curve speeds on Oregon highways.

    DOT National Transportation Integrated Search

    2008-01-01

    This report reviews an Oregon research effort to evaluate the identification and marking of advisory speeds on Oregon : highways. In particular, this research effort focused on the implications of modified advisory speed thresholds and : identificati...

  9. Estimating Total Deposition Using NADP & CASTNET Data

    EPA Science Inventory

    For more than 40 years, efforts have been made to estimate total sulfur and nitrogen deposition in the United States using a combination of measured concentrations in precipitation and in the air, precipitation amounts for wet deposition, and various modeled or estimated depositi...

  10. Influence of Sampling Effort on the Estimated Richness of Road-Killed Vertebrate Wildlife

    NASA Astrophysics Data System (ADS)

    Bager, Alex; da Rosa, Clarissa A.

    2011-05-01

    Road-killed mammals, birds, and reptiles were collected weekly from highways in southern Brazil in 2002 and 2005. The objective was to assess variation in estimates of road-kill impacts on species richness produced by different sampling efforts, and to provide information to aid in the experimental design of future sampling. Richness observed in weekly samples was compared with sampling for different periods. In each period, the list of road-killed species was evaluated based on estimates of the community structure derived from weekly samplings, and by the presence of the ten species most subject to road mortality, and also of threatened species. Weekly samples were sufficient only for reptiles and mammals, considered separately. Richness estimated from the biweekly samples was equal to that found in the weekly samples, and gave satisfactory results for sampling the most abundant and threatened species. The ten most affected species showed constant road-mortality rates, independent of sampling interval, and also maintained their dominance structure. Birds required greater sampling effort. When the composition of road-killed species varies seasonally, it is necessary to take biweekly samples for a minimum of one year. Weekly or more-frequent sampling for periods longer than two years is necessary to provide a reliable estimate of total species richness.
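
    How estimated richness depends on sampling effort can be illustrated with a standard richness estimator such as Chao1 (a common tool, not necessarily the authors' method), applied to samples drawn from a hypothetical uneven community (Python):

        import numpy as np

        rng = np.random.default_rng(3)

        # Simulate weekly road-kill records from a community with a few common
        # and many rare species, and track observed vs Chao1-estimated richness.
        n_species = 40
        abundance = rng.pareto(1.0, n_species) + 1
        prob = abundance / abundance.sum()

        counts = np.zeros(n_species, dtype=int)
        for weeks in (4, 12, 26, 52, 104):
            while counts.sum() < weeks * 5:        # ~5 carcasses recorded per week
                counts[rng.choice(n_species, p=prob)] += 1
            s_obs = int((counts > 0).sum())
            f1 = int((counts == 1).sum())          # singletons
            f2 = int((counts == 2).sum())          # doubletons
            chao1 = s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))
            print(f"{weeks:3d} weeks: observed {s_obs:2d}, Chao1 {chao1:5.1f}")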

  11. Influence of sampling effort on the estimated richness of road-killed vertebrate wildlife.

    PubMed

    Bager, Alex; da Rosa, Clarissa A

    2011-05-01

    Road-killed mammals, birds, and reptiles were collected weekly from highways in southern Brazil in 2002 and 2005. The objective was to assess variation in estimates of road-kill impacts on species richness produced by different sampling efforts, and to provide information to aid in the experimental design of future sampling. Richness observed in weekly samples was compared with sampling for different periods. In each period, the list of road-killed species was evaluated based on estimates the community structure derived from weekly samplings, and by the presence of the ten species most subject to road mortality, and also of threatened species. Weekly samples were sufficient only for reptiles and mammals, considered separately. Richness estimated from the biweekly samples was equal to that found in the weekly samples, and gave satisfactory results for sampling the most abundant and threatened species. The ten most affected species showed constant road-mortality rates, independent of sampling interval, and also maintained their dominance structure. Birds required greater sampling effort. When the composition of road-killed species varies seasonally, it is necessary to take biweekly samples for a minimum of one year. Weekly or more-frequent sampling for periods longer than two years is necessary to provide a reliable estimate of total species richness.

  12. Estimated flood-inundation maps for Cowskin Creek in western Wichita, Kansas

    USGS Publications Warehouse

    Studley, Seth E.

    2003-01-01

    The October 31, 1998, flood on Cowskin Creek in western Wichita, Kansas, caused millions of dollars in damages. Emergency management personnel and flood mitigation teams had difficulty in efficiently identifying areas affected by the flooding, and no warning was given to residents because flood-inundation information was not available. To provide detailed information about future flooding on Cowskin Creek, high-resolution estimated flood-inundation maps were developed using geographic information system technology and advanced hydraulic analysis. Two-foot-interval land-surface elevation data from a 1996 flood insurance study were used to create a three-dimensional topographic representation of the study area for hydraulic analysis. The data computed from the hydraulic analyses were converted into geographic information system format with software from the U.S. Army Corps of Engineers' Hydrologic Engineering Center. The results were overlaid on the three-dimensional topographic representation of the study area to produce maps of estimated flood-inundation areas and estimated depths of water in the inundated areas for 1-foot increments on the basis of stream stage at an index streamflow-gaging station. A Web site (http://ks.water.usgs.gov/Kansas/cowskin.floodwatch) was developed to provide the public with information pertaining to flooding in the study area. The Web site shows graphs of the real-time streamflow data for U.S. Geological Survey gaging stations in the area and monitors the National Weather Service Arkansas-Red Basin River Forecast Center for Cowskin Creek flood-forecast information. When a flood is forecast for the Cowskin Creek Basin, an estimated flood-inundation map is displayed for the stream stage closest to the National Weather Service's forecasted peak stage. Users of the Web site are able to view the estimated flood-inundation maps for selected stages at any time and to access information about this report and about flooding in general. Flood recovery teams also have the ability to view the estimated flood-inundation map pertaining to the most recent flood. The availability of these maps and the ability to monitor the real-time stream stage through the U.S. Geological Survey Web site provide emergency management personnel and residents with information that is critical for evacuation and rescue efforts in the event of a flood as well as for post-flood recovery efforts.

  13. Estimating sample size for landscape-scale mark-recapture studies of North American migratory tree bats

    USGS Publications Warehouse

    Ellison, Laura E.; Lukacs, Paul M.

    2014-01-01

    Concern for migratory tree-roosting bats in North America has grown because of possible population declines from wind energy development. This concern has driven interest in estimating population-level changes. Mark-recapture methodology is one possible analytical framework for assessing bat population changes, but sample size requirements to produce reliable estimates have not been estimated. To illustrate the sample sizes necessary for a mark-recapture-based monitoring program, we conducted power analyses using a statistical model that allows reencounters of live and dead marked individuals. We ran 1,000 simulations for each of five broad sample size categories in a Burnham joint model, and then compared the proportion of simulations in which 95% confidence intervals overlapped between and among years for a 4-year study. Additionally, we conducted sensitivity analyses of sample size to various capture probabilities and recovery probabilities. More than 50,000 individuals per year would need to be captured and released to accurately determine 10% and 15% declines in annual survival. To detect more dramatic declines of 33% or 50% in survival over four years, sample sizes of 25,000 or 10,000 per year, respectively, would be sufficient. Sensitivity analyses reveal that increasing recovery of dead marked individuals may be more valuable than increasing capture probability of marked individuals. Because of the extraordinary effort that would be required, we advise caution should such a mark-recapture effort be initiated because of the difficulty in attaining reliable estimates. We make recommendations for what techniques show the most promise for mark-recapture studies of bats because some techniques violate the assumptions of mark-recapture methodology when used to mark bats.
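
    A greatly simplified version of the power-analysis idea (Python): the study used a Burnham joint live/dead-encounter model, whereas this sketch uses plain binomial survival estimates and the confidence-interval-overlap criterion, with illustrative survival values.

        import numpy as np

        rng = np.random.default_rng(11)

        def wald_ci(successes, n):
            p = successes / n
            half = 1.96 * np.sqrt(p * (1 - p) / n)
            return p - half, p + half

        def detection_rate(n_marked, s_before=0.80, s_after=0.72, n_sims=2000):
            """Fraction of simulations in which 95% CIs for annual survival
            fail to overlap, given a true 10% decline (0.80 -> 0.72)."""
            detected = 0
            for _ in range(n_sims):
                lo1, hi1 = wald_ci(rng.binomial(n_marked, s_before), n_marked)
                lo2, hi2 = wald_ci(rng.binomial(n_marked, s_after), n_marked)
                if hi2 < lo1 or hi1 < lo2:
                    detected += 1
            return detected / n_sims

        for n in (500, 2500, 10_000, 50_000):
            print(f"{n:6d} marked per year: detection rate {detection_rate(n):.2f}")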

  14. Accounting for health spending in developing countries.

    PubMed

    Raciborska, Dorota A; Hernández, Patricia; Glassman, Amanda

    2008-01-01

    Data on health system financing and spending, together with information on the disease prevalence and cost-effectiveness of interventions, constitute essential input into health policy. It is particularly critical in developing countries, where resources are scarce and the marginal dollar has a major impact. Yet regular monitoring of health spending tends to be absent from those countries, and the results of international efforts to stimulate estimation activities have been mixed. This paper offers a history of health spending measurement, describes alternative sources of data, and recommends improving international collaboration and advocacy with the private sector for the way forward.

  15. BBU design of linear induction accelerator cells for radiography application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C.C.; Chen, Y.J.; Gaporaso, G.J.

    1997-05-06

    There is an ongoing effort to develop accelerating modules for high-current electron accelerators for advanced radiography application. Accelerating modules with low beam-cavity coupling impedances along with gap designs with acceptable field stresses comprise a set of fundamental design criteria. We examine improved cell designs which have been developed for accelerator application in several radiographic operating regimes. We evaluate interaction impedances, analyze the effects of beam structure coupling on beam dynamics (beam break-up instability and corkscrew motion). We also provide estimates of coupling through interesting new high-gradient insulators and evaluate their potential future application in induction cells.

  16. Roadmap to Secure Control Systems in the Water Sector

    DTIC Science & Technology

    2008-03-01

    solutions for ICS security. The purposes of this roadmap are as follows: • Define a consensus-based framework that articulates strategies of owners and...each failure is manageable in itself • Be used as ransomware 400,000 persons, and was estimated by the Center for Disease Control (CDC) to cost a total...and focused efforts. The water sector has developed and will pursue a set of strategic goals articulating these ambitions. These goals will help

  17. Research in digital adaptive flight controllers

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    A design study of adaptive control logic suitable for implementation in modern airborne digital flight computers was conducted. Both explicit controllers, which directly utilize parameter identification, and implicit controllers, which do not require identification, were considered. Extensive analytical and simulation efforts resulted in the recommendation of two explicit digital adaptive flight controllers. These controllers interface weighted least squares estimation procedures with control logic developed using either optimal regulator theory or single-stage performance indices.
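
    The kind of weighted least squares identification such an explicit controller couples to its control logic can be sketched as a recursive estimator with exponential forgetting (Python; the first-order plant and all values are illustrative):

        import numpy as np

        rng = np.random.default_rng(5)

        # Recursive weighted least squares identification of the plant
        # y_k = a*y_{k-1} + b*u_{k-1} + noise, with forgetting factor lam.
        a_true, b_true, lam = 0.9, 0.5, 0.98
        theta = np.zeros(2)              # parameter estimates [a, b]
        P = np.eye(2) * 100.0            # estimate covariance

        y_prev, u_prev = 0.0, 0.0
        for _ in range(500):
            u = rng.normal()                                   # excitation input
            y = a_true * y_prev + b_true * u_prev + 0.05 * rng.normal()
            phi = np.array([y_prev, u_prev])                   # regressor
            gain = P @ phi / (lam + phi @ P @ phi)
            theta = theta + gain * (y - phi @ theta)           # innovation update
            P = (P - np.outer(gain, phi @ P)) / lam
            y_prev, u_prev = y, u

        print("estimated [a, b]:", theta)   # should approach [0.9, 0.5]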

  18. Calibration of the Software Architecture Sizing and Estimation Tool (SASET).

    DTIC Science & Technology

    1995-09-01

    model is of more value than the uncalibrated one. Also, as will be discussed in Chapters 3 and 4, there are quite a few manual (and undocumented) steps...complexity, normalized effective size, and normalized effort. One other field ("development phases included") was extracted manually since it was not listed...Bowden, R.G., Cheadle, W.G., & Ratliff, R.W. SASET 3.0 Technical Reference Manual . Publication S-3730-93-2. Denver: Martin Marietta Astronautics

  19. Software Development Projects: Estimation of Cost and Effort (A Manager’s Digest).

    DTIC Science & Technology

    1982-12-01

    it later when changes need to be made to it. Various levels of difficulty are experienced due to the skill level of the programmer, poor program...impurities that if eliminated reduce the level of complexity of the program. They are as follows: 1. Complementary Operations: unreduced expressions 2...greater quality than that to support standard business applications. Remus defines quality as "...the number of program defects normalized by size over

  20. Assessment of fish assemblages and minimum sampling effort required to determine biotic integrity of large rivers in southern Idaho, 2002

    USGS Publications Warehouse

    Maret, Terry R.; Ott, D.S.

    2004-01-01

    width was determined to be sufficient for collecting an adequate number of fish to estimate species richness and evaluate biotic integrity. At most sites, about 250 fish were needed to effectively represent 95 percent of the species present. Fifty-three percent of the sites assessed, using an IBI developed specifically for large Idaho rivers, received scores of less than 50, indicating poor biotic integrity.

  1. Insensitive Munitions Modeling Improvement Efforts

    DTIC Science & Technology

    2010-10-01

    LLNL) ALE3D. ...codes most commonly used by munition designers are CTH and the SIERRA suite of codes produced by Sandia National Labs (SNL) and ALE3D produced by... ALE3D, a LLNL-developed code, is also used by various DoD participants. It was, however, designed differently than either CTH or Sierra. ALE3D is a

  2. Predicting Critical Power in Elite Cyclists: Questioning the Validity of the 3-Minute All-Out Test.

    PubMed

    Bartram, Jason C; Thewlis, Dominic; Martin, David T; Norton, Kevin I

    2017-07-01

    New applications of the critical-power concept, such as the modeling of intermittent-work capabilities, are exciting prospects for elite cycling. However, accurate calculation of the required parameters is traditionally time invasive and somewhat impractical. An alternative single-test protocol (3-min all-out) has recently been proposed, but validation in an elite population is lacking. The traditional approach for parameter establishment, but with fewer tests, could also prove an acceptable compromise. Six senior Australian endurance track-cycling representatives completed 6 efforts to exhaustion on 2 separate days over a 3-wk period. These included 1-, 4-, 6-, 8-, and 10-min self-paced efforts, plus the 3-min all-out protocol. Traditional work-vs-time calculations of CP and anaerobic energy contribution (W') using the 5 self-paced efforts were compared with calculations from the 3-min all-out protocol. The impact of using just 2 or 3 self-paced efforts for traditional CP and W' estimation was also explored using thresholds of agreement (8 W, 2.0 kJ, respectively). CP estimated from the 3-min all-out approach was significantly higher than from the traditional approach (402 ± 33, 351 ± 27 W, P < .001), while W' was lower (15.5 ± 3.0, 24.3 ± 4.0 kJ, P = .02). Five different combinations of 2 or 3 self-paced efforts led to CP estimates within the threshold of agreement, with only 1 combination deemed accurate for W'. In elite cyclists the 3-min all-out approach is not suitable to estimate CP when compared with the traditional method. However, reducing the number of tests used in the traditional method lessens testing burden while maintaining appropriate parameter accuracy.
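
    For context, the traditional work-versus-time calculation mentioned above fits W(t) = W' + CP*t, so CP is the slope and W' the intercept of total work regressed on effort duration. A sketch with assumed power data, not values from the study (Python):

        import numpy as np

        # Total work from each exhaustive effort regressed on duration.
        durations_s = np.array([60, 240, 360, 480, 600])
        mean_power_w = np.array([610, 455, 430, 418, 410])   # illustrative efforts
        work_j = mean_power_w * durations_s

        cp, w_prime = np.polyfit(durations_s, work_j, 1)     # slope, intercept
        print(f"CP = {cp:.0f} W, W' = {w_prime / 1000:.1f} kJ")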

  3. A methodology to incorporate life cycle analysis and the triple bottom line mechanism for sustainable management of industrial enterprises

    NASA Astrophysics Data System (ADS)

    Wang, Ling; Lin, Li

    2004-02-01

    Since 1970"s, the environmental protection movement has challenged industries to increase their investment in Environmentally Conscious Manufacturing (ECM) techniques and management tools. Social considerations for global citizens and their descendants also motivated the examination on the complex issues of sustainable development beyond the immediate economic impact. Consequently, industrial enterprises have started to understand sustainable development in considering the Triple Bottom Line (TBL): economic prosperity, environmental quality and social justice. For the management, however, a lack of systematic ECM methodologies hinders their effort in planning, evaluating, reporting and auditing of sustainability. To address this critical need, this research develops a framework of a sustainable management system by incorporating a Life Cycle Analysis (LCA) of industrial operations with the TBL mechanism. A TBL metric system with seven sets of indices for the TBL elements and their complex relations is identified for the comprehensive evaluation of a company"s sustainability performance. Utilities of the TBL indices are estimated to represent the views of various stakeholders, including the company, investors, employees and the society at large. Costs of these indices are also captured to reflect the company"s effort in meeting the utilities. An optimization model is formulated to maximize the economic, environmental and social benefits by the company"s effort in developing sustainable strategies. To promote environmental and social consciousness, the methodology can significantly facilitate management decisions by its capabilities of including "non-business" values and external costs that the company has not contemplated before.

  4. Monitoring osseointegration and developing intelligent systems (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Salvino, Liming W.

    2017-05-01

    Effective monitoring of structural and biological systems is an extremely important research area that enables technology development for future intelligent devices, platforms, and systems. This presentation provides an overview of research efforts funded by the Office of Naval Research (ONR) to establish structural health monitoring (SHM) methodologies in the human domain. Basic science efforts are needed to utilize SHM sensing, data analysis, modeling, and algorithms to obtain the relevant physiological and biological information for human-specific health and performance conditions. This overview of current research efforts is based on the Monitoring Osseointegrated Prosthesis (MOIP) program. MOIP develops implantable and intelligent prosthetics that are directly anchored to the bone of residual limbs. Through real-time monitoring, sensing, and responding to osseointegration of bones and implants as well as interface conditions and environment, our research program aims to obtain individualized actionable information for implant failure identification, load estimation, infection mitigation and treatment, as well as healing assessment. Looking ahead to achieve ultimate goals of SHM, we seek to expand our research areas to cover monitoring human, biological and engineered systems, as well as human-machine interfaces. Examples of such include 1) brainwave monitoring and neurological control, 2) detecting and evaluating brain injuries, 3) monitoring and maximizing human-technological object teaming, and 4) closed-loop setups in which actions can be triggered automatically based on sensors, actuators, and data signatures. Finally, some ongoing and future collaborations across different disciplines for the development of knowledge automation and intelligent systems will be discussed.

  5. Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.

    PubMed

    Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David

    2008-04-01

    A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park, Montana, USA, area and simulation modeling we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
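
    The Lincoln-Petersen idea described above is easy to sketch with the bias-corrected Chapman form (Python; the capture counts are invented for illustration, with hair-snag captures as the marking session and rub-tree captures as the recapture session):

        # Chapman (bias-corrected Lincoln-Petersen) estimator.
        def chapman(n1, n2, m):
            """n1 marked in session 1, n2 caught in session 2, m caught in both."""
            n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
            var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)
                   / ((m + 1) ** 2 * (m + 2)))
            return n_hat, var ** 0.5

        n_hat, se = chapman(n1=185, n2=142, m=48)   # hypothetical counts
        print(f"estimated population size: {n_hat:.0f} (SE {se:.0f})")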

  6. Effects of oil development in Arctic America

    USGS Publications Warehouse

    Reed, J.C.

    1970-01-01

    Large and important discoveries of petroleum were made in northern Alaska in 1968. The reserves were estimated then to be perhaps as much as ten thousand million barrels. Subsequent exploration has shown the resources to be much greater than was estimated earlier. Many problems must be solved before petroleum from northern Alaska reaches the world's markets. These problems are of three types: (1) those related to exploring, developing, and operating under the physical environments of the region; (2) those having to do with people, both the native people and those brought in from lower latitudes; and (3) those concerning the protection of the natural environments. The problems are great, but so also are the reserves of petroleum. To the extent that the problems are not solved, the cost of development and operation will be higher, the use of people will be expensive and unsatisfactory, and the natural environments will be threatened. The whole effort could be jeopardized on those grounds. © 1970.

  7. Mammalian cell culture process for monoclonal antibody production: nonlinear modelling and parameter estimation.

    PubMed

    Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad; Roman, Monica

    2015-01-01

    Monoclonal antibodies (mAbs) are at present one of the fastest growing products of the pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAb production processes is predominantly based on empirical knowledge, with improvements being achieved through trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. Using a dynamical model of this class of processes, an optimization-based technique for estimating the kinetic parameters of the mammalian cell culture process model is developed. The estimation is achieved by minimizing an error function with a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed here using a particular model of mammalian cell culture as a case study, but it is generic for this class of bioprocesses. The case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies.
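
    A minimal particle swarm sketch of this estimation strategy is given below (Python): PSO searches for parameters minimising a squared-error function. The Monod-type growth "model" is a stand-in for illustration, not the paper's mammalian cell culture model.

        import numpy as np

        rng = np.random.default_rng(2)

        def model(params, substrate):
            mu_max, k_s = params
            return mu_max * substrate / (k_s + substrate)

        substrate = np.linspace(0.1, 10, 25)
        observed = model([0.8, 1.5], substrate) + 0.01 * rng.normal(size=25)

        def error(params):
            return np.sum((model(params, substrate) - observed) ** 2)

        # Global-best PSO with inertia (0.7) and cognitive/social weights (1.5).
        n_particles, n_iter = 30, 200
        pos = rng.uniform([0.1, 0.1], [2.0, 5.0], (n_particles, 2))
        vel = np.zeros_like(pos)
        best_pos = pos.copy()
        best_err = np.array([error(p) for p in pos])
        g_best = best_pos[best_err.argmin()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (best_pos - pos) + 1.5 * r2 * (g_best - pos)
            pos = pos + vel
            errs = np.array([error(p) for p in pos])
            improved = errs < best_err
            best_pos[improved], best_err[improved] = pos[improved], errs[improved]
            g_best = best_pos[best_err.argmin()].copy()

        print("estimated [mu_max, K_s]:", g_best)   # should approach [0.8, 1.5]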

  8. Costs of Chronic Diseases at the State Level: The Chronic Disease Cost Calculator

    PubMed Central

    Murphy, Louise B.; Khavjou, Olga A.; Li, Rui; Maylahn, Christopher M.; Tangka, Florence K.; Nurmagambetov, Tursynbek A.; Ekwueme, Donatus U.; Nwaise, Isaac; Chapman, Daniel P.; Orenstein, Diane

    2015-01-01

    Introduction: Many studies have estimated national chronic disease costs, but state-level estimates are limited. The Centers for Disease Control and Prevention developed the Chronic Disease Cost Calculator (CDCC), which estimates state-level costs for arthritis, asthma, cancer, congestive heart failure, coronary heart disease, hypertension, stroke, other heart diseases, depression, and diabetes. Methods: Using publicly available and restricted secondary data from multiple national data sets from 2004 through 2008, disease-attributable annual per-person medical and absenteeism costs were estimated. Total state medical and absenteeism costs were derived by multiplying per-person costs from regressions by the number of people in the state treated for each disease. Medical costs were estimated for all payers and separately for Medicaid, Medicare, and private insurers. Projected medical costs for all payers (2010 through 2020) were calculated using medical costs and projected state population counts. Results: Median state-specific medical costs ranged from $410 million (asthma) to $1.8 billion (diabetes); median absenteeism costs ranged from $5 million (congestive heart failure) to $217 million (arthritis). Conclusion: CDCC provides methodologically rigorous chronic disease cost estimates. These estimates highlight possible areas of cost savings achievable through targeted prevention efforts or research into new interventions and treatments. PMID:26334712

  9. Mammalian Cell Culture Process for Monoclonal Antibody Production: Nonlinear Modelling and Parameter Estimation

    PubMed Central

    Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad

    2015-01-01

    Monoclonal antibodies (mAbs) are at present one of the fastest growing products of the pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAb production processes is predominantly based on empirical knowledge, with improvements being achieved through trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. Using a dynamical model of this class of processes, an optimization-based technique for estimating the kinetic parameters of the mammalian cell culture process model is developed. The estimation is achieved by minimizing an error function with a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed here using a particular model of mammalian cell culture as a case study, but it is generic for this class of bioprocesses. The case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies. PMID:25685797

  10. Estimating Total Deposition Using NADP & CASTNET Data (NADP 2016 poster)

    EPA Science Inventory

    For more than 40 years, efforts have been made to estimate total sulfur and nitrogen deposition in the United States using a combination of measured concentrations in precipitation and in the air, precipitation amounts for wet deposition, and various modeled or estimated depositi...

  11. Estimating carbon fluxes on small rotationally grazed pastures

    USDA-ARS?s Scientific Manuscript database

    Satellite-based Normalized Difference Vegetation Index (NDVI) data have been extensively used for estimating gross primary productivity (GPP) and yield of grazing lands throughout the world. Large-scale estimates of GPP are a necessary component of efforts to monitor the soil carbon balance of grazi...

  12. Air-water partition coefficients for a suite of polycyclic aromatic and other C10 through C20 unsaturated hydrocarbons.

    PubMed

    Rayne, Sierra; Forest, Kaya

    2016-09-18

    The air-water partition coefficients (Kaw) for 86 large polycyclic aromatic hydrocarbons and their unsaturated relatives were estimated using high-level G4(MP2) gas and aqueous phase calculations with the SMD, IEFPCM-UFF, and CPCM solvation models. An extensive method validation effort was undertaken which involved confirming that, via comparisons to experimental enthalpies of formation, gas-phase energies at the G4(MP2) level for the compounds of interest were at or near thermochemical accuracy. Investigations of the three solvation models using a range of neutral and ionic compounds suggested that while no clear preferential solvation model could be chosen in advance for accurate Kaw estimates of the target compounds, the employment of increasingly higher levels of theory would result in lower Kaw errors. Subsequent calculations on the polycyclic aromatic and unsaturated hydrocarbons at the G4(MP2) level revealed excellent agreement for the IEFPCM-UFF and CPCM models against limited available experimental data. The IEFPCM-UFF-G4(MP2) and CPCM-G4(MP2) solvation energy calculation approaches are anticipated to give Kaw estimates within typical experimental ranges, each having general Kaw errors of less than 0.5 log10 units. When applied to other large organic compounds, the method should allow development of a broad and reliable Kaw database for multimedia environmental modeling efforts on various contaminants.
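
    For orientation, a computed gas-to-water solvation free energy maps onto the dimensionless Kaw via log10(Kaw) = dG_solv / (2.303*R*T) under matching molar concentration standard states. A sketch (Python; the example free energy is hypothetical, not a value from the paper):

        import math

        R = 8.314462      # J/(mol K)
        T = 298.15        # K

        def log10_kaw(dg_solv_kj_per_mol: float) -> float:
            """Dimensionless air-water partition coefficient from the
            gas-to-aqueous solvation free energy (negative = favours water)."""
            return dg_solv_kj_per_mol * 1000.0 / (math.log(10) * R * T)

        print(f"dG_solv = -25 kJ/mol -> log10 Kaw = {log10_kaw(-25.0):.2f}")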

  13. Practical simplifications for radioimmunotherapy dosimetric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, S.; DeNardo, G.L.; O`Donnell, R.T.

    1999-01-01

    Radiation dosimetry is potentially useful for assessment and prediction of efficacy and toxicity for radionuclide therapy. The usefulness of these dose estimates relies on the establishment of a dose-response model using accurate pharmacokinetic data and a radiation dosimetric model. Due to the complexity in radiation dose estimation, many practical simplifications have been introduced in the dosimetric modeling for clinical trials of radioimmunotherapy. Although research efforts are generally needed to improve the simplifications used at each stage of model development, practical simplifications are often possible for specific applications without significant consequences to the dose-response model. In the development of dosimetric methods for radioimmunotherapy, practical simplifications in the dosimetric models were introduced. This study evaluated the magnitude of uncertainty associated with practical simplifications for: (1) organ mass of the MIRD phantom; (2) radiation contribution from target alone; (3) interpolation of S value; (4) macroscopic tumor uniformity; and (5) fit of tumor pharmacokinetic data.

  14. Glass Property Data and Models for Estimating High-Level Waste Glass Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Fluegel, Alexander; Kim, Dong-Sang

    2009-10-05

    This report describes recent efforts to develop glass property models that can be used to help estimate the volume of high-level waste (HLW) glass that will result from vitrification of Hanford tank waste. The compositions of acceptable and processable HLW glasses need to be optimized to minimize the waste-form volume and, hence, to save cost. A database of properties and associated compositions for simulated waste glasses was collected for developing property-composition models. This database, although not comprehensive, represents a large fraction of the data on waste-glass compositions and properties that were available at the time of this report. Glass property-composition models were fit to subsets of the database for several key glass properties. These models apply to a significantly broader composition space than those previously published. These models should be considered for interim use in calculating properties of Hanford waste glasses.

  15. Contributions of national and global health estimates to monitoring health-related sustainable development goals.

    PubMed

    Bundhamcharoen, Kanitta; Limwattananon, Supon; Kusreesakul, Khanitta; Tangcharoensathien, Viroj

    2016-01-01

    The millennium development goals triggered an increased demand for data on child and maternal mortalities for monitoring progress. With the advent of the sustainable development goals and growing evidence of an epidemiological transition toward non-communicable diseases, policymakers need data on mortality and disease trends and distribution to inform effective policies and support monitoring progress. Where there are limited capacities to produce national health estimates (NHEs), global health estimates (GHEs) can fill gaps for global monitoring and comparisons. This paper discusses lessons learned from Thailand's burden of disease (BOD) study on capacity development for NHEs and examines the contributions and limitations of GHEs in informing policies at the country level. Through training and technical support by external partners, capacities are gradually strengthened and institutionalized to enable regular updates of BOD at national and subnational levels. Initially, the quality of cause-of-death reporting in death certificates was inadequate, especially for deaths occurring in the community. Verbal autopsies were conducted, using domestic resources, to determine probable causes of deaths occurring in the community. This method helped to improve the estimation of years of life lost. Since the achievement of universal health coverage in 2002, the quality of clinical data on morbidities has also considerably improved. There are significant discrepancies between the Global Burden of Disease 2010 study estimates for Thailand and the 1999 nationally generated BOD, especially for years of life lost due to HIV/AIDS, and the ranking of priority diseases. National ownership of NHEs and an effective interface between researchers and decision-makers contribute to enhanced country policy responses, whereas subnational data are intended to be used by various subnational partners. Although GHEs contribute to benchmarking country achievement compared with global health commitments, they may hamper development of NHE capacities. GHEs should encourage and support countries to improve their data systems and develop a data infrastructure that supports the production of empirical data needed to underpin estimation efforts.

  16. Contributions of national and global health estimates to monitoring health-related Sustainable Development Goals in Thailand.

    PubMed

    Bundhamcharoen, Kanitta; Limwattananon, Supon; Kusreesakul, Khanitta; Tangcharoensathien, Viroj

    2017-01-01

    The Millennium Development Goals (MDGs) triggered increased demand for data on child and maternal mortality for monitoring progress. With the advent of the Sustainable Development Goals (SDGs) and growing evidence of an epidemiological transition towards non-communicable diseases, policy makers need data on mortality and disease trends and distribution to inform effective policies and support monitoring progress. Where there are limited capacities to produce national health estimates (NHEs), global health estimates (GHEs) can fill gaps for global monitoring and comparisons. This paper draws lessons learned from Thailand's burden of disease study (BOD) on capacity development for NHEs, and discusses the contributions and limitation of GHEs in informing policies at country level. Through training and technical support by external partners, capacities are gradually strengthened and institutionalized to enable regular updates of BOD at national and sub-national levels. Initially, the quality of cause of death reporting in the death certificates was inadequate, especially for deaths occurring in the community. Verbal autopsies were conducted, using domestic resources, to determine probable causes of deaths occurring in the community. This helped improve the estimation of years of life lost. Since the achievement of universal health coverage in 2002, the quality of clinical data on morbidities has also considerably improved. There are significant discrepancies between the 2010 Global Burden of Diseases (GBD) estimates for Thailand and the 1999 nationally generated BOD, especially for years of life lost due to HIV/AIDS, and the ranking of priority diseases. National ownership of NHEs and effective interfaces between researchers and decision makers contribute to enhanced country policy responses, while sub-national data are intended to be used by various sub-national-level partners. Though GHEs contribute to benchmarking country achievement compared with global health commitments, they may hamper development of NHE capacities. GHEs should encourage and support countries to improve their data systems and develop a data infrastructure that supports the production of empirical data needed to underpin estimation efforts.

  17. Military Participants at U.S. Atmospheric Nuclear Weapons Testing—Methodology for Estimating Dose and Uncertainty

    PubMed Central

    Till, John E.; Beck, Harold L.; Aanenson, Jill W.; Grogan, Helen A.; Mohler, H. Justin; Mohler, S. Shawn; Voillequé, Paul G.

    2014-01-01

    Methods were developed to calculate individual estimates of exposure and dose with associated uncertainties for a sub-cohort (1,857) of 115,329 military veterans who participated in at least one of seven series of atmospheric nuclear weapons tests or the TRINITY shot carried out by the United States. The tests were conducted at the Pacific Proving Grounds and the Nevada Test Site. Dose estimates to specific organs will be used in an epidemiological study to investigate leukemia and male breast cancer. Previous doses had been estimated for the purpose of compensation and were generally high-sided to favor the veteran's claim for compensation in accordance with public law. Recent efforts by the U.S. Department of Defense (DOD) to digitize the historical records supporting the veterans’ compensation assessments make it possible to calculate doses and associated uncertainties. Our approach builds upon available film badge dosimetry and other measurement data recorded at the time of the tests and incorporates detailed scenarios of exposure for each veteran based on personal, unit, and other available historical records. Film badge results were available for approximately 25% of the individuals, and these results assisted greatly in reconstructing doses to unbadged persons and in developing distributions of dose among military units. This article presents the methodology developed to estimate doses for selected cancer cases and a 1% random sample of the total cohort of veterans under study. PMID:24758578

  18. Towards Improving our Understanding on the Retrievals of Key Parameters Characterising Land Surface Interactions from Space: Introduction & First Results from the PREMIER-EO Project

    NASA Astrophysics Data System (ADS)

    Ireland, Gareth; North, Matthew R.; Petropoulos, George P.; Srivastava, Prashant K.; Hodges, Crona

    2015-04-01

    Acquiring accurate information on the spatio-temporal variability of soil moisture content (SM) and evapotranspiration (ET) is of key importance to extend our understanding of the Earth system's physical processes, and is also required in a wide range of multi-disciplinary research studies and applications. The utility and applicability of Earth Observation (EO) technology provides an economically feasible solution to derive continuous spatio-temporal estimates of key parameters characterising land surface interactions, including ET as well as SM. Such information is of key value to practitioners, decision makers and scientists alike. The PREMIER-EO project, recently funded by High Performance Computing Wales (HPCW), is a research initiative directed towards developing a better understanding of EO technology's present ability to derive operational estimates of surface fluxes and SM. Moreover, the project aims to address knowledge gaps related to the operational estimation of such parameters, and thus to contribute to ongoing global efforts to enhance the accuracy of those products. In this presentation we introduce the PREMIER-EO project, providing a detailed overview of the research aims and objectives for the project's one-year implementation. We then present the initial results of the work carried out, in particular a robust evaluation of the accuracy of existing operational ET and SM products across different ecosystems globally. The research outcomes of this project, once completed, will provide an important contribution towards addressing the knowledge gaps related to the operational estimation of ET and SM. The project's results will also support ongoing global efforts towards the operational development of related products using technologically advanced EO instruments launched recently or planned for launch in the next 1-2 years. Key Words: PREMIER-EO, HPC Wales, Soil Moisture, Evapotranspiration, Earth Observation

  19. Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.

    1997-01-01

    The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate navigation algorithms implemented on GEODE are also discussed. In addition, recommendations for generalization of GEAS functions and for new techniques to optimize the accuracy and control of the GPS autonomous onboard navigation are presented.

  20. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  1. Computer-Aided Drug Discovery Approaches against the Tropical Infectious Diseases Malaria, Tuberculosis, Trypanosomiasis, and Leishmaniasis.

    PubMed

    Njogu, Peter M; Guantai, Eric M; Pavadai, Elumalai; Chibale, Kelly

    2016-01-08

    Despite the tremendous improvement in overall global health heralded by the adoption of the Millennium Declaration in the year 2000, tropical infections remain a major health problem in the developing world. Recent estimates indicate that the major tropical infectious diseases, namely, malaria, tuberculosis, trypanosomiasis, and leishmaniasis, account for more than 2.2 million deaths and a loss of approximately 85 million disability-adjusted life years annually. The crucial role of chemotherapy in curtailing the deleterious health and economic impacts of these infections has invigorated the search for new drugs against tropical infectious diseases. The research efforts have involved increased application of computational technologies in mainstream drug discovery programs at the hit identification, hit-to-lead, and lead optimization stages. This review highlights various computer-aided drug discovery approaches that have been utilized in efforts to identify novel antimalarial, antitubercular, antitrypanosomal, and antileishmanial agents. The focus is largely on developments over the past 5 years (2010-2014).

  2. Effects of the 2008 flood on economic performance and food security in Yemen: a simulation analysis.

    PubMed

    Breisinger, Clemens; Ecker, Olivier; Thiele, Rainer; Wiebelt, Manfred

    2016-04-01

    Extreme weather events such as floods and droughts can have devastating consequences for individual well-being and economic development, in particular in poor societies with limited availability of coping mechanisms. Combining a dynamic computable general equilibrium model of the Yemeni economy with a household-level calorie consumption simulation model, this paper assesses the economy-wide, agricultural and food security effects of the 2008 tropical storm and flash flood that hit the Hadramout and Al-Mahrah governorates. The estimation results suggest that agricultural value added, farm household incomes and rural food security deteriorated long term in the flood-affected areas. Due to economic spillover effects, significant income losses and increases in food insecurity also occurred in areas that were unaffected by flooding. This finding suggests that while most relief efforts are typically concentrated in directly affected areas, future efforts should also consider surrounding areas and indirectly affected people.

  3. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike in the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system level trades and analyses. It draws upon the legacy of previous architecture level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.
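
    A minimal sketch of the kind of parametric cost estimating relationship (CER) such models rely on, assuming a power-law form fit to hypothetical mass/cost pairs; LCAM's actual CERs and calibration data are not public.

```python
import numpy as np

# Hypothetical historical data: element dry mass (kg) vs development cost ($M).
mass = np.array([500.0, 1200.0, 3000.0, 7500.0, 15000.0])
cost = np.array([80.0, 150.0, 310.0, 600.0, 1000.0])

# Fit the power-law CER  cost = a * mass^b  by least squares in log space.
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost = {a:.2f} * mass^{b:.2f}")

# Apply the CER to a new 5,000 kg element (relative, low-fidelity estimate).
print(f"estimate for 5000 kg: ${a * 5000.0 ** b:,.0f}M")
```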

  4. Predicting Brook Trout occurrence in stream reaches throughout their native range in the eastern United States

    USGS Publications Warehouse

    DeWeber, Jefferson Tyrell; Wagner, Tyler

    2015-01-01

    The Brook Trout Salvelinus fontinalis is an important species of conservation concern in the eastern USA. We developed a model to predict Brook Trout population status within individual stream reaches throughout the species’ native range in the eastern USA. We utilized hierarchical logistic regression with Bayesian estimation to predict Brook Trout occurrence probability, and we allowed slopes and intercepts to vary among ecological drainage units (EDUs). Model performance was similar for 7,327 training samples and 1,832 validation samples based on the area under the receiver operating curve (∼0.78) and Cohen's kappa statistic (0.44). Predicted water temperature had a strong negative effect on Brook Trout occurrence probability at the stream reach scale and was also negatively associated with the EDU average probability of Brook Trout occurrence (i.e., EDU-specific intercepts). The effect of soil permeability was positive but decreased as EDU mean soil permeability increased. Brook Trout were less likely to occur in stream reaches surrounded by agricultural or developed land cover, and an interaction suggested that agricultural land cover also resulted in an increased sensitivity to water temperature. Our model provides a further understanding of how Brook Trout are shaped by habitat characteristics in the region and yields maps of stream-reach-scale predictions, which together can be used to support ongoing conservation and management efforts. These decision support tools can be used to identify the extent of potentially suitable habitat, estimate historic habitat losses, and prioritize conservation efforts by selecting suitable stream reaches for a given action. Future work could extend the model to account for additional landscape or habitat characteristics, include biotic interactions, or estimate potential Brook Trout responses to climate and land use changes.
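
    A minimal sketch of the two validation metrics reported above (area under the ROC curve and Cohen's kappa), computed here on synthetic stand-in data rather than the Brook Trout dataset.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, cohen_kappa_score

rng = np.random.default_rng(0)

# Stand-in validation data: true occurrence (0/1) per stream reach and the
# model's predicted occurrence probabilities (synthetic, loosely skillful).
y_true = rng.integers(0, 2, size=1832)
y_prob = np.clip(0.3 * y_true + rng.uniform(0.0, 0.7, size=1832), 0.0, 1.0)

auc = roc_auc_score(y_true, y_prob)                             # threshold-free skill
kappa = cohen_kappa_score(y_true, (y_prob >= 0.5).astype(int))  # 0.5 cutoff
print(f"AUC = {auc:.2f}, kappa = {kappa:.2f}")
```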

  5. Development of visible/infrared/microwave agriculture classification and biomass estimation algorithms. [Guymon, Oklahoma and Dalhart, Texas]

    NASA Technical Reports Server (NTRS)

    Rosenthal, W. D.; Mcfarland, M. J.; Theis, S. W.; Jones, C. L. (Principal Investigator)

    1982-01-01

    Agricultural crop classification models using two or more spectral regions (visible through microwave) are considered in an effort to estimate biomass at Guymon, Oklahoma, and Dalhart, Texas. Both ground truth and aerial data were used. Results indicate that inclusion of C, L, and P band active microwave data, from look angles greater than 35 deg from nadir, with visible and infrared data improves crop discrimination and biomass estimates compared to results using only visible and infrared data. The microwave frequencies were sensitive to different biomass levels. The K and C bands were sensitive to differences at low biomass levels, while the P band was sensitive to differences at high biomass levels. Two indices, one using only active microwave data and the other using data from the middle and near infrared bands, were well correlated to total biomass. It is implied that inclusion of active microwave sensors with visible and infrared sensors on future satellites could aid in crop discrimination and biomass estimation.

  6. Resistance of neonates and field-collected garter snakes (Thamnophis spp.) to tetrodotoxin.

    PubMed

    Ridenhour, Benjamin J; Brodie, Edmund D; Brodie, Edmund D

    2004-01-01

    Prior studies of tetrodotoxin (TTX) resistance in garter snakes (Thamnophis spp.) have used laboratory-reared neonates as subjects, but the use of field-caught individuals would reduce cost and effort. We compared estimates of TTX resistance in field-caught and laboratory-born garter snakes. We found that a mass-adjusted dose of TTX administered to field-caught garter snakes produces an estimate of a population 50% dose that is comparable and unbiased with respect to those previously reported using laboratory-born neonates. Dose-response curves estimated for three field-caught populations closely matched the curves estimated from neonate data. The method was tested using populations with levels of TTX resistance ranging from approximately 5 to 90 mass-adjusted mouse units for their respective 50% doses. The technique of using field-caught snakes as test subjects provides larger genetically independent data sets that are more easily obtained. Our results indicate that changes in mass during development parallel ontogenetic shifts in TTX resistance.
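
    A hedged sketch of estimating a population 50% dose from dose-response data, assuming a two-parameter log-logistic curve and hypothetical response values; the study's actual curve-fitting procedure may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dose-response data: mass-adjusted TTX dose (mouse units, MU)
# and observed response (fraction of baseline performance lost, 0-1).
dose = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 90.0])
resp = np.array([0.05, 0.12, 0.30, 0.52, 0.71, 0.90, 0.97])

def log_logistic(d, d50, slope):
    """Two-parameter log-logistic curve; equals 0.5 exactly at d = d50."""
    return 1.0 / (1.0 + (d50 / d) ** slope)

(d50, slope), _ = curve_fit(log_logistic, dose, resp, p0=[10.0, 1.0])
print(f"estimated 50% dose = {d50:.1f} MU (slope = {slope:.2f})")
```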

  7. Building Foundations for Nuclear Security Enterprise Analysis Utilizing Nuclear Weapon Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josserand, Terry Michael; Young, Leone; Chamberlin, Edwin Phillip

    The Nuclear Security Enterprise, managed by the National Nuclear Security Administration - a semiautonomous agency within the Department of Energy - has been associated with numerous assessments with respect to the estimating, management capabilities, and practices pertaining to nuclear weapon modernization efforts. This report identifies challenges in estimating and analyzing the Nuclear Security Enterprise through an analysis of analogous timeframe conditions utilizing two types of nuclear weapon data - (1) a measure of effort and (2) a function of time. The analysis of analogous timeframe conditions that utilizes only two types of nuclear weapon data yields four summary observations that estimators and analysts of the Nuclear Security Enterprise will find useful.

  8. Assessing fish predation on migrating juvenile steelhead and a retrospective comparison to steelhead survival through the Priest Rapids Hydroelectric Project, Columbia River, Washington, 2009-11

    USGS Publications Warehouse

    Hardiman, Jill M.; Counihan, Timothy D.; Burgess, Dave S.; Simmons, Katrina E.; Holmberg, Glen S.; Rogala, Josh; Polacek, Rochelle

    2012-01-01

    The U.S. Geological Survey (USGS) and the Washington Department of Fish and Wildlife (WDFW) have been working with the Public Utility District No. 2 of Grant County, Washington (Grant PUD), to increase their understanding of predator-prey interactions in the Priest Rapids Hydroelectric Project (PRP), Columbia River, Washington. For this study, the PRP is defined as the area approximately 6 kilometers upstream of Wanapum Dam to the Priest Rapids Dam tailrace, 397.1 miles from the mouth of the Columbia River. Low survival of juvenile steelhead (Oncorhynchus mykiss) through Wanapum and Priest Rapids Dams in past years has prompted Grant PUD, on behalf of the Priest Rapids Coordinating Committee, to focus research efforts on steelhead migration and potential causal mechanisms for low survival. Steelhead passage survival in 2009 was estimated at 0.944 through the Wanapum Development (dam and reservoir) and 0.881 through the Priest Rapids Development; for 2010, steelhead survival was 0.855 for the Wanapum Development and 0.904 for the Priest Rapids Development. The USGS and WDFW implemented field collection efforts in 2011 for northern pikeminnow (Ptychocheilus oregonensis), smallmouth bass (Micropterus dolomieu), and walleye (Sander vitreus, formerly Stizostedion vitreum) and their diets in the PRP. For predator indexing, we collected 948 northern pikeminnow, 237 smallmouth bass, 18 walleye, and two largemouth bass (Micropterus salmoides). The intent of this study was to provide standardized predation indices within individual reaches of the PRP to discern spatial variability in predation patterns. Furthermore, the results of the 2011 study were compared to results of a concurrent steelhead survival study. Our results do not indicate excessively high predation on Oncorhynchus spp. by northern pikeminnow or smallmouth bass in any particular reach throughout the study area. Although we found Oncorhynchus spp. in the predator diets, the relative proportion was small. Predation index values in 2011 were highest in the Priest Rapids mid-reservoir reach for northern pikeminnow and smallmouth bass. Predation indices generally were high in the tailrace areas for northern pikeminnow, and high in the forebay areas for smallmouth bass. Steelhead survival in 2011 was consistently high throughout the study period and the PRP, and predation indices were relatively low, which suggests that fish predation did not significantly affect steelhead survival throughout the study area. Our efforts to correlate retrospective predation indices with survival estimates for 2009 and 2010 did provide some evidence of high predation in some of the same reaches that had low steelhead survival, such as the Priest Rapids tailrace in 2009. For 2010, however, our results indicated that losses of salmonids to predation were less consistent with the survival results, in that predation indices were higher for reaches in the Priest Rapids Development than in the Wanapum Development. Establishing correlations between steelhead survival and observed predation indices for the previous research years, 2009 and 2010, was confounded by the lack of coordination between the two studies during the initial study design and implementation period. Future efforts to correlate steelhead survival with fish predation would benefit from better coordination of the studies, with consistent study reaches and better timing of concurrent sampling.

  9. Current research efforts with Bacillus thuringiensis

    Treesearch

    Normand R. Dubois

    1991-01-01

    The bioassay of 260 strains of Bacillus thuringiensis (Bt) and 70 commercial preparations shows that regression coefficient estimates may be as critical as LC50 estimates when evaluating them for future consideration.

  10. Forecasting the impact of virtual environment technology on maintenance training

    NASA Technical Reports Server (NTRS)

    Schlager, Mark S.; Boman, Duane; Piantanida, Tom; Stephenson, Robert

    1993-01-01

    To assist NASA and the Air Force in determining how and when to invest in virtual environment (VE) technology for maintenance training, we identified possible roles for VE technology in such training, assessed its cost-effectiveness relative to existing technologies, and formulated recommendations for a research agenda that would address instructional and system development issues involved in fielding a VE training system. In the first phase of the study, we surveyed VE developers to forecast capabilities, maturity, and estimated costs for VE component technologies. We then identified maintenance tasks and their training costs through interviews with maintenance technicians, instructors, and training developers. Ten candidate tasks were selected from two classes of maintenance tasks (seven aircraft maintenance and three space maintenance) using five criteria developed to identify types of tasks most likely to benefit from VE training. Three tasks were used as specific cases for cost-benefit analysis. In formulating research recommendations, we considered three aspects of feasibility: technological considerations, cost-effectiveness, and anticipated R&D efforts. In this paper, we describe the major findings in each of these areas and suggest research efforts that we believe will help achieve the goal of a cost-effective VE maintenance training system by the next decade.

  11. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  12. Distributed Sensing and Shape Control of Piezoelectric Bimorph Mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redmond, James M.; Barney, Patrick S.; Henson, Tammy D.

    1999-07-28

    As part of a collaborative effort between Sandia National Laboratories and the University of Kentucky to develop a deployable mirror for remote sensing applications, research in shape sensing and control algorithms that leverage the distributed nature of electron gun excitation for piezoelectric bimorph mirrors is summarized. A coarse shape sensing technique is developed that uses reflected light rays from the sample surface to provide discrete slope measurements. Estimates of surface profiles are obtained with a cubic spline curve fitting algorithm. Experiments on a PZT bimorph illustrate appropriate deformation trends as a function of excitation voltage. A parallel effort to effect desired shape changes through electron gun excitation is also summarized. A one dimensional model-based algorithm is developed to correct profile errors in bimorph beams. A more useful two dimensional algorithm is also developed that relies on measured voltage-curvature sensitivities to provide corrective excitation profiles for the top and bottom surfaces of bimorph plates. The two algorithms are illustrated using finite element models of PZT bimorph structures subjected to arbitrary disturbances. Corrective excitation profiles that yield desired parabolic forms are computed, and are shown to provide the necessary corrective action.

  13. Ocean heat content estimation from in situ observations at the National Centers for Environmental Information: Improvements and Uncertainties

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Locarnini, R. A.; Mishonov, A. V.; Reagan, J. R.; Seidov, D.; Zweng, M.; Levitus, S.

    2017-12-01

    Ocean heat uptake is the major factor in sequestering the Earth's Energy Imbalance (EEI). Since 2000, the National Centers for Environmental Information (NCEI) have been estimating historical ocean heat content (OHC) changes back to the 1950s, as well as monitoring recent OHC. Over these years, through worldwide community efforts, methods of calculating OHC have substantially improved. Similarly, estimation of the uncertainty of ocean heat content calculations provides new insight into how well EEI estimates can be constrained using in situ measurements and models. The changing ocean observing system, especially with the near-global year-round coverage afforded by Argo, has also allowed more confidence in regional and global OHC estimates and provided a benchmark for better understanding of historical OHC changes. NCEI is incorporating knowledge gained through these global efforts into the basic methods, instrument bias corrections, uncertainty measurements, and temporal and spatial resolution capabilities of historic OHC change estimation and recent monitoring. The nature of these improvements and their consequences for estimation of OHC in relation to the EEI will be discussed.
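
    The underlying quantity is easy to make concrete: the OHC anomaly per unit area is the depth integral of rho * cp * deltaT. The short worked example below uses an illustrative temperature anomaly profile, not an NCEI product.

```python
import numpy as np

RHO = 1025.0   # seawater density, kg/m^3
CP = 3985.0    # specific heat of seawater, J/(kg*K)

def ohc_anomaly(depth_m, dT_k):
    """Ocean heat content anomaly per unit area (J/m^2): trapezoidal
    integral of rho * cp * deltaT(z) over depth."""
    depth_m, dT_k = np.asarray(depth_m), np.asarray(dT_k)
    integrand = RHO * CP * dT_k
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(depth_m)))

# Illustrative profile: 0.3 K surface warming decaying with depth to 700 m.
z = np.linspace(0.0, 700.0, 71)
dT = 0.3 * np.exp(-z / 200.0)
print(f"OHC anomaly, 0-700 m: {ohc_anomaly(z, dT):.2e} J/m^2")
```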

  14. Liquid Oxygen/Liquid Methane Component Technology Development at MSFC

    NASA Technical Reports Server (NTRS)

    Robinson, Joel W.

    2010-01-01

    The National Aeronautics & Space Administration (NASA) has identified Liquid Oxygen (LOX)/Liquid Methane (LCH4) as a potential propellant combination for future space vehicles based upon exploration studies. The technology is estimated to have higher performance and lower overall systems mass compared to existing hypergolic propulsion systems. In addition to existing in-house risk reduction activities, NASA has solicited industry participation in component technologies with potential application to the lunar ascent main engine (AME). Contracted and NASA efforts have ranged from valve technologies to engine system testbeds. The application for the AME is anticipated to be an expendable, pressure-fed engine for ascent from the moon at completion of its lunar stay. Additionally, the hardware is expected to provide an abort capability prior to landing, in the event that descent systems malfunction. For the past 4 years, MSFC has been working with the Glenn Research Center and the Johnson Space Center on methane technology development. This paper will focus on efforts specific to MSFC in pursuing ignition, injector performance, chamber material assessments and cryogenic valve technologies. Ignition studies have examined characteristics for torch, spark and microwave systems. Injector testing has yielded insight into combustion performance for shear, swirl and impinging type injectors. The majority of chamber testing has been conducted with ablative and radiatively cooled chambers with planned activities for regenerative and transpiration cooled chambers. Lastly, an effort is underway to examine the long duration exposure issues of cryogenic valve internal components. The paper will summarize the status of these efforts.

  15. Constrained Kalman Filtering Via Density Function Truncation for Turbofan Engine Health Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Dan; Simon, Donald L.

    2006-01-01

    Kalman filters are often used to estimate the state variables of a dynamic system. However, in the application of Kalman filters some known signal information is often either ignored or dealt with heuristically. For instance, state variable constraints (which may be based on physical considerations) are often neglected because they do not fit easily into the structure of the Kalman filter. This paper develops an analytic method of incorporating state variable inequality constraints in the Kalman filter. The resultant filter truncates the PDF (probability density function) of the Kalman filter estimate at the known constraints and then computes the constrained filter estimate as the mean of the truncated PDF. The incorporation of state variable constraints increases the computational effort of the filter but significantly improves its estimation accuracy. The improvement is demonstrated via simulation results obtained from a turbofan engine model. The turbofan engine model contains 3 state variables, 11 measurements, and 10 component health parameters. It is also shown that the truncated Kalman filter may be a more accurate way of incorporating inequality constraints than other constrained filters (e.g., the projection approach to constrained filtering).
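
    The core operation described, replacing the unconstrained estimate with the mean of the PDF truncated at the constraints, reduces in one dimension to the moments of a truncated normal; a minimal sketch with illustrative numbers, not the engine model's actual states:

```python
from scipy.stats import truncnorm

def constrained_estimate(mean, std, lower, upper):
    """Mean/std of a Gaussian Kalman estimate after truncating its PDF to
    the admissible interval [lower, upper], shown for one state variable."""
    a, b = (lower - mean) / std, (upper - mean) / std   # standardized bounds
    dist = truncnorm(a, b, loc=mean, scale=std)
    return dist.mean(), dist.std()

# Hypothetical health parameter estimated at 1.02 +/- 0.05 but physically
# bounded above by 1.0: truncation pulls the estimate inside the constraint.
m, s = constrained_estimate(1.02, 0.05, float("-inf"), 1.0)
print(f"constrained estimate: {m:.4f} +/- {s:.4f}")
```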

  16. Estimating Evapotranspiration with Land Data Assimilation Systems

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, C. D.; Kumar, S. V.; Mocko, D. M.; Tian, Y.

    2011-01-01

    Advancements in both land surface models (LSM) and land surface data assimilation, especially over the last decade, have substantially advanced the ability of land data assimilation systems (LDAS) to estimate evapotranspiration (ET). This article provides a historical perspective on international LSM intercomparison efforts and the development of LDAS systems, both of which have improved LSM ET skill. In addition, an assessment of ET estimates for current LDAS systems is provided along with current research that demonstrates improvement in LSM ET estimates due to assimilating satellite-based soil moisture products. Using the Ensemble Kalman Filter in the Land Information System, we assimilate both NASA and Land Parameter Retrieval Model (LPRM) soil moisture products into the Noah LSM Version 3.2 with the North American LDAS phase 2 (NLDAS-2) forcing to mimic the NLDAS-2 configuration. Through comparisons with two global reference ET products, one based on interpolated flux tower data and one from a new satellite ET algorithm, over the NLDAS-2 domain, we demonstrate improvement in ET estimates only when assimilating the LPRM soil moisture product.
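
    A minimal sketch of the stochastic Ensemble Kalman Filter analysis step used in such assimilation systems, applied to a toy two-layer soil moisture state; the Land Information System implementation is far more elaborate.

```python
import numpy as np

def enkf_update(ensemble, y_obs, obs_std, H, seed=42):
    """Stochastic Ensemble Kalman Filter analysis step.
    ensemble: (n_members, n_state) prior samples; y_obs: scalar observation;
    obs_std: observation error std; H: (n_state,) linear observation operator."""
    rng = np.random.default_rng(seed)
    n = ensemble.shape[0]
    hx = ensemble @ H                                  # predicted observations
    x_mean, hx_mean = ensemble.mean(axis=0), hx.mean()
    cov_xy = (ensemble - x_mean).T @ (hx - hx_mean) / (n - 1)
    var_y = hx.var(ddof=1) + obs_std ** 2
    gain = cov_xy / var_y                              # Kalman gain, (n_state,)
    y_pert = y_obs + rng.normal(0.0, obs_std, n)       # perturbed observations
    return ensemble + np.outer(y_pert - hx, gain)

# Toy: assimilate a satellite surface soil moisture retrieval (0.20 m3/m3)
# into a two-layer state; the root zone moves only via sampled correlation.
rng = np.random.default_rng(0)
prior = rng.normal([0.25, 0.30], [0.05, 0.04], size=(100, 2))
post = enkf_update(prior, y_obs=0.20, obs_std=0.02, H=np.array([1.0, 0.0]))
print(post.mean(axis=0))
```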

  17. Variance Estimation, Design Effects, and Sample Size Calculations for Respondent-Driven Sampling

    PubMed Central

    2006-01-01

    Hidden populations, such as injection drug users and sex workers, are central to a number of public health problems. However, because of the nature of these groups, it is difficult to collect accurate information about them, and this difficulty complicates disease prevention efforts. A recently developed statistical approach called respondent-driven sampling improves our ability to study hidden populations by allowing researchers to make unbiased estimates of the prevalence of certain traits in these populations. Yet, not enough is known about the sample-to-sample variability of these prevalence estimates. In this paper, we present a bootstrap method for constructing confidence intervals around respondent-driven sampling estimates and demonstrate in simulations that it outperforms the naive method currently in use. We also use simulations and real data to estimate the design effects for respondent-driven sampling in a number of situations. We conclude with practical advice about the power calculations that are needed to determine the appropriate sample size for a study using respondent-driven sampling. In general, we recommend a sample size twice as large as would be needed under simple random sampling. PMID:16937083
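
    A minimal sketch of a percentile bootstrap confidence interval; note that for brevity this resamples iid, whereas the paper's method resamples along recruitment chains to respect the dependence structure of RDS data.

```python
import numpy as np

def bootstrap_ci(estimator, sample, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for a prevalence estimator (iid resampling
    sketch; RDS-aware resampling would follow recruitment chains)."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    stats = [estimator(sample[rng.integers(0, n, n)]) for _ in range(n_boot)]
    lo, hi = np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])
    return lo, hi

trait = np.random.default_rng(0).integers(0, 2, size=500)  # 0/1 trait flags
print(bootstrap_ci(np.mean, trait))   # e.g., (0.45, 0.54)
```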

  18. Assessing angler effort, catch, and harvest and the efficacy of a use-estimation system on a multi-lake fishery in middle Georgia

    USGS Publications Warehouse

    Roop, Hunter J.; Poudyal, Neelam C.; Jennings, Cecil A.

    2018-01-01

    Creel surveys are valuable tools in recreational fisheries management. However, multiple‐impoundment fisheries of complex spatial structure can complicate survey designs and pose logistical challenges for management agencies. Marben Public Fishing Area in Mansfield, GA is a multi‐impoundment fishery with many access points, and these features prevent or complicate use of traditional on‐site contact methods such as standard roving‐ or access‐point designs because many anglers may be missed during the survey process. Therefore, adaptation of a traditional survey method is often required for sampling this special case of multi‐lake fisheries to develop an accurate fishery profile. Accordingly, a modified non‐uniform probability roving creel survey was conducted at the Marben PFA during 2013 to estimate fishery characteristics relating to fishing effort, catch, and fish harvest. Monthly fishing effort averaged 7,523 angler‐hours (h) (SD = 5,956) and ranged from 1,301 h (SD = 562) in December to 21,856 h (SD = 5,909) in May. A generalized linear mixed model was used to determine that angler catch and harvest rates were significantly higher in the spring and summer (all p < 0.05) than in the other seasons, but did not vary by fishing location. Our results demonstrate the utility of modifying existing creel methodology for monitoring small, spatially complex, intensely managed impoundments that support quality recreational fisheries and provide a template for the assessment and management of similar regional fisheries.

  19. Comparing the performance of FA, DFA and DMA using different synthetic long-range correlated time series

    PubMed Central

    Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier

    2012-01-01

    Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
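
    For concreteness, a compact implementation of first-order DFA (linear detrending), one of the four estimators compared; for white noise the estimated exponent should come out near 0.5.

```python
import numpy as np

def dfa_alpha(x, scales):
    """First-order DFA: returns the scaling exponent alpha from F(s) ~ s^alpha
    (alpha equals the Hurst index for stationary signals)."""
    y = np.cumsum(x - np.mean(x))               # the "profile"
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segs]                  # variance after linear detrend
        flucts.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

x = np.random.default_rng(0).normal(size=10000)               # white noise
print(f"alpha = {dfa_alpha(x, [16, 32, 64, 128, 256]):.2f}")  # ~0.5
```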

  20. Dasymetric high resolution population distribution estimates for improved decision making, with a case study of sea-level rise vulnerability in Boca Raton, Florida

    NASA Astrophysics Data System (ADS)

    Ziegler, Hannes Moritz

    Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits offered from integrating image classification and dasymetric mapping at the household level to provide detailed small area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on mapping methods by NOAA is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered through this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
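
    A minimal sketch of the core dasymetric step, allocating a census block's population to residential buildings in proportion to floor area; all numbers are illustrative, with real inputs coming from parcel data and image classification.

```python
# Core dasymetric step: redistribute a coarse census count to buildings
# in proportion to livable floor area (illustrative values only).
buildings = {"b1": 220.0, "b2": 450.0, "b3": 330.0}   # floor area, m^2
block_population = 57

total_area = sum(buildings.values())
estimates = {b: block_population * a / total_area for b, a in buildings.items()}
print(estimates)   # per-building estimates that sum to the block total

# Overlaying an inundation polygon then reduces displacement estimation to
# summing the estimates for the buildings inside it, e.g. b1 and b3.
print(f"displaced: {estimates['b1'] + estimates['b3']:.1f}")
```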

  1. System Identification Applied to Dynamic CFD Simulation and Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.; Vicroy, Dan D.

    2011-01-01

    Demanding aerodynamic modeling requirements for military and civilian aircraft have provided impetus for researchers to improve computational and experimental techniques. Model validation is a key component for these research endeavors, so this study is an initial effort to extend conventional time history comparisons by comparing model parameter estimates and their standard errors using system identification methods. An aerodynamic model of an aircraft performing one-degree-of-freedom roll oscillatory motion about its body axes is developed. The model includes linear aerodynamics and deficiency function parameters characterizing an unsteady effect. For estimation of unknown parameters, two techniques, harmonic analysis and two-step linear regression, were applied to roll-oscillatory wind tunnel data and to computational fluid dynamics (CFD) simulated data. The model used for this study is a highly swept wing unmanned aerial combat vehicle. Differences in response prediction, parameter estimates, and standard errors are compared and discussed.
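
    A minimal sketch of producing the quantities being compared, parameter estimates with standard errors, via ordinary least squares on a toy rolling-moment model; the paper's harmonic analysis and two-step regression are more involved, and all numbers below are synthetic.

```python
import numpy as np

def ols_with_se(X, y):
    """Ordinary least squares returning parameter estimates and their
    standard errors, i.e., the quantities compared across data sources."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, p = X.shape
    sigma2 = np.sum((y - X @ theta) ** 2) / (n - p)   # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)             # parameter covariance
    return theta, np.sqrt(np.diag(cov))

# Toy rolling-moment model: C_l = Clp * phat + Clda * delta_a + noise.
rng = np.random.default_rng(0)
phat = rng.uniform(-1.0, 1.0, 200)      # nondimensional roll rate
da = rng.uniform(-1.0, 1.0, 200)        # aileron deflection
y = -0.4 * phat + 0.15 * da + rng.normal(0.0, 0.01, 200)
theta, se = ols_with_se(np.column_stack([phat, da]), y)
print("estimates:", theta, "standard errors:", se)
```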

  2. Estimation of Energy Expenditure Using a Patch-Type Sensor Module with an Incremental Radial Basis Function Neural Network

    PubMed Central

    Li, Meina; Kwak, Keun-Chang; Kim, Youn Tae

    2016-01-01

    Conventionally, indirect calorimetry has been used to estimate oxygen consumption in an effort to accurately measure human body energy expenditure. However, calorimetry requires the subject to wear a mask that is neither convenient nor comfortable. The purpose of our study is to develop a patch-type sensor module with an embedded incremental radial basis function neural network (RBFNN) for estimating the energy expenditure. The sensor module contains one ECG electrode and a three-axis accelerometer, and can perform real-time heart rate (HR) and movement index (MI) monitoring. The embedded incremental network includes linear regression (LR) and RBFNN based on context-based fuzzy c-means (CFCM) clustering. This incremental network is constructed by building a collection of information granules through CFCM clustering that is guided by the distribution of error of the linear part of the LR model. PMID:27669249
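
    A simplified stand-in for the network described: an RBF regressor with fixed Gaussian centers and a least-squares readout, trained on synthetic heart-rate/movement-index data; the paper's incremental CFCM-guided construction is more sophisticated.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian hidden-layer activations plus a bias column."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.column_stack([np.exp(-d2 / (2.0 * width ** 2)), np.ones(len(X))])

# Synthetic data: predict energy expenditure (kcal/min) from heart rate (HR)
# and a movement index (MI); the generating rule below is made up.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(60, 160, 300), rng.uniform(0, 1, 300)])
y = 0.05 * (X[:, 0] - 60.0) + 3.0 * X[:, 1] + rng.normal(0.0, 0.2, 300)

centers = X[rng.choice(300, size=10, replace=False)]   # fixed random centers
w, *_ = np.linalg.lstsq(rbf_design(X, centers, 30.0), y, rcond=None)
print(rbf_design(X[:5], centers, 30.0) @ w)   # predictions for 5 samples
```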

  3. Climate Change: Integrating Science and Economics

    NASA Astrophysics Data System (ADS)

    Prinn, R. G.

    2008-12-01

    The world is facing an ever-growing conflict between environment and development. Climate change is a century-scale threat requiring a century-long effort in science, technology and policy analysis, and institutions that can sustain this effort over generations. To inform policy development and implementation, there is an urgent need for better integration of the diverse components of the problem. Motivated by this challenge, we have developed the Integrated Global System Model (IGSM) at MIT. It comprises coupled sub-models of economic development, atmospheric chemistry, climate dynamics and ecosystems. The results of a recent uncertainty analysis involving hundreds of runs of the IGSM imply that, without mitigation policies, the global average surface temperature may rise much faster than previously estimated. Polar temperatures are projected to rise even faster than the global average, with great risks for high-latitude ecosystems and ice sheets at the high end of this range. Analysis of policies for climate mitigation shows that the greatest effect of these policies is to lower the probability of extreme changes rather than to lower the medians. Faced with the above estimated impacts, the long lifetimes of most greenhouse gases in the atmosphere, the long delay in ultimate warming due to ocean heat uptake, and the capital-intensive global energy infrastructure, the case is strong for concerted action now. Results of runs of the IGSM indicate the need for transformation of the global energy industry on a very large scale to mitigate climate change. Carbon sequestration, renewable energy sources, and nuclear power present new economic, technological, and environmental challenges when implemented at the needed scales. Economic analyses using the IGSM indicate that global implementation of efficient policies could allow the needed transformations at bearable costs.

  4. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential

    PubMed Central

    Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.

    2014-01-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA’s need to develop novel approaches and tools for rapidly prioritizing chemicals, a “Challenge” was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA’s effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726
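
    A minimal sketch of the comparative ranking analysis, measuring rank agreement between two hypothetical exposure models with a Spearman correlation; the chemical names and scores are placeholders, not data from the Challenge.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical exposure scores for the same five chemicals from two models
# (e.g., a far-field intake model vs a near-field consumer-product model).
chemicals = ["chemA", "chemB", "chemC", "chemD", "chemE"]
model1 = np.array([3.2, 0.4, 1.1, 7.8, 0.9])
model2 = np.array([2.9, 0.2, 2.5, 6.0, 0.7])

rho, p = spearmanr(model1, model2)      # agreement of the two rankings
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

rank1 = model1.argsort()[::-1].argsort() + 1   # 1 = highest exposure
rank2 = model2.argsort()[::-1].argsort() + 1
for c, r1, r2 in zip(chemicals, rank1, rank2):
    print(f"{c}: rank {r1} vs rank {r2}")
```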

  5. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential.

    PubMed

    Mitchell, Jade; Arnot, Jon A; Jolliet, Olivier; Georgopoulos, Panos G; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A; Vallero, Daniel A

    2013-08-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches.

  6. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  7. An experimental paradigm for team decision processes

    NASA Technical Reports Server (NTRS)

    Serfaty, D.; Kleinman, D. L.

    1986-01-01

    The study of distributed information processing and decision making is presently hampered by two factors: (1) The inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) The lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As a part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation and decentralized resource management are described.

  8. A Web-based database for pathology faculty effort reporting.

    PubMed

    Dee, Fred R; Haugen, Thomas H; Wynn, Philip A; Leaven, Timothy C; Kemp, John D; Cohen, Michael B

    2008-04-01

    To ensure appropriate mission-based budgeting and equitable distribution of funds for faculty salaries, our compensation committee developed a pathology-specific effort reporting database. Principles included the following: (1) measurement should be done by web-based databases; (2) most data entry should be done by departmental administration or be relational to other databases; (3) data entry categories should be aligned with funding streams; and (4) units of effort should be equal across categories of effort (service, teaching, research). MySQL was used for all data transactions (http://dev.mysql.com/downloads), and scripts were constructed using PERL (http://www.perl.org). Data are accessed with forms that correspond to fields in the database. The committee's work resulted in a novel database using pathology value units (PVUs) as a standard quantitative measure of effort for activities in an academic pathology department. The most common calculation was to estimate the number of hours required for a specific task, divide by 2080 hours (a Medicare year) and then multiply by 100. Other methods included assigning a baseline PVU for program, laboratory, or course directorship with an increment for each student or staff in that unit. With these methods, a faculty member should acquire approximately 100 PVUs. Some outcomes include (1) plotting PVUs versus salary to identify outliers for salary correction, (2) quantifying effort in activities outside the department, (3) documenting salary expenditure for unfunded research, (4) evaluating salary equity by plotting PVUs versus salary by sex, and (5) aggregating data by category of effort for mission-based budgeting and long-term planning.
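
    The PVU arithmetic described is easy to make concrete; the weights below are illustrative stand-ins, not the committee's actual values.

```python
def task_pvus(hours_per_year: float) -> float:
    """PVUs for a timed task: hours / 2080 (a Medicare year) * 100, so a
    full-time effort totals roughly 100 PVUs."""
    return hours_per_year / 2080.0 * 100.0

def directorship_pvus(base: float, per_head: float, n_heads: int) -> float:
    """Baseline PVUs for a directorship plus an increment per student/staff."""
    return base + per_head * n_heads

# Illustrative profile: 600 h of service sign-out, a 40-student course
# (assumed base 2 PVUs + 0.1 per student), and 400 h of funded research.
total = task_pvus(600) + directorship_pvus(2.0, 0.1, 40) + task_pvus(400)
print(f"{total:.1f} PVUs of a nominal 100")   # ~54.1
```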

  9. Sampling for Air Chemical Emissions from the Life Sciences Laboratory II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballinger, Marcel Y.; Lindberg, Michael J.

    Sampling for air chemical emissions from the Life Sciences Laboratory II (LSL-II) ventilation stack was performed in an effort to determine potential exposure of maintenance staff to laboratory exhaust on the building roof. The concern about worker exposure was raised in December 2015, and several activities were performed to assist in estimating exposure concentrations. Data quality objectives were developed to determine the need for, scope of, and parameters of a sampling campaign to measure chemical emissions from research and development activities to the outside air. The activities provided data on temporal variation of air chemical concentrations and a basis for evaluating calculated emissions. Sampling for air chemical emissions was performed in the LSL-II ventilation stack over the 6-week period from July 26 to September 1, 2016. A total of 12 sampling events were carried out using 16 sample media. The resulting analysis provided concentration data on 49 analytes. All results were below occupational exposure limits, and most results were below detection limits. When compared to calculated emissions, only 5 of the 49 chemicals had measured concentrations greater than predicted. This sampling effort will inform other study components to develop a more complete picture of a worker's potential exposure from LSL-II rooftop activities. Mixing studies were conducted to characterize spatial variation in concentrations at other rooftop locations and can be used in conjunction with these results, which provide temporal variation in concentrations, for estimating the potential exposure of workers working in and around the LSL-II stack.

  10. Development of risk assessment tool for foundry workers.

    PubMed

    Mohan, G Madhan; Prasad, P S S; Mokkapati, Anil Kumar; Venkataraman, G

    2008-01-01

    Occupational ill-health and work-related disorders are predominant in manufacturing industries due to the inevitable presence of manual work, even after several waves of industrial automation and technological advancement. Ergonomic risk factors and musculoskeletal disorders such as low-back symptoms have been noted amongst foundry workers. The purpose of this study was to formulate and develop a Physical Effort Index to assess these risk factors. A questionnaire tool applicable to the foundry environment was designed and validated. The data recorded through surveys across the foundries were subjected to regression analysis to correlate the proposed physical effort index with the standard Borg Ratings of Perceived Exertion (RPE) scale. The physical efforts of sixty-seven workers in various foundry shop floors were assessed subjectively. 'Job factors' and 'work environment' were the two major parameters considered in assessing worker discomfort at the workplace. A relation between the Borg RPE scale and the above two parameters was arrived at through regression analysis. The study demonstrates the prevalence of risk factors amongst foundry workers and the effectiveness of the proposed index in estimating risk factor levels. RELEVANCE TO THE INDUSTRY: The proposed tool will assist foundry supervisors and managers in assessing risk factors and will help in better understanding of the workplace to avoid work-related disorders, ensuring better output.

  11. High Stability Engine Control (HISTEC) Flight Test Results

    NASA Technical Reports Server (NTRS)

    Southwick, Robert D.; Gallops, George W.; Kerr, Laura J.; Kielb, Robert P.; Welsh, Mark G.; DeLaat, John C.; Orme, John S.

    1998-01-01

    The High Stability Engine Control (HISTEC) Program, managed and funded by the NASA Lewis Research Center, is a cooperative effort between NASA and Pratt & Whitney (P&W). The program objective is to develop and flight demonstrate an advanced high stability integrated engine control system that uses real-time, measurement-based estimation of inlet pressure distortion to enhance engine stability. Flight testing was performed using the NASA Advanced Controls Technologies for Integrated Vehicles (ACTIVE) F-15 aircraft at the NASA Dryden Flight Research Center. The flight test configuration, details of the research objectives, and the flight test matrix to achieve those objectives are presented. Flight test results are discussed that show the design approach can accurately estimate distortion and perform real-time control actions for engine accommodation.

  12. The use of auxiliary variables in capture-recapture and removal experiments

    USGS Publications Warehouse

    Pollock, K.H.; Hines, J.E.; Nichols, J.D.

    1984-01-01

    The dependence of animal capture probabilities on auxiliary variables is an important practical problem which has not been considered in the development of estimation procedures for capture-recapture and removal experiments. In this paper the linear logistic binary regression model is used to relate the probability of capture to continuous auxiliary variables. The auxiliary variables could be environmental quantities such as air or water temperature, or characteristics of individual animals, such as body length or weight. Maximum likelihood estimators of the population parameters are considered for a variety of models which all assume a closed population. Testing between models is also considered. The models can also be used when one auxiliary variable is a measure of the effort expended in obtaining the sample.
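
    The linear logistic binary regression model referenced above ties each animal's capture probability to an auxiliary covariate through the logit link. A minimal sketch, with hypothetical coefficients (the paper estimates them by maximum likelihood):

```python
import numpy as np

def capture_probability(x, b0=-2.0, b1=0.08):
    """Logistic capture probability: logit(p) = b0 + b1 * x.

    x is an auxiliary covariate such as water temperature or body length;
    the coefficient values here are hypothetical illustrations.
    """
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(x))))

body_length_mm = np.array([40.0, 80.0, 120.0])
print(capture_probability(body_length_mm))  # capture probability rises with length
```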

  13. Space Life Support Engineering Program

    NASA Technical Reports Server (NTRS)

    Seagrave, Richard C.

    1993-01-01

    This report covers the second year of research relating to the development of closed-loop long-term life support systems. Emphasis was directed toward concentrating on the development of dynamic simulation techniques and software and on performing a thermodynamic systems analysis in an effort to begin optimizing the system needed for water purification. Four appendices are attached. The first covers the ASPEN modeling of the closed loop Environmental Control Life Support System (ECLSS) and its thermodynamic analysis. The second is a report on the dynamic model development for water regulation in humans. The third regards the development of an interactive computer-based model for determining exercise limitations. The fourth attachment is an estimate of the second law thermodynamic efficiency of the various units comprising an ECLSS.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klipstein, David H.; Robinson, Sharon

    The Reaction Engineering Roadmap is part of an industry-wide effort to create a blueprint of the research and technology milestones that are necessary to achieve long-term industry goals. This report documents the results of a workshop focused on the research needs, technology barriers, and priorities of the chemical industry as they relate to reaction engineering, viewed first by industrial use (basic chemicals; specialty chemicals; pharmaceuticals; and polymers) and then by technology segment (reactor system selection, design, and scale-up; chemical mechanism development and property estimation; dealing with catalysis; and new, nonstandard reactor types).

  15. Supporting Data FY 1991 Amended Budget Estimate Submitted to Congress - January 1990: Descriptive Summaries of the Research Development Test and Evaluation Army Appropriation

    DTIC Science & Technology

    1990-01-01

    PERFORMED BY: In-house efforts accomplished by Program Executive Officer for Air Defense Systems, Program Manager-Line of Sight-Forward-Heavy and U.S...evaluation of mechanisms involved in the recovery of heavy metals from waste sludges * (U) Complete determination of basic mechanisms responsible for...tities for characterization * (U) Refined computer model for design of effective heavy metal spin-insensitive EFP warhead liner * (U) Identified

  16. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.

  17. Economic importance of bats in agriculture

    USGS Publications Warehouse

    Boyles, Justin G.; Cryan, Paul M.; McCracken, Gary F.; Kunz, Thomas H.

    2011-01-01

    White-nose syndrome (WNS) and the increased development of wind-power facilities are threatening populations of insectivorous bats in North America. Bats are voracious predators of nocturnal insects, including many crop and forest pests. We present here analyses suggesting that loss of bats in North America could lead to agricultural losses estimated at more than $3.7 billion/year. Urgent efforts are needed to educate the public and policy-makers about the ecological and economic importance of insectivorous bats and to provide practical conservation solutions.

  18. Estimates of the Officer Force Structure Required to Man the Projected Naval Combatant Forces of the 1980s and 1990s.

    DTIC Science & Technology

    1980-10-01

    Element, 64709N Prototype Manpower/Personnel Systems (U), Project Z1302-PN Officer Career Models (U), funded by the Office of the Deputy Assistant... Models for Navy Officer Billets portion of the proposed NPS research effort to develop an integrated officer system planning model; the purpose of this...attempting to model the Naval officer force structure as a system. This study considers the primary first-order factors which drive the requirements

  19. Digital Database Development and Seismic Characterization and Calibration for the Middle East and North Africa.

    DTIC Science & Technology

    1998-02-24

    step, the earthquake data set must be re-checked in order for the change to take effect. Once changed the new symbol stays changed until the session is...standard methods for discriminating between earthquakes and ripple fired explosions to a new geologic setting (northwest Morocco) in an effort to examine the...Tectonophysics, 217: 217-226. Shapira, A., Avni, R. & Nur, A., 1993. Note: A New Estimate For The Epicenter Of The Jericho Earthquake Of 11 July 1927. Israel

  20. SAMSS: An in-progress review of the Spacecraft Assembly, Maintenance, and Servicing Study

    NASA Technical Reports Server (NTRS)

    Burt, William W.

    1987-01-01

    The Spacecraft Assembly, Maintenance, and Servicing Study (SAMSS) is an effort to define and verify the most cost-effective approach to spacecraft servicing, as an alternative to replacement, in the 1990's and beyond. The intent of the study is to assess the servicing of satellites in all orbit regimes. Elements of a space servicing infrastructure are developed and cost estimates are generated. Readiness is assessed and proof-of-concept demonstrations are identified. Cryogenic fuel resupply is discussed.

  1. International Food Assistance: A U.S. Governmentwide Strategy Could Accelerate Progress Toward Global Food Security

    DTIC Science & Technology

    2009-10-29

    Presidential Initiative to End Hunger in Africa ( IEHA )—which represented the U.S. strategy to help fulfill the MDG goal of halving hunger by 2015...was constrained in funding and limited in scope. In 2005, USAID, the primary agency that implemented IEHA , committed to providing an estimated $200...Development Assistance (DA) and other accounts. IEHA was intended to build an African-led partnership to cut hunger and poverty by investing in efforts

  2. Nuclear Explosion Monitoring Advances and Challenges

    NASA Astrophysics Data System (ADS)

    Baker, G. E.

    2015-12-01

    We address the state-of-the-art in areas important to monitoring, current challenges, specific efforts that illustrate approaches addressing shortcomings in capabilities, and additional approaches that might be helpful. The exponential increase in the number of events that must be screened as magnitude thresholds decrease presents one of the greatest challenges. Ongoing efforts to exploit repeat seismic events using waveform correlation, subspace methods, and empirical matched field processing hold as much "game-changing" promise as anything being done, and further efforts to develop and apply such methods efficiently are critical. Greater accuracy of travel-time, signal-loss, and full-waveform predictions is still needed to better locate and discriminate seismic events. Important developments include methods to model velocities using multiple types of data; to model attenuation with better separation of source, path, and site effects; and to model focusing and defocusing of surface waves. Current efforts to model higher-frequency full waveforms are likely to improve source characterization, while more effective estimation of attenuation from ambient noise holds promise for filling in gaps. Censoring in attenuation modeling is a critical problem to address. Quantifying the uncertainty of discriminants is key to their operational use. Efforts to do so for moment tensor (MT) inversion are particularly important, and fundamental progress on the statistics of MT distributions is the most important advance needed in the near term in this area. Source physics is seeing great progress through theoretical, experimental, and simulation studies. The biggest need is to accurately predict the effects of source conditions on seismic generation. Uniqueness is the challenge here. Progress will depend on studies that probe what distinguishes mechanisms, rather than whether one of many possible mechanisms is consistent with some set of observations.
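
    To make the waveform-correlation idea concrete, here is an illustrative sketch (not any specific operational system) of a normalized cross-correlation detector that scans a continuous trace with a template from a known event and flags likely repeat events:

```python
import numpy as np

def correlation_detector(trace, template, threshold=0.8):
    """Flag windows whose normalized cross-correlation with the template
    exceeds the threshold; returns (sample_index, correlation) pairs."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * np.sqrt(n))
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        s = w.std()
        if s == 0:
            continue  # skip flat (dead) windows
        cc = float(np.dot((w - w.mean()) / (s * np.sqrt(n)), t))
        if cc >= threshold:
            hits.append((i, cc))
    return hits

# Synthetic check: bury two scaled copies of a template in noise, recover both.
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * np.arange(200) / 20) * np.exp(-np.arange(200) / 60)
trace = 0.1 * rng.normal(size=5000)
trace[1000:1200] += template
trace[3500:3700] += 0.7 * template
print(correlation_detector(trace, template))  # hits near samples 1000 and 3500
```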

  3. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is, "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  4. Validated Predictions of Metabolic Energy Consumption for Submaximal Effort Movement

    PubMed Central

    Tsianos, George A.; MacFadden, Lisa N.

    2016-01-01

    Physical performance emerges from complex interactions among many physiological systems that are largely driven by the metabolic energy demanded. Quantifying metabolic demand is an essential step for revealing the many mechanisms of physical performance decrement, but accurate predictive models do not exist. The goal of this study was to investigate if a recently developed model of muscle energetics and force could be extended to reproduce the kinematics, kinetics, and metabolic demand of submaximal effort movement. Upright dynamic knee extension against various levels of ergometer load was simulated. Task energetics were estimated by combining the model of muscle contraction with validated models of lower limb musculotendon paths and segment dynamics. A genetic algorithm was used to compute the muscle excitations that reproduced the movement with the lowest energetic cost, which was determined to be an appropriate criterion for this task. Model predictions of oxygen uptake rate (VO2) were well within experimental variability for the range over which the model parameters were confidently known. The model's accurate estimates of metabolic demand make it useful for assessing the likelihood and severity of physical performance decrement for a given task as well as investigating underlying physiologic mechanisms. PMID:27248429
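
    The optimization step described above can be sketched generically: a genetic algorithm searches for muscle excitations in [0, 1] that reproduce a target movement at the lowest cost. Both the simulation and the cost terms below are stand-ins for illustration, not the study's validated musculotendon and energetics models.

```python
import numpy as np

rng = np.random.default_rng(0)

target = np.linspace(0.0, 1.0, 8)  # stand-in target trajectory

def simulate(excitations):
    """Stand-in for the musculoskeletal simulation."""
    return np.cumsum(excitations) / len(excitations)

def cost(excitations):
    tracking_error = np.sum((simulate(excitations) - target) ** 2)
    metabolic = np.sum(excitations ** 2)  # stand-in for metabolic cost
    return tracking_error + 0.1 * metabolic

def genetic_minimize(cost_fn, n_genes=8, pop=50, gens=200, mutation=0.05):
    population = rng.random((pop, n_genes))
    for _ in range(gens):
        fitness = np.array([cost_fn(ind) for ind in population])
        parents = population[np.argsort(fitness)[:pop // 2]]    # selection
        pairs = parents[rng.integers(0, len(parents), (pop // 2, 2))]
        children = pairs.mean(axis=1)                           # crossover
        children += rng.normal(0.0, mutation, children.shape)   # mutation
        population = np.clip(np.vstack([parents, children]), 0.0, 1.0)
    return min(population, key=cost_fn)

best = genetic_minimize(cost)
print(cost(best))
```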

  5. Filling the gap: Using fishers' knowledge to map the extent and intensity of fishing activity.

    PubMed

    Szostek, Claire L; Murray, Lee G; Bell, Ewen; Kaiser, Michel J

    2017-08-01

    Knowledge of the extent and intensity of fishing activities is critical to inform management in relation to fishing impacts on marine conservation features. Such information can also provide insight into the potential socio-economic impacts of closures (or other restrictions) of fishing grounds that could occur through the future designation of Marine Conservation Zones (MCZs). We assessed the accuracy and validity of fishing effort data (spatial extent and relative effort) obtained from Fishers' Local Knowledge (LK) data compared to that derived from Vessel Monitoring System (VMS) data for a high-value shellfish fishery, the king scallop (Pecten maximus L.) dredge fishery in the English Channel. The spatial distribution of fishing effort from LK significantly correlated with VMS data and the correlation increased with increasing grid cell resolution. Using a larger grid cell size for data aggregation increases the estimation of the total area of seabed impacted by the fishery. In the absence of historical VMS data for vessels ≤15 m LOA (Length Overall), LK data for the inshore fleet provided important insights into the relative effort of the inshore (<6 NM from land) king scallop fishing fleet in the English Channel. The LK data provided a good representation of the spatial extent of inshore fishing activity, whereas representation of the offshore fishery was more precautionary in terms of defining total impact. Significantly, the data highlighted frequently fished areas of particular importance to the inshore fleet. In the absence of independent sources of geospatial information, the use of LK can inform the development of marine planning in relation to both sustainable fishing and conservation objectives, and has application in both developed and developing countries where VMS technology is not utilised in fisheries management.
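
    A hypothetical sketch of the scale analysis described above: aggregate two point-level effort layers (fishers' knowledge vs. VMS) onto grids of increasing cell size and correlate the gridded totals. The synthetic data below simply mimic LK as a noisy recollection of VMS-derived effort; all names and values are illustrative.

```python
import numpy as np
from scipy import stats

def grid_effort(x, y, effort, cell_km, extent_km=100.0):
    """Sum effort into square cells of side cell_km over a square study area."""
    nbins = int(extent_km / cell_km)
    grid, _, _ = np.histogram2d(x, y, bins=nbins,
                                range=[[0, extent_km], [0, extent_km]],
                                weights=effort)
    return grid.ravel()

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 500), rng.uniform(0, 100, 500)
vms_effort = rng.gamma(2.0, 1.0, 500)
lk_effort = vms_effort * rng.lognormal(0.0, 0.5, 500)  # noisy LK layer

for cell in (5, 10, 25):  # correlation tends to rise with coarser cells
    r, _ = stats.spearmanr(grid_effort(x, y, vms_effort, cell),
                           grid_effort(x, y, lk_effort, cell))
    print(f"{cell} km cells: Spearman rho = {r:.2f}")
```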

  6. Data Service Provider Cost Estimation Tool

    NASA Technical Reports Server (NTRS)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to estimate the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data service providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for constructing a database describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data service provider. These data create a common basis on which an ESE proposal evaluator can consider projected data service provider costs.

  7. Airborne vs. Inventory Measurements of Methane Emissions in the Alberta Upstream Oil and Gas Sector

    NASA Astrophysics Data System (ADS)

    Johnson, M.; Tyner, D. R.; Conley, S.; Schwietzke, S.; Zavala Araiza, D.

    2017-12-01

    Airborne measurements of methane emission rates were directly compared with detailed, spatially-resolved inventory estimates for different oil and gas production regions in Alberta, Canada. For a 50 km × 50 km region near Red Deer, Alberta, containing 2700 older gas and oil wells, measured methane emissions were 16 times higher than reported venting and flaring volumes would suggest, but consistent with regional inventory estimates (which include estimates for additional emissions from pneumatic equipment, fugitive leaks, gas migration, etc.). This result highlights how 94% of methane emissions in this region are attributable to sources missing from current reporting requirements. The comparison was even more stark for a 60 km × 60 km region near Lloydminster, dominated by 2300 cold heavy oil with sand (CHOPS) production sites. Aircraft measured methane emissions in this region were 5 times larger than that expected from reported venting and flaring volumes, and more than 3 times greater than regional inventory estimates. This significant discrepancy is most likely attributable to underreported intentional venting of casing gas at CHOPS sites, which is generally estimated based on the product of the measured produced oil volume and an assumed gas to oil ratio (GOR). GOR values at CHOPS sites can be difficult to measure and can be notoriously variable in time. Considering the implications for other CHOPS sites across Alberta only, the present results suggest that total reported venting in Alberta is low by a factor of 2.4 (range of 2.0-2.7) and total methane emissions from the conventional oil and gas sector (excluding mined oil sands) are likely at least 25-41% greater than currently estimated. This work reveals critical gaps in current measurement and reporting, while strongly supporting the need for urgent mitigation efforts in the context of newly proposed federal methane regulations in Canada, and separate regulatory development efforts in the province of Alberta.
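
    The reporting arithmetic at issue is simple enough to state in code: vented casing-gas volume at a CHOPS site is taken as measured produced oil times an assumed gas-to-oil ratio (GOR), so the reported total is only as good as the assumed GOR. All numbers below are hypothetical illustrations.

```python
def vented_gas_m3(oil_m3: float, gor: float) -> float:
    """Vented casing gas estimated as produced oil volume times assumed GOR."""
    return oil_m3 * gor

oil_produced_m3 = 1200.0        # hypothetical oil production in the period
for gor in (5.0, 15.0, 45.0):   # GOR at CHOPS sites is notoriously variable
    print(f"assumed GOR {gor:5.1f}: vented ~{vented_gas_m3(oil_produced_m3, gor):7.0f} m3")
```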

  8. Changing perceptions of United States natural-gas resources as shown by successive U. S. Department of the Interior assessments

    USGS Publications Warehouse

    Schmoker, James W.; Dyman, Thaddeus S.

    2001-01-01

    Trends in four successive estimates of United States technically recoverable natural gas resources are examined in this report. The effective dates of these assessments were January 1 of 1975, 1980, 1987, and 1994. The 1994 estimate of the U.S. total gas endowment increased significantly over the previous three estimates, indicating that the technically recoverable endowment of gas is not an absolute volume, but rather is a quantity that can increase through time in response to advances in technology and in geologic understanding. Much of this increase was in the category of reserve growth. Reserve growth refers to additions to the estimated ultimate recovery of fields that typically occur as discovered fields are developed and produced. The potential for U.S. reserve growth, rather than being rapidly used up, appears to be sustainable for many years by intensive engineering efforts coupled with improving technology. Potential additions to reserves in continuous (unconventional) accumulations also represent a type of reserve growth, and were estimated (for the first time) in the 1994 assessment at 358 trillion cubic feet of gas. This resource category provides a significant new contribution to the estimated U.S. total gas endowment.

  9. APOKASC 2.0: Asteroseismology and Spectroscopy for Cool Stars

    NASA Astrophysics Data System (ADS)

    Pinsonneault, Marc H.; Elsworth, Yvonne P.; APOKASC

    2017-01-01

    The APOGEE survey has obtained and analyzed high-resolution H-band spectra of more than 10,000 cool dwarfs and giants in the original Kepler fields. The APOKASC effort combines these data with asteroseismology and star spot studies, resulting in more than 7,000 stellar mass estimates for dwarfs and giants with high-quality abundances, temperatures, and surface gravities. We highlight the main results from this effort so far, which include a tight correlation between surface abundances in giants and stellar mass, precise absolute gravity calibrations, and the discovery of unexpected stellar populations, such as young alpha-enhanced stars. We discuss grid-modeling estimates for stellar masses and compare the absolute asteroseismic mass scale to calibrators in star clusters and the halo. Directions for future efforts are discussed.

  10. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used to determine failure detection are computed based on bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed; it represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by the introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping into the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of the two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method compared to previous techniques.

  11. Integration, Validation, and Application of a PV Snow Coverage Model in SAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine M.; Ryberg, David Severin

    2017-08-01

    Due to the increasing deployment of PV systems in snowy climates, there is significant interest in a method capable of estimating PV losses resulting from snow coverage that has been verified for a variety of system designs and locations. Many independent snow coverage models have been developed over the last 15 years; however, there has been very little effort to verify these models beyond the system designs and locations on which they were based. Moreover, major PV modeling software products have not yet incorporated any of these models into their workflows. In response to this deficiency, we have integrated the methodology of the snow model developed by Marion et al. (2013) into the National Renewable Energy Laboratory's (NREL) System Advisor Model (SAM). In this work, we describe how the snow model is implemented in SAM and we discuss our demonstration of the model's effectiveness at reducing error in annual energy estimates for three PV arrays. Next, we use this new functionality in conjunction with a long-term historical data set to estimate average snow losses across the United States for two typical PV system designs. The open availability of the snow loss estimation capability in SAM to the PV modeling community, coupled with our results of the nationwide study, will better equip the industry to accurately estimate PV energy production in areas affected by snowfall.

  12. Effort-reward imbalance and social support are associated with chronic fatigue among medical residents in Japan.

    PubMed

    Wada, Koji; Sakata, Yumi; Theriault, Gilles; Aratake, Yutaka; Shimizu, Midori; Tsutsumi, Akizumi; Tanaka, Katsutoshi; Aizawa, Yoshiharu

    2008-01-01

    The purpose of this study was to determine the associations of effort-reward imbalance and social support with chronic fatigue among medical residents in Japan. A total of 104 men and 42 women at 14 teaching hospitals participated in this study. Chronic fatigue was measured by the Checklist Individual Strength questionnaire. Effort, reward and overcommitment were determined by the effort-reward questionnaire developed by Siegrist. Social support was determined by a visual analog scale. Sleeping hours for the last 30 days were estimated based on the number of overnight shifts worked, the average number of sleeping hours, and the number of hours of napping during overnight work. Multiple regression analysis was used to examine the multivariate relationship between these variables and chronic fatigue. In both men and women, effort-reward imbalance was positively associated, and higher social support negatively associated, with chronic fatigue. In men, higher overcommitment was positively associated with chronic fatigue. In women, longer sleeping hours were negatively associated with chronic fatigue, and an interaction between sleeping hours and social support was found. The adjusted variance in fatigue explained by the exposure variables was 34% in men and 51% in women. The results of this study suggest that it is desirable to take these factors into consideration in the management of chronic fatigue among medical residents.

  13. Near infrared spectroscopy to estimate the temperature reached on burned soils: strategies to develop robust models.

    NASA Astrophysics Data System (ADS)

    Guerrero, César; Pedrosa, Elisabete T.; Pérez-Bejarano, Andrea; Keizer, Jan Jacob

    2014-05-01

    The temperature reached on soils is an important parameter for describing wildfire effects, yet methods for measuring the temperature reached on burned soils remain poorly developed. Recently, near-infrared (NIR) spectroscopy has been proposed as a valuable tool for this purpose. The NIR spectrum of a soil sample contains information on the organic matter (quantity and quality), clay (quantity and quality), minerals (such as carbonates and iron oxides) and water contents. Some of these components are modified by heat, and each temperature causes a characteristic group of changes, leaving a typical fingerprint on the NIR spectrum. The technique requires a model (or calibration) relating the changes in the NIR spectra to the temperature reached. To develop the model, several aliquots are heated at known temperatures and used as standards in the calibration set. The model then makes it possible to estimate the temperature reached on a burned sample from its NIR spectrum. However, the estimate reflects changes in several components and cannot be attributed to changes in a unique soil component: we estimate the temperature reached through the interaction between temperature and the thermo-sensitive soil components. Moreover, we cannot expect a uniform distribution of these components, even at small scales, so their proportions can vary spatially across a site. This variation will be present both in the samples used to construct the model and in the samples affected by the wildfire; strategies for developing robust models should therefore be designed to manage it. In this work we compared the prediction accuracy of models constructed with different approaches, designed to provide insight into how to distribute the effort needed to develop robust models, since this step is the bottleneck of the technique. In the first approach, a plot-scale model was used to predict the temperature reached in samples collected in other plots from the same site. In a plot-scale model, all the heated aliquots come from a single plot-scale sample. As expected, the results obtained with this approach were disappointing, because it assumes that a plot-scale model is enough to represent the whole variability of the site: the accuracy (measured as the root mean square error of prediction, hereinafter RMSEP) was 86°C, and the bias was also high (>30°C). In the second approach, the temperatures predicted by several plot-scale models were averaged. Accuracy improved (RMSEP = 65°C) relative to the first approach, because variability from several plots was considered and biased predictions were partially counterbalanced; however, this approach requires more effort, since several plot-scale models are needed. In the third approach, predictions were obtained with site-scale models, constructed with aliquots from several plots. In this case the results were accurate: the RMSEP was around 40°C, the bias was very small (<1°C) and the R² was 0.92. This approach clearly outperformed the second, despite requiring the same effort. In a plot-scale model, only one interaction between temperature and soil components is modelled, whereas the calibration matrix of a site-scale model contains several different interactions between temperature and soil components. Consequently, site-scale models were able to model the temperature reached while excluding the influence of differences in soil composition, yielding models more robust to that variation. In summary, the results highlight the importance of an adequate strategy for developing robust and accurate models with moderate effort, and how a wrong strategy can produce deceptive predictions.
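
    The accuracy figures quoted above are root mean square error of prediction (RMSEP) and bias. A small sketch of both metrics, applied to hypothetical predicted vs. actual heating temperatures:

```python
import numpy as np

def rmsep(predicted, observed):
    """Root mean square error of prediction."""
    d = np.asarray(predicted, float) - np.asarray(observed, float)
    return float(np.sqrt(np.mean(d ** 2)))

def bias(predicted, observed):
    """Mean signed prediction error."""
    d = np.asarray(predicted, float) - np.asarray(observed, float)
    return float(np.mean(d))

actual = [25, 100, 200, 300, 400, 500]  # known heating temperatures (deg C)
pred = [60, 120, 230, 290, 380, 540]    # hypothetical NIR predictions
print(f"RMSEP = {rmsep(pred, actual):.1f} deg C, bias = {bias(pred, actual):+.1f} deg C")
```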

  14. [Sampling optimization for tropical invertebrates: an example using dung beetles (Coleoptera: Scarabaeinae) in Venezuela].

    PubMed

    Ferrer-Paris, José Rafael; Sánchez-Mercado, Ada; Rodríguez, Jon Paul

    2013-03-01

    The development of efficient sampling protocols is an essential prerequisite to evaluate and identify priority conservation areas. There are few protocols for fauna inventory and monitoring at wide geographical scales in the tropics, where the complexity of communities and high biodiversity levels make the implementation of efficient protocols more difficult. We propose here a simple strategy to optimize the capture of dung beetles, applied to sampling with baited traps and generalizable to other sampling methods. We analyzed data from eight transects sampled between 2006-2008 with the aim of developing a uniform sampling design that allows species richness, abundance and composition to be confidently estimated at wide geographical scales. We examined four characteristics of any sampling design that affect the effectiveness of the sampling effort: the number of traps, sampling duration, type and proportion of bait, and spatial arrangement of the traps along transects. We used species accumulation curves, rank-abundance plots, indicator species analysis, and multivariate correlograms. We captured 40 337 individuals (115 species/morphospecies of 23 genera). Most species were attracted by both dung and carrion, but two thirds had greater relative abundance in traps baited with human dung. Different aspects of the sampling design influenced each diversity attribute in different ways. To obtain reliable richness estimates, the number of traps was the most important aspect. Accurate abundance estimates were obtained when the sampling period was increased, while the spatial arrangement of traps was determinant for capturing the species composition pattern. An optimal sampling strategy for accurate estimates of richness, abundance and diversity should: (1) set 50-70 traps to maximize the number of species detected, (2) take samples during 48-72 hours and set trap groups along the transect to reliably estimate species abundance, (3) set traps in groups of at least 10 traps to suitably record the local species composition, and (4) separate trap groups by a distance greater than 5-10 km to avoid spatial autocorrelation. For the evaluation of other sampling protocols we recommend first identifying the elements of sampling design that could affect the sampling effort (the number of traps, sampling duration, type and proportion of bait) and their spatial distribution (spatial arrangement of the traps), and then evaluating how they affect richness, abundance and species composition estimates.

  15. A method for estimating abundance of mobile populations using telemetry and counts of unmarked animals

    USGS Publications Warehouse

    Clement, Matthew; O'Keefe, Joy M; Walters, Brianne

    2015-01-01

    While numerous methods exist for estimating abundance when detection is imperfect, these methods may not be appropriate due to logistical difficulties or unrealistic assumptions. In particular, if highly mobile taxa are frequently absent from survey locations, methods that estimate a probability of detection conditional on presence will generate biased abundance estimates. Here, we propose a new estimator for estimating abundance of mobile populations using telemetry and counts of unmarked animals. The estimator assumes that the target population conforms to a fission-fusion grouping pattern, in which the population is divided into groups that frequently change in size and composition. If assumptions are met, it is not necessary to locate all groups in the population to estimate abundance. We derive an estimator, perform a simulation study, conduct a power analysis, and apply the method to field data. The simulation study confirmed that our estimator is asymptotically unbiased with low bias, narrow confidence intervals, and good coverage, given a modest survey effort. The power analysis provided initial guidance on survey effort. When applied to small data sets obtained by radio-tracking Indiana bats, abundance estimates were reasonable, although imprecise. The proposed method has the potential to improve abundance estimates for mobile species that have a fission-fusion social structure, such as Indiana bats, because it does not condition detection on presence at survey locations and because it avoids certain restrictive assumptions.

  16. Acoustic correlate of vocal effort in spasmodic dysphonia.

    PubMed

    Eadie, Tanya L; Stepp, Cara E

    2013-03-01

    This study characterized the relationship between relative fundamental frequency (RFF) and listeners' perceptions of vocal effort and overall spasmodic dysphonia severity in the voices of 19 individuals with adductor spasmodic dysphonia. Twenty inexperienced listeners evaluated the vocal effort and overall severity of voices using visual analog scales. The squared correlation coefficients (R2) between average vocal effort and overall severity and RFF measures were calculated as a function of the number of acoustic instances used for the RFF estimate (from 1 to 9, of a total of 9 voiced-voiceless-voiced instances). Increases in the number of acoustic instances used for the RFF average led to increases in the variance predicted by the RFF at the first cycle of voicing onset (onset RFF) in the perceptual measures; the use of 6 or more instances resulted in a stable estimate. The variance predicted by the onset RFF for vocal effort (R2 range, 0.06 to 0.43) was higher than that for overall severity (R2 range, 0.06 to 0.35). The offset RFF was not related to the perceptual measures, irrespective of the sample size. This study indicates that onset RFF measures are related to perceived vocal effort in patients with adductor spasmodic dysphonia. These results have implications for measuring outcomes in this population.
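
    The stabilization result described above can be mimicked with synthetic data: average an increasing number of onset-RFF instances per speaker and track the squared correlation (R²) with perceptual effort ratings. Everything below is simulated for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_speakers, n_instances = 19, 9
true_rff = rng.normal(0.0, 1.0, n_speakers)
# Each measured instance is the speaker's true RFF plus measurement noise.
instances = true_rff[:, None] + rng.normal(0.0, 1.5, (n_speakers, n_instances))
effort_rating = 50.0 - 10.0 * true_rff + rng.normal(0.0, 3.0, n_speakers)

for k in range(1, n_instances + 1):
    avg_rff = instances[:, :k].mean(axis=1)  # average the first k instances
    r = np.corrcoef(avg_rff, effort_rating)[0, 1]
    print(f"{k} instances averaged: R^2 = {r ** 2:.2f}")  # grows, then stabilizes
```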

  17. A pilot outreach program for small quantity generators of hazardous waste.

    PubMed Central

    Brown, M S; Kelley, B G; Gutensohn, J

    1988-01-01

    The Massachusetts Department of Environmental Management initiated a pilot project to improve compliance with hazardous waste regulations and management of hazardous wastes with auto body shops around the state. The program consisted of mass mailings, a series of workshops throughout the state, a coordinated inspection program by the state regulatory agency, and technology transfer. At the start of the program in January 1986, approximately 650 of the estimated 2,350 auto body shops in the state had notified EPA of their waste generating activities; by January 1987, approximately 1,200 shops had done so. Suggestions for improving program efforts include tailoring the outreach effort to the industry, government-sponsored research and development directed at the needs of small firms, mandatory participation in hazardous waste transportation programs, and better coordination by EPA of its information collection and distribution program. PMID:3421393

  18. Progress and Challenges in Developing Reference Data Layers for Human Population Distribution and Built Infrastructure

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; Yetman, G.; de Sherbinin, A. M.

    2015-12-01

    Understanding the interactions between environmental and human systems, and in particular supporting the applications of Earth science data and knowledge in place-based decision making, requires systematic assessment of the distribution and dynamics of human population and the built human infrastructure in conjunction with environmental variability and change. The NASA Socioeconomic Data and Applications Center (SEDAC) operated by the Center for International Earth Science Information Network (CIESIN) at Columbia University has had a long track record in developing reference data layers for human population and settlements and is expanding its efforts on topics such as intercity roads, reservoirs and dams, and energy infrastructure. SEDAC has set as a strategic priority the acquisition, development, and dissemination of data resources derived from remote sensing and socioeconomic data on urban land use change, including temporally and spatially disaggregated data on urban change and rates of change, the built infrastructure, and critical facilities. We report here on a range of past and ongoing activities, including the Global Human Settlements Layer effort led by the European Commission's Joint Research Centre (JRC), the Global Exposure Database for the Global Earthquake Model (GED4GEM) project, the Global Roads Open Access Data Working Group (gROADS) of the Committee on Data for Science and Technology (CODATA), and recent work with ImageCat, Inc. to improve estimates of the exposure and fragility of buildings, road and rail infrastructure, and other facilities with respect to selected natural hazards. New efforts such as the proposed Global Human Settlement indicators initiative of the Group on Earth Observations (GEO) could help fill critical gaps and link potential reference data layers with user needs. We highlight key sectors and themes that require further attention, and the many significant challenges that remain in developing comprehensive, high quality, up-to-date, and well maintained reference data layers on population and built infrastructure. The need for improved indicators of sustainable development in the context of the post-2015 development framework provides an opportunity to link data efforts directly with international development needs and investments.

  19. Estimating the Effectiveness of Health-Risk Communications with Propensity-Score Matching: Application to Arsenic Groundwater Contamination in Four US Locations

    PubMed Central

    Leidner, Andrew J.

    2014-01-01

    This paper provides a demonstration of propensity-score matching estimation methods to evaluate the effectiveness of health-risk communication efforts. This study develops a two-stage regression model to investigate household and respondent characteristics as they contribute to aversion behavior to reduce exposure to arsenic-contaminated groundwater. The aversion activity under study is a household-level point-of-use filtration device. Since the acquisition of arsenic contamination information and the engagement in an aversion activity may be codetermined, a two-stage propensity-score model is developed. In the first stage, the propensity for households to acquire arsenic contamination information is estimated. Then, the propensity scores are used to weight observations in a probit regression on the decision to avert the arsenic-related health risk. Of four potential sources of information, utility, media, friend, or others, information received from a friend appears to be the source of information most associated with aversion behavior. Other statistically significant covariates in the household's decision to avert contamination include reported household income, the presence of children in household, and region-level indicator variables. These findings are primarily illustrative and demonstrate the usefulness of propensity-score methods to estimate health-risk communication effectiveness. They may also be suggestive of areas for future research. PMID:25349622
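
    A hedged sketch of the two-stage design described above, on synthetic data: stage 1 estimates each household's propensity to acquire contamination information; stage 2 regresses the aversion decision on covariates with propensity-based weights. One common weighting choice (inverse propensity) is shown, and logistic regression is substituted for the paper's probit for simplicity; both substitutions are assumptions, not the paper's exact specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 3))  # stand-ins for income, children, region dummies
p_info = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] + 0.3 * X[:, 1])))
info = rng.binomial(1, p_info)                    # acquired information?
p_avert = 1.0 / (1.0 + np.exp(-(1.0 * info + 0.4 * X[:, 0] - 0.5)))
avert = rng.binomial(1, p_avert)                  # installed a filter?

# Stage 1: propensity score for acquiring information.
ps = LogisticRegression().fit(X, info).predict_proba(X)[:, 1]

# Stage 2: inverse-propensity-weighted regression of aversion on information.
weights = np.where(info == 1, 1.0 / ps, 1.0 / (1.0 - ps))
stage2 = LogisticRegression().fit(np.column_stack([info, X]), avert,
                                  sample_weight=weights)
print("coefficient on information:", stage2.coef_[0][0])
```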

  20. Transient high frequency signal estimation: A model-based processing approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, F.L.

    1985-03-22

    By utilizing the superposition property of linear systems, a method of estimating the incident signal from reflective nondispersive data is developed. One of the basic merits of this approach is that the reflections were removed by direct application of a Wiener-type estimation algorithm after the appropriate input was synthesized. The structure of the nondispersive signal model is well documented, and thus its credence is established. The model is stated, and more effort is devoted to practical methods of estimating the model parameters. Though a general approach was developed for obtaining the reflection weights, a simpler approach was employed here, since a fairly good reflection model is available. The technique essentially consists of calculating ratios of the autocorrelation function at lag zero and at the lag where the incident signal and first reflection coincide. We initially performed our processing procedure on a measurement of a single signal. Multiple applications of the processing procedure were required when we applied the reflection removal technique to a measurement containing information from the interaction of two physical phenomena. All processing was performed using SIG, an interactive signal processing package. One of the many consequences of using SIG was that repetitive operations were, for the most part, automated. A custom menu was designed to perform the deconvolution process.
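
    The autocorrelation-ratio step can be illustrated on a toy nondispersive model x[n] = s[n] + a·s[n−d]: if the incident signal decorrelates by lag d (an assumption), the ratio R(d)/R(0) equals a/(1+a²), which can be inverted for the reflection weight a. This is a simplified reading of the technique for illustration, not the SIG processing chain itself.

```python
import numpy as np

def autocorr(x, lag):
    """Biased sample autocorrelation at the given lag."""
    x = x - x.mean()
    return float(np.dot(x[:len(x) - lag], x[lag:]) / len(x))

rng = np.random.default_rng(4)
s = rng.normal(size=4096) * np.exp(-np.arange(4096) / 300.0)  # transient signal
d, a = 250, 0.6
x = s.copy()
x[d:] += a * s[:-d]  # superimpose one reflection at lag d with weight a

r = autocorr(x, d) / autocorr(x, 0)            # ~ a / (1 + a^2) for white-ish s
a_hat = (1 - np.sqrt(1 - 4 * r ** 2)) / (2 * r)
print(f"true a = {a}, estimated a = {a_hat:.2f}")
```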

  1. Thermal Protection System Mass Estimating Relationships for Blunt-Body, Earth Entry Spacecraft

    NASA Technical Reports Server (NTRS)

    Sepka, Steven A.; Samareh, Jamshid A.

    2015-01-01

    System analysis and design of any entry system must balance the level of fidelity for each discipline against the project timeline. One way to inject high-fidelity analysis earlier in the design effort is to develop surrogate models for the high-fidelity disciplines. Surrogate models for the Thermal Protection System (TPS) are formulated as Mass Estimating Relationships (MERs). TPS MERs are presented that predict the amount of TPS necessary for safe Earth entry for blunt-body spacecraft using simple correlations that closely match estimates from NASA's high-fidelity ablation modeling tool, the Fully Implicit Ablation and Thermal Analysis Program (FIAT). These MERs provide a first-order estimate for rapid feasibility studies. There are 840 different trajectories considered in this study, and each TPS MER has a peak heating limit. MERs for the vehicle forebody include the ablators Phenolic Impregnated Carbon Ablator (PICA) and Carbon Phenolic atop Advanced Carbon-Carbon. For the aftbody, the materials are Silicone Impregnated Reusable Ceramic Ablator (SIRCA), Acusil II, SLA-561V, and LI-900. The MERs are accurate to within 14% (at one standard deviation) of the FIAT prediction, and the most any MER under-predicts FIAT TPS thickness is 18.7%. This work focuses on the development of these MERs, the resulting equations, model limitations, and model accuracy.

  2. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  3. Hydraulic model and flood-inundation maps developed for the Pee Dee National Wildlife Refuge, North Carolina

    USGS Publications Warehouse

    Smith, Douglas G.; Wagner, Chad R.

    2016-04-08

    A series of digital flood-inundation maps were developed on the basis of the water-surface profiles produced by the model. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Program Web site at http://water.usgs.gov/osw/flood_inundation, depict estimates of the areal extent and depth of flooding corresponding to selected water levels at the USGS streamgage Pee Dee River at Pee Dee Refuge near Ansonville, N.C. These maps, when combined with real-time water-level information from USGS streamgages, provide managers with critical information to help plan flood-response activities and resource protection efforts.

  4. Reconciling Basin-Scale Top-Down and Bottom-Up Methane Emission Measurements for Onshore Oil and Gas Development: Cooperative Research and Development Final Report, CRADA Number CRD-14-572

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heath, Garvin A.

    The overall objective of the Research Partnership to Secure Energy for America (RPSEA)-funded research project is to develop independent estimates of methane emissions using top-down and bottom-up measurement approaches and then to compare the estimates, including consideration of uncertainty. Such approaches will be applied at two scales: basin and facility. At facility scale, multiple methods will be used to measure methane emissions of the whole facility (controlled dual tracer and single tracer releases, aircraft-based mass balance and Gaussian back-trajectory), which are considered top-down approaches. The bottom-up approach will sum emissions from identified point sources measured using appropriate source-level measurement techniques (e.g., high-flow meters). At basin scale, the top-down estimate will come from boundary-layer airborne measurements upwind and downwind of the basin, using a regional mass balance model plus approaches to separate atmospheric methane emissions attributed to the oil and gas sector. The bottom-up estimate will result from statistical modeling (also known as scaling up) of measurements made at selected facilities, with gaps filled through measurements and other estimates based on other studies. The relative comparison of the bottom-up and top-down estimates made at both scales will help improve understanding of the accuracy of the tested measurement and modeling approaches. The subject of this CRADA is NREL's contribution to the overall project. This project resulted from winning competitive solicitation no. RPSEA RFP2012UN001, proposal no. 12122-95, which is the basis for the overall project. This Joint Work Statement (JWS) details the contributions of NREL and the Colorado School of Mines (CSM) in performance of the CRADA effort.

  5. A Bayesian approach to multisource forest area estimation

    Treesearch

    Andrew O. Finley

    2007-01-01

    In efforts such as land use change monitoring, carbon budgeting, and forecasting ecological conditions and timber supply, demand is increasing for regional and national data layers depicting forest cover. These data layers must permit small area estimates of forest and, most importantly, provide associated error estimates. This paper presents a model-based approach for...

  6. A Cost Estimation Tool for Charter Schools

    ERIC Educational Resources Information Center

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  7. Estimating effective roughness parameters of the L-MEB model for soil moisture retrieval using passive microwave observations from SMAPVEX12

    USDA-ARS?s Scientific Manuscript database

    Although there have been efforts to improve existing soil moisture retrieval algorithms, the ability to estimate soil moisture from passive microwave observations is still hampered by problems in accurately modeling the observed microwave signal. This paper focuses on the estimation of effective sur...

  8. Using Data Envelopment Analysis to Improve Estimates of Higher Education Institution's Per-Student Education Costs

    ERIC Educational Resources Information Center

    Salerno, Carlo

    2006-01-01

    This paper puts forth a data envelopment analysis (DEA) approach to estimating higher education institutions' per-student education costs (PSCs) in an effort to redress a number of methodological problems endemic to such estimations, particularly the allocation of shared expenditures between education and other institutional activities. An example…

  9. Preliminary estimates of the economic implications of addiction in the United Arab Emirates.

    PubMed

    Doran, C M

    2017-01-23

    This study aimed to provide preliminary estimates of the economic implications of addiction in the United Arab Emirates (UAE). Local and international data sources were used to derive estimates of substance-related healthcare costs, lost productivity and criminal behaviour. From an estimated population of 8.26 million: ~1.47 million used tobacco (20.5% of adults); 380 085 used cannabis (>5%); 14 077 used alcohol in a harmful manner (0.2%); and 1408 used opiates (0.02%). The cost of addiction was estimated at US$ 5.47 billion in 2012, equivalent to 1.4% of gross domestic product. Productivity costs were the largest contributor at US$ 4.79 billion (88%), followed by criminal behaviour at US$ 0.65 billion (12%). There were no data to estimate the costs of treating tobacco-related diseases, community education and prevention efforts, or social disharmony. Current data collection efforts are limited in their capacity to fully inform an appropriate response to addiction in the UAE. Resources are required to improve indicators of drug use, monitor harm and evaluate treatment.

  10. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frothingham, David; Barker, Michelle; Buechi, Steve

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)

  11. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    USGS Publications Warehouse

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and find that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
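
    For readers unfamiliar with the underlying machinery, the sketch below fits a standard occupancy likelihood (imperfect detection only) to simulated detection histories; unlike the paper's model it ignores false positives, so it illustrates only the simpler half of the framework.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Simulate detection histories: y[i] = surveys with a detection at site i
    # out of K repeat surveys, given true occupancy psi and detection prob p.
    rng = np.random.default_rng(1)
    n_sites, K, psi_true, p_true = 200, 5, 0.6, 0.3
    z = rng.random(n_sites) < psi_true                   # true occupancy state
    y = rng.binomial(K, p_true, size=n_sites) * z        # zero at unoccupied sites

    def neg_log_lik(theta):
        psi, p = theta
        det = y[y > 0]                                   # sites with detections
        n_nondet = int(np.sum(y == 0))
        ll = (det.size * np.log(psi)
              + np.sum(det * np.log(p) + (K - det) * np.log(1.0 - p))
              # a non-detection site is either occupied-but-missed or unoccupied
              + n_nondet * np.log(psi * (1.0 - p) ** K + (1.0 - psi)))
        return -ll

    fit = minimize(neg_log_lik, x0=[0.5, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2)
    psi_hat, p_hat = fit.x
    print(f"psi_hat={psi_hat:.2f}, p_hat={p_hat:.2f} (true: {psi_true}, {p_true})")
    ```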

  12. Combined Yamamoto approach for simultaneous estimation of adsorption isotherm and kinetic parameters in ion-exchange chromatography.

    PubMed

    Rüdt, Matthias; Gillet, Florian; Heege, Stefanie; Hitzler, Julian; Kalbfuss, Bernd; Guélat, Bertrand

    2015-09-25

    Application of model-based design is appealing to support the development of protein chromatography in the biopharmaceutical industry. However, the required efforts for parameter estimation are frequently perceived as time-consuming and expensive. In order to speed up this work, a new parameter estimation approach for modelling ion-exchange chromatography under linear conditions was developed. It aims at reducing the time and protein demand for the model calibration. The method combines the estimation of kinetic and thermodynamic parameters based on the simultaneous variation of the gradient slope and the residence time in a set of five linear gradient elutions. The parameters are estimated from a Yamamoto plot and a gradient-adjusted Van Deemter plot. The combined approach increases the information extracted per experiment compared to the individual methods. As a proof of concept, the combined approach was successfully applied for a monoclonal antibody on a cation-exchanger and for a Fc-fusion protein on an anion-exchange resin. The individual parameter estimations for the mAb confirmed that the new approach maintained the accuracy of the usual Yamamoto and Van Deemter plots. In the second case, offline size-exclusion chromatography was performed in order to estimate the thermodynamic parameters of an impurity (high molecular weight species) simultaneously with the main product. Finally, the parameters obtained from the combined approach were used in a lumped kinetic model to simulate the chromatography runs. The simulated chromatograms obtained for a wide range of gradient lengths and residence times showed only small deviations compared to the experimental data. Copyright © 2015 Elsevier B.V. All rights reserved.
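
    A minimal sketch of the Yamamoto-plot step, assuming the usual linear-gradient relation log10(GH) = (nu + 1) log10(I_R) - log10(K_e (nu + 1)) and illustrative data rather than the paper's measurements:

    ```python
    import numpy as np

    # Yamamoto plot: peak elution salt concentration I_R from several linear
    # gradient elutions, plotted against normalized gradient slope GH on
    # log-log axes, falls on a line whose slope gives the characteristic
    # charge nu and whose intercept gives the equilibrium constant K_e.

    GH = np.array([0.05, 0.10, 0.20, 0.40, 0.80])    # normalized gradient slopes [M]
    I_R = np.array([0.22, 0.26, 0.30, 0.35, 0.41])   # salt conc. at peak elution [M]

    slope, intercept = np.polyfit(np.log10(I_R), np.log10(GH), 1)
    nu = slope - 1.0
    K_e = 10.0 ** (-intercept) / (nu + 1.0)
    print(f"characteristic charge nu = {nu:.2f}, equilibrium constant K_e = {K_e:.3g}")
    ```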

  13. Spatio-temporal variation in male white-tailed deer harvest rates in Pennsylvania: Implications for estimating abundance

    USGS Publications Warehouse

    Norton, Andrew S.; Diefenbach, Duane R.; Wallingford, Bret D.; Rosenberry, Christopher S.

    2012-01-01

    The performance of 2 popular methods that use age-at-harvest data to estimate abundance of white-tailed deer is contingent on assumptions about variation in estimates of subadult (1.5 yr old) and adult (≥2.5 yr old) male harvest rates. Auxiliary data (e.g., estimates of survival or harvest rates from radiocollared animals) can be used to relax some assumptions, but unless these population parameters exhibit limited temporal or spatial variation, these auxiliary data may not improve accuracy. Unfortunately, maintaining sufficient sample sizes of radiocollared deer for parameter estimation in every wildlife management unit (WMU) is not feasible for most state agencies. We monitored the fates of 397 subadult and 225 adult male white-tailed deer across 4 WMUs from 2002 to 2008 using radio telemetry. We investigated spatial and temporal variation in harvest rates and investigated covariates related to the patterns observed. We found that most variation in harvest rates was explained spatially and that adult harvest rates (0.36–0.69) were more variable among study areas than subadult harvest rates (0.26–0.42). We found that hunter effort during the archery and firearms season best explained variation in harvest rates of adult males among WMUs, whereas hunter effort during only the firearms season best explained harvest rates for subadult males. From a population estimation perspective, it is advantageous that most variation was spatial and explained by a readily obtained covariate (hunter effort). However, harvest rates may vary if hunting regulations or hunter behavior change, requiring additional field studies to obtain accurate estimates of harvest rates.
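
    The harvest-rate estimates above come from the fates of radiocollared deer; a minimal sketch of that calculation, with hypothetical counts and a simple Wald interval rather than the authors' telemetry analysis:

    ```python
    import numpy as np

    # Harvest rate as a binomial proportion of collared animals recovered by
    # hunters, with a normal-approximation (Wald) 95% confidence interval.
    n_collared = 120   # collared subadult males at risk in one WMU (assumed)
    n_harvested = 38   # of those, number harvested (assumed)

    h = n_harvested / n_collared
    se = np.sqrt(h * (1.0 - h) / n_collared)
    lo, hi = h - 1.96 * se, h + 1.96 * se
    print(f"harvest rate = {h:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```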

  14. A Generalized 'Distance' Estimation Procedure for Intra-Urban Interaction

    DTIC Science & Technology

    The estimation of urban and regional travel patterns has been a necessary part of current efforts to establish land use guidelines for the Texas... The paper details computational experience with travel estimation within Corpus Christi, Texas, using a new convex programming approach of Charnes, Raike and Bettinger. It is found that available estimation techniques necessarily result in non-integer solutions. A mathematical device is therefore...

  15. Electrofishing Effort Required to Estimate Biotic Condition in Southern Idaho Rivers

    EPA Science Inventory

    An important issue surrounding biomonitoring in large rivers is the minimum sampling effort required to collect an adequate number of fish for accurate and precise determinations of biotic condition. During the summer of 2002, we sampled 15 randomly selected large-river sites in...

  16. Student Effort and Performance over the Semester

    ERIC Educational Resources Information Center

    Krohn, Gregory A.; O'Connor, Catherine M.

    2005-01-01

    The authors extend the standard education production function and student time allocation analysis to focus on the interactions between student effort and performance over the semester. The purged instrumental variable technique is used to obtain consistent estimators of the structural parameters of the model using data from intermediate…
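
    The abstract's "purged instrumental variable technique" is a relative of generic two-stage least squares; the sketch below shows the generic version on simulated data (variable names and coefficients are invented for illustration):

    ```python
    import numpy as np

    # Two-stage least squares: effort is endogenous (correlated with unobserved
    # ability), so we first regress it on an exogenous instrument, then use the
    # fitted values in the performance equation.
    rng = np.random.default_rng(0)
    n = 500
    ability = rng.normal(size=n)        # unobserved confounder
    instrument = rng.normal(size=n)     # e.g., an exogenous schedule shock (assumed)
    effort = 0.8 * instrument + 0.5 * ability + rng.normal(size=n)
    score = 1.5 * effort + 1.0 * ability + rng.normal(size=n)

    X1 = np.column_stack([np.ones(n), instrument])
    effort_hat = X1 @ np.linalg.lstsq(X1, effort, rcond=None)[0]   # first stage

    X2 = np.column_stack([np.ones(n), effort_hat])
    beta = np.linalg.lstsq(X2, score, rcond=None)[0]               # second stage
    print(f"2SLS effort effect = {beta[1]:.2f} (true 1.5; naive OLS is biased upward)")
    ```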

  17. Systems Analysis Initiated for All-Electric Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Kohout, Lisa L.

    2003-01-01

    A multidisciplinary effort is underway at the NASA Glenn Research Center to develop concepts for revolutionary, nontraditional fuel cell power and propulsion systems for aircraft applications. There is a growing interest in the use of fuel cells as a power source for electric propulsion as well as an auxiliary power unit to substantially reduce or eliminate environmentally harmful emissions. A systems analysis effort was initiated to assess potential concepts in an effort to identify those configurations with the highest payoff potential. Among the technologies under consideration are advanced proton exchange membrane (PEM) and solid oxide fuel cells, alternative fuels and fuel processing, and fuel storage. Prior to this effort, the majority of fuel cell analysis done at Glenn was done for space applications. Because of this, a new suite of models was developed. These models include the hydrogen-air PEM fuel cell; internal reforming solid oxide fuel cell; balance-of-plant components (compressor, humidifier, separator, and heat exchangers); compressed gas, cryogenic, and liquid fuel storage tanks; and gas turbine/generator models for hybrid system applications. Initial mass, volume, and performance estimates of a variety of PEM systems operating on hydrogen and reformate have been completed for a baseline general aviation aircraft. Solid oxide/turbine hybrid systems are being analyzed. In conjunction with the analysis efforts, a joint effort has been initiated with Glenn's Computer Services Division to integrate fuel cell stack and component models with the visualization environment that supports the GRUVE lab, Glenn's virtual reality facility. The objective of this work is to provide an environment to assist engineers in the integration of fuel cell propulsion systems into aircraft and provide a better understanding of the interaction between system components and the resulting effect on the overall design and performance of the aircraft. Initially, three-dimensional computer-aided design (CAD) models of representative PEM fuel cell stack and components were developed and integrated into the virtual reality environment along with an Excel-based model used to calculate fuel cell electrical performance on the basis of cell dimensions (see the figure). CAD models of a representative general aviation aircraft were also developed and added to the environment. With the use of special headgear, users will be able to virtually manipulate the fuel cell's physical characteristics and its placement within the aircraft while receiving information on the resultant fuel cell output power and performance. As the systems analysis effort progresses, we will add more component models to the GRUVE environment to help us more fully understand the effect of various system configurations on the aircraft.

  18. Collecting costs of community prevention programs: communities putting prevention to work initiative.

    PubMed

    Khavjou, Olga A; Honeycutt, Amanda A; Hoerger, Thomas J; Trogdon, Justin G; Cash, Amanda J

    2014-08-01

    Community-based programs require substantial investments of resources; however, evaluations of these programs usually lack analyses of program costs. Costs of community-based programs reported in previous literature are limited and have been estimated retrospectively. To describe a prospective cost data collection approach developed for the Communities Putting Prevention to Work (CPPW) program capturing costs for community-based tobacco use and obesity prevention strategies. A web-based cost data collection instrument was developed using an activity-based costing approach. Respondents reported quarterly expenditures on labor; consultants; materials, travel, and services; overhead; partner efforts; and in-kind contributions. Costs were allocated across CPPW objectives and strategies organized around five categories: media, access, point of decision/promotion, price, and social support and services. The instrument was developed in 2010, quarterly data collections took place in 2011-2013, and preliminary analysis was conducted in 2013. Preliminary descriptive statistics are presented for the cost data collected from 51 respondents. More than 50% of program costs were for partner organizations, and over 20% of costs were for labor hours. Tobacco communities devoted the majority of their efforts to media strategies. Obesity communities spent more than half of their resources on access strategies. Collecting accurate cost information on health promotion and disease prevention programs presents many challenges. The approach presented in this paper is one of the first efforts successfully collecting these types of data and can be replicated for collecting costs from other programs. Copyright © 2014 American Journal of Preventive Medicine. All rights reserved.

  19. Maturation and fecundity of a stock-enhanced population of striped bass in the Savannah River Estuary, U.S.A.

    USGS Publications Warehouse

    Will, T.A.; Reinert, T.R.; Jennings, C.A.

    2002-01-01

    The striped bass Morone saxatilis population in the Savannah River (south-eastern U.S.A.) collapsed in the 1980s, and recent efforts to restore the population have resulted in increased catch-per-unit-effort (CPUE) of striped bass in the Savannah River Estuary (SRE). The abundance of eggs and larvae, however, remains well below historic levels. The primary cause of the population decline was remedied, and environmental conditions seem suitable for striped bass spawning. Regression analysis of data derived from ultrasonic imaging of 31 striped bass resulted in a statistical model that predicted ovary volume well (r2 = 0.95). The enumeration of oocytes from ovarian tissue samples and the prediction of ovary volume allowed fecundity to be estimated without sacrificing the fish. Oocyte maturation in Savannah River striped bass seemed to progress normally, with oocytes developing to final stages of maturity in larger fish (>750 mm LT). Additionally, fecundity estimates were comparable to those of a neighbouring striped bass population. The environmental cues needed to trigger development and release of striped bass oocytes into the SRE appeared to be present. If most of the striped bass females in the SRE are still young (<7 years), the ability to produce large numbers of eggs will be limited. As these young fish mature, egg production probably will increase and the density of striped bass eggs eventually will approach historic levels, provided suitable habitat and water quality are maintained. © 2002 The Fisheries Society of the British Isles.

  20. Inferences about landbird abundance from count data: recent advances and future directions

    USGS Publications Warehouse

    Nichols, J.D.; Thomas, L.; Conn, P.B.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    We summarize results of a November 2006 workshop dealing with recent research on the estimation of landbird abundance from count data. Our conceptual framework includes a decomposition of the probability of detecting a bird potentially exposed to sampling efforts into four separate probabilities. Primary inference methods are described and include distance sampling, multiple observers, time of detection, and repeated counts. The detection parameters estimated by these different approaches differ, leading to different interpretations of resulting estimates of density and abundance. Simultaneous use of combinations of these different inference approaches can not only lead to increased precision but also provide the ability to decompose components of the detection process. Recent efforts to test the efficacy of these different approaches using natural systems and a new bird radio test system provide sobering conclusions about the ability of observers to detect and localize birds in auditory surveys. Recent research is reported on efforts to deal with such potential sources of error as bird misclassification, measurement error, and density gradients. Methods for inference about spatial and temporal variation in avian abundance are outlined. Discussion topics include opinions about the need to estimate detection probability when drawing inference about avian abundance, methodological recommendations based on the current state of knowledge and suggestions for future research.
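
    A minimal sketch of the count-correction logic behind this framework, with assumed component probabilities standing in for the four detection-process components the workshop decomposed:

    ```python
    # If overall detection probability is a product of conditional components
    # (e.g., presence in the sampled area, availability, detection given
    # availability), abundance is the raw count divided by that product.
    # All values here are illustrative, not from the workshop.

    count = 42                           # birds detected on a survey
    p_components = [0.9, 0.8, 0.7, 0.5]  # assumed detection-process components

    p_total = 1.0
    for p in p_components:
        p_total *= p

    N_hat = count / p_total
    print(f"overall detection probability = {p_total:.3f}, abundance estimate = {N_hat:.0f}")
    ```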

  1. Establishment of a center of excellence for applied mathematical and statistical research

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    The state of the art was assessed with regard to efforts in support of the crop production estimation problem, and alternative generic proportion estimation techniques were investigated. Topics covered include modeling the greenness profile (Badhwar's model), parameter estimation using mixture models such as CLASSY, and minimum distance estimation as an alternative to maximum likelihood estimation. Approaches to the problem of obtaining proportion estimates when the underlying distributions are asymmetric are examined, including the properties of the Weibull distribution.
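
    A minimal sketch of minimum distance estimation for a mixing proportion, assuming two known normal components and a Cramer-von Mises-type distance; this is a generic illustration, not the CLASSY implementation:

    ```python
    import numpy as np
    from scipy import stats, optimize

    # Choose the mixing weight that minimizes the squared distance between the
    # empirical CDF and the two-component mixture CDF, as an alternative to
    # maximum likelihood. Components are assumed known for simplicity.
    rng = np.random.default_rng(5)
    p_true, n = 0.35, 400
    data = np.where(rng.random(n) < p_true,
                    rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n))
    data.sort()
    ecdf = (np.arange(n) + 0.5) / n

    def distance(p):
        mix_cdf = (p * stats.norm.cdf(data, 0.0, 1.0)
                   + (1.0 - p) * stats.norm.cdf(data, 3.0, 1.0))
        return np.mean((mix_cdf - ecdf) ** 2)

    res = optimize.minimize_scalar(distance, bounds=(0.0, 1.0), method="bounded")
    print(f"estimated mixing proportion = {res.x:.2f} (true {p_true})")
    ```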

  2. Integrating High-Resolution Datasets to Target Mitigation Efforts for Improving Air Quality and Public Health in Urban Neighborhoods

    PubMed Central

    Shandas, Vivek; Voelkel, Jackson; Rao, Meenakshi; George, Linda

    2016-01-01

    Reducing exposure to degraded air quality is essential for building healthy cities. Although air quality and population vary at fine spatial scales, current regulatory and public health frameworks assess human exposures at county or city scales. We build on dasymetric mapping, a spatial analysis technique for allocating urban populations, which, together with emerging fine-scale measurements of air pollution, addresses three objectives: (1) evaluate the role of spatial scale in estimating exposure; (2) identify urban communities that are disproportionately burdened by poor air quality; and (3) estimate the reduction in mobile-source pollutants due to local tree-planting efforts, using nitrogen dioxide. Our results show a maximum difference of 197% between the cadastrally-informed dasymetric system (CIDS) and standard estimations of population exposure to degraded air quality for small spatial extent analyses, and a lack of substantial difference for large spatial extent analyses. These results provide the foundation for improving policies for managing air quality, and targeting mitigation efforts to address challenges of environmental justice. PMID:27527205
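
    A minimal sketch of the dasymetric allocation step, with toy numbers: a tract population is redistributed to grid cells in proportion to an ancillary weight such as the cadastral floor area a CIDS would use.

    ```python
    import numpy as np

    # Redistribute a census-tract population to fine grid cells in proportion
    # to ancillary weights (here, assumed residential floor area per cell).
    tract_population = 4000.0
    floor_area = np.array([0.0, 120.0, 800.0, 300.0, 50.0])  # m^2 per grid cell

    weights = floor_area / floor_area.sum()
    cell_population = tract_population * weights
    print(np.round(cell_population, 1))   # cells with no housing receive no people
    ```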

  3. Analysis of low flows and selected methods for estimating low-flow characteristics at partial-record and ungaged stream sites in western Washington

    USGS Publications Warehouse

    Curran, Christopher A.; Eng, Ken; Konrad, Christopher P.

    2012-01-01

    Regional low-flow regression models for estimating Q7,10 at ungaged stream sites are developed from the records of daily discharge at 65 continuous gaging stations (including 22 discontinued gaging stations) for the purpose of evaluating explanatory variables. By incorporating the base-flow recession time constant τ as an explanatory variable in the regression model, the root-mean-square error for estimating Q7,10 at ungaged sites can be lowered to 72 percent (for known values of τ), which is 42 percent less than if only basin area and mean annual precipitation are used as explanatory variables. If partial-record sites are included in the regression data set, τ must be estimated from pairs of discharge measurements made during continuous periods of declining low flows. Eight measurement pairs are optimal for estimating τ at partial-record sites, and result in a lowering of the root-mean-square error by 25 percent. A low-flow survey strategy that includes paired measurements at partial-record sites requires additional effort and planning beyond a standard strategy, but could be used to enhance regional estimates of τ and potentially reduce the error of regional regression models for estimating low-flow characteristics at ungaged sites.
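
    A minimal sketch of how paired low-flow measurements yield the base-flow recession time constant, assuming exponential recession Q(t) = Q0 exp(-t/tau); the measurement pairs are hypothetical:

    ```python
    import numpy as np

    # Under exponential recession, each measurement pair made dt days apart on
    # the same recession gives tau = dt / ln(Q1 / Q2); pairs are then averaged.
    q1 = np.array([12.0, 9.5, 20.0, 7.2])       # first measurement of each pair [cfs]
    q2 = np.array([9.1, 7.6, 15.3, 5.4])        # second measurement [cfs]
    dt_days = np.array([10.0, 8.0, 12.0, 9.0])  # days between measurements

    tau_pairs = dt_days / np.log(q1 / q2)
    print(f"per-pair tau estimates (days): {np.round(tau_pairs, 1)}")
    print(f"mean tau = {tau_pairs.mean():.1f} days")
    ```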

  4. An Investigation Into the Effects of Frequency Response Function Estimators on Model Updating

    NASA Astrophysics Data System (ADS)

    Ratcliffe, M. J.; Lieven, N. A. J.

    1999-03-01

    Model updating is a very active research field, in which significant effort has been invested in recent years. Model updating methodologies are invariably successful when used on noise-free simulated data, but tend to be unpredictable when presented with real experimental data that are unavoidably corrupted with uncorrelated noise content. In the development and validation of model-updating strategies, a random zero-mean Gaussian variable is added to simulated test data to tax the updating routines more fully. This paper proposes a more sophisticated model for experimental measurement noise, and this is used in conjunction with several different frequency response function estimators, from the classical H1 and H2 to more refined estimators that purport to be unbiased. Finite-element model case studies, in conjunction with a genuine experimental test, suggest that the proposed noise model is a more realistic representation of experimental noise phenomena. The choice of estimator is shown to have a significant influence on the viability of the FRF sensitivity method. These test cases find that the use of the H2 estimator for model updating purposes is contraindicated, and that there is no advantage to be gained by using the sophisticated estimators over the classical H1 estimator.
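
    For reference, a minimal sketch of the classical H1 and H2 estimators on simulated input/output data (the resonance and noise levels are invented; only the estimator formulas are standard):

    ```python
    import numpy as np
    from scipy import signal

    # Classical FRF estimators from input x and output y:
    #   H1 = Gxy / Gxx  (least biased by output noise)
    #   H2 = Gyy / Gyx  (least biased by input noise)
    fs = 1024.0
    t = np.arange(0, 30.0, 1.0 / fs)
    rng = np.random.default_rng(2)
    x = rng.normal(size=t.size)                   # broadband excitation

    # A crude "structure": a resonant IIR filter plus measurement noise.
    b, a = signal.iirpeak(w0=100.0, Q=20.0, fs=fs)
    y = signal.lfilter(b, a, x) + 0.05 * rng.normal(size=t.size)

    nperseg = 2048
    f, Gxx = signal.welch(x, fs=fs, nperseg=nperseg)
    _, Gyy = signal.welch(y, fs=fs, nperseg=nperseg)
    _, Gxy = signal.csd(x, y, fs=fs, nperseg=nperseg)
    _, Gyx = signal.csd(y, x, fs=fs, nperseg=nperseg)

    H1 = Gxy / Gxx
    H2 = Gyy / Gyx
    coherence = np.abs(Gxy) ** 2 / (Gxx * Gyy)    # H1 and H2 agree where this is ~1
    ipk = int(np.argmax(np.abs(H1)))
    print(f"peak |H1| at {f[ipk]:.0f} Hz, coherence there = {coherence[ipk]:.3f}")
    ```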

  5. Breeding season survival and breeding incidence of female Mottled Ducks on the upper Texas gulf coast

    USGS Publications Warehouse

    Rigby, Elizabeth A.; Haukos, David A.

    2012-01-01

    Previous Mottled Duck (Anas fulvigula) studies suggested that high female breeding season survival may be caused by low nesting effort, but few breeding season estimates of survival associated with nesting effort exist on the western Gulf Coast. Here, breeding season survival (N = 40) and breeding incidence (N = 39) were estimated for female Mottled Ducks on the upper Texas coast, 2006–2008. Females were fitted with backpack radio transmitters and visually relocated every 3–4 days. Weekly survival was estimated using the Known Fate procedure of program MARK with breeding incidence estimated as the annual proportion of females observed nesting or with broods. The top-ranked survival model included a body mass covariate and held weekly female survival constant across weeks and years (SW = 0.986, SE = 0.006). When compared to survival across the entire year estimated from previous band recovery and age ratio analysis, survival rate during the breeding season did not differ. Breeding incidence was well below 100% in all years and highly variable among years (15%–63%). Breeding season survival and breeding incidence were similar to estimates obtained with implant transmitters from the mid-coast of Texas. The greatest breeding incidence for both studies occurred when drought indices indicated average environmental moisture during the breeding season. The observed combination of low breeding incidence and high breeding season survival support the hypothesis of a trade-off between the ecological cost of nesting effort and survival for Mottled Duck females. Habitat cues that trigger nesting are unknown and should be investigated.

  6. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, command and control, and simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  7. Global inverse modeling of CH4 sources and sinks: an overview of methods

    NASA Astrophysics Data System (ADS)

    Houweling, Sander; Bergamaschi, Peter; Chevallier, Frederic; Heimann, Martin; Kaminski, Thomas; Krol, Maarten; Michalak, Anna M.; Patra, Prabir

    2017-01-01

    The aim of this paper is to present an overview of inverse modeling methods that have been developed over the years for estimating the global sources and sinks of CH4. It provides insight into how techniques and estimates have evolved over time and what the remaining shortcomings are. As such, it serves a didactical purpose of introducing apprentices to the field, but it also takes stock of developments so far and reflects on promising new directions. The main focus is on methodological aspects that are particularly relevant for CH4, such as its atmospheric oxidation, the use of methane isotopologues, and specific challenges in atmospheric transport modeling of CH4. The use of satellite retrievals receives special attention as it is an active field of methodological development, with special requirements on the sampling of the model and the treatment of data uncertainty. Regional scale flux estimation and attribution is still a grand challenge, which calls for new methods capable of combining information from multiple data streams of different measured parameters. A process model representation of sources and sinks in atmospheric transport inversion schemes allows the integrated use of such data. These new developments are needed not only to improve our understanding of the main processes driving the observed global trend but also to support international efforts to reduce greenhouse gas emissions.
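
    A minimal sketch of the Bayesian synthesis inversion at the core of many of the reviewed methods, with toy dimensions and a random stand-in for the transport-model Jacobian:

    ```python
    import numpy as np

    # Bayesian linear flux inversion:
    #   x_hat = x_a + S_a H^T (H S_a H^T + S_o)^-1 (y - H x_a)
    # H maps regional fluxes to observed mole fractions; in practice it comes
    # from an atmospheric transport model, here it is a random placeholder.
    rng = np.random.default_rng(3)
    n_flux, n_obs = 10, 50

    H = rng.random((n_obs, n_flux))                # transport Jacobian (stand-in)
    x_true = rng.uniform(20.0, 60.0, n_flux)       # "true" regional fluxes [Tg/yr]
    y = H @ x_true + rng.normal(0.0, 2.0, n_obs)   # synthetic observations

    x_a = np.full(n_flux, 40.0)                    # prior flux estimate
    S_a = np.diag(np.full(n_flux, 15.0 ** 2))      # prior covariance
    S_o = np.diag(np.full(n_obs, 2.0 ** 2))        # observation-error covariance

    K = S_a @ H.T @ np.linalg.inv(H @ S_a @ H.T + S_o)   # gain matrix
    x_hat = x_a + K @ (y - H @ x_a)
    print(f"mean abs prior error:     {np.mean(np.abs(x_a - x_true)):.1f}")
    print(f"mean abs posterior error: {np.mean(np.abs(x_hat - x_true)):.1f}")
    ```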

  8. Energy Cost Impact of Non-Residential Energy Code Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.

    2016-08-22

    The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology to develop and refine further energy cost impacts, specific to building type, system type, and climate location is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so only the most impactful requirements from an energy cost perspective are verified for every building and a subset of the less impactful requirements are verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.
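
    A minimal sketch of the ranking logic described above, with hypothetical requirements, probabilities, and lost-savings values:

    ```python
    # Each requirement's expected impact is the probability of each
    # non-compliance level times the energy cost savings lost at that level,
    # summed over levels. Names and numbers are invented for illustration.
    requirements = {
        # name: [(probability of level, lost savings $/yr at that level), ...]
        "economizer high-limit": [(0.10, 800.0), (0.05, 2500.0)],
        "lighting power density": [(0.30, 400.0), (0.10, 1200.0)],
        "duct insulation": [(0.20, 150.0), (0.02, 600.0)],
    }

    expected = {name: sum(p * loss for p, loss in levels)
                for name, levels in requirements.items()}

    for name, value in sorted(expected.items(), key=lambda kv: -kv[1]):
        print(f"{name:25s} expected lost savings ${value:,.0f}/yr")
    ```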

  9. Leveraging object-oriented development at Ames

    NASA Technical Reports Server (NTRS)

    Wenneson, Greg; Connell, John

    1994-01-01

    This paper presents lessons learned by the Software Engineering Process Group (SEPG) from results of supporting two projects at NASA Ames using an Object Oriented Rapid Prototyping (OORP) approach supported by a full featured visual development environment. Supplemental lessons learned from a large project in progress and a requirements definition are also incorporated. The paper demonstrates how productivity gains can be made by leveraging the developer with a rich development environment, correct and early requirements definition using rapid prototyping, and earlier and better effort estimation and software sizing through object-oriented methods and metrics. Although the individual elements of OO methods, RP approach and OO metrics had been used on other separate projects, the reported projects were the first integrated usage supported by a rich development environment. Overall the approach used was twice as productive (measured by hours per OO Unit) as a C++ development.

  10. The distribution of biomedical research resources and international justice.

    PubMed

    Resnik, David B

    2004-05-01

    According to some estimates, less than 10% of the world's biomedical research funds are dedicated to addressing problems that are responsible for 90% of the world's burden of disease. This paper explains why this disparity exists and what should be done about it. It argues that the disparity exists because: 1) multinational pharmaceutical and biotechnology companies do not regard research and development investments on the health problems of developing nations to be economically lucrative; and 2) governmental agencies that sponsor biomedical research face little political pressure to allocate funds for the problems of developing nations. This paper argues that developed nations have an obligation to address disparities related to biomedical research funding. To facilitate this effort, developed countries should establish a trust fund dedicated to research on the health problems of developing nations similar to the Global AIDS Fund.

  11. Application of Satellite Data for Early Season Assessment of Fallowed Agricultural Lands for Drought Impact Reporting

    NASA Astrophysics Data System (ADS)

    Rosevelt, C.; Melton, F. S.; Johnson, L.; Verdin, J. P.; Thenkabail, P. S.; Mueller, R.; Zakzeski, A.; Jones, J.

    2013-12-01

    Rapid assessment of drought impacts can aid water managers in assessing mitigation options, and guide decision making with respect to requests for local water transfers, county drought disaster designations, or state emergency proclamations. Satellite remote sensing offers an efficient way to provide quantitative assessments of drought impacts on agricultural production and land fallowing associated with reductions in water supply. A key advantage of satellite-based assessments is that they can provide a measure of land fallowing that is consistent across both space and time. Here we describe an approach for monthly mapping of land fallowing developed as part of a joint effort by USGS, USDA, and NASA to provide timely assessments of land fallowing during drought events. This effort has used the Central Valley of California as a pilot region for development and testing of an operational approach. To provide quantitative measures of fallowed land from satellite data early in the season, we developed a decision tree algorithm and applied it to time series of normalized difference vegetation index (NDVI) data from Landsat TM, ETM+, and MODIS. Our effort has been focused on development of leading indicators of drought impacts in the March-June timeframe based on measures of crop development patterns relative to a reference period with average or above average rainfall. This capability complements ongoing work by USDA to produce and publicly release within-season estimates of fallowed acreage from the USDA Cropland Data Layer. To assess the accuracy of the algorithms, monthly ground validation surveys were conducted along transects across the Central Valley at more than 200 fields per month from March to June 2013. Here we present the algorithm for mapping fallowed acreage early in the season along with results from the accuracy assessment, and discuss potential applications to other regions.
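
    A minimal sketch of the kind of early-season decision rule such an algorithm can apply to a field's NDVI time series; the threshold and values are hypothetical, not the operational USGS/USDA/NASA algorithm:

    ```python
    import numpy as np

    # Flag a field as fallow if its early-season peak NDVI stays well below the
    # peak observed in a reference year with average or above-average rainfall.
    ndvi_reference = np.array([0.25, 0.35, 0.55, 0.70, 0.72])  # wet-year March-June composite
    ndvi_current = np.array([0.22, 0.24, 0.27, 0.28, 0.26])    # drought-year composite

    FALLOW_RATIO = 0.5  # assumed cutoff: current peak < 50% of reference peak

    is_fallow = ndvi_current.max() < FALLOW_RATIO * ndvi_reference.max()
    print(f"peak ratio = {ndvi_current.max() / ndvi_reference.max():.2f}, fallow: {is_fallow}")
    ```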

  12. Utah Cancer Survivors: A Comprehensive Comparison of Health-Related Outcomes Between Survivors and Individuals Without a History of Cancer.

    PubMed

    Fowler, Brynn; Ding, Qian; Pappas, Lisa; Wu, Yelena P; Linder, Lauri; Yancey, Jeff; Wright, Jennifer; Clayton, Margaret; Kepka, Deanna; Kirchhoff, Anne C

    2018-02-01

    Assessments of cancer survivors' health-related needs are often limited to national estimates. State-specific information is vital to inform state comprehensive cancer control efforts developed to support patients and providers. We investigated demographics, health status/quality of life, health behaviors, and health care characteristics of long-term Utah cancer survivors compared to Utahns without a history of cancer. Utah Behavioral Risk Factor Surveillance System (BRFSS) 2009 and 2010 data were used. Individuals diagnosed with cancer within the past 5 years were excluded. Multivariable survey-weighted logistic regressions and computed predictive marginals were used to estimate age-adjusted percentages and 95% confidence intervals (CI). A total of 11,320 eligible individuals (727 cancer survivors, 10,593 controls) were included. Respondents were primarily non-Hispanic White (95.3% of survivors, 84.1% of controls). Survivors were older (85% of survivors ≥40 years of age vs. 47% of controls). Survivors reported the majority of their cancer survivorship care was managed by primary care physicians or non-cancer specialists (93.5%, 95% CI = 87.9-99.1). Furthermore, 71.1% (95% CI = 59.2-82.9) of survivors reported that they did not receive a cancer treatment summary. In multivariable estimates, fair/poor general health was more common among survivors compared to controls (17.8%, 95% CI = 12.5-23.1 vs. 14.2%, 95% CI = 12.4-16.0). Few survivors in Utah receive follow-up care from a cancer specialist. Provider educational efforts are needed to promote knowledge of cancer survivor issues. Efforts should be made to improve continuity in follow-up care that addresses the known issues of long-term survivors that preclude optimal quality of life, resulting in a patient-centered approach to survivorship.

  13. SA45. Amotivation in Schizophrenia, Bipolar Disorder, and Major Depressive Disorder: A Preliminary Comparison Study

    PubMed Central

    Zou, Ying-min; Ni, Ke; Wang, Yang-yu; Yu, En-qing; Lui, Simon S. Y.; Cheung, Eric F. C.; Chan, Raymond C. K.

    2017-01-01

    Background: Deficits in reward processing, such as approaching motivation, reward learning and effort-based decision-making, have been observed in patients with schizophrenia (SCZ), bipolar disorder (BD), and major depressive disorder (MDD). However, little is known about the nature of reward-processing deficits in these 3 diagnostic groups. The present study aimed to compare and contrast amotivation in these 3 diagnostic groups using an effort-based decision-making task. Methods: Sixty patients (19 SCZ patients, 18 BD patients and 23 MDD patients) and 27 healthy controls (HC) were recruited for the present study. The Effort Expenditure for Reward Task (EEfRT) was administered to evaluate their effort allocation pattern. This task required participants to choose easy or hard tasks in response to different levels of reward magnitude and reward probability. Results: Results showed that SCZ, BD, and MDD patients chose fewer hard tasks compared to HC. As reward magnitude increased, MDD patients made the least effort to gain reward compared to the other groups. When reward probability was intermediate, MDD patients chose fewer hard tasks than SCZ patients, whereas BD patients and HC chose more hard tasks than MDD and SCZ patients. When the reward probability was high, all 3 groups of patients tried fewer hard tasks than HC. Moreover, SCZ and MDD patients were less likely to choose hard tasks than BD patients and HC in the intermediate estimated value conditions. However, in the highest estimated value condition, there was no group difference in hard task choices between these 3 clinical groups, and they were all less motivated than HC. Conclusion: SCZ, BD, and MDD patients shared common deficits in gaining reward if the reward probability and estimated value were high. SCZ and MDD patients showed less motivation than BD patients in gaining reward when the reward probability and estimated value were intermediate.

  14. Estimating time-based instantaneous total mortality rate based on the age-structured abundance index

    NASA Astrophysics Data System (ADS)

    Wang, Yingbin; Jiao, Yan

    2015-05-01

    The instantaneous total mortality rate (Z) of a fish population is one of the important parameters in fisheries stock assessment. The estimation of Z is crucial to fish population dynamics analysis, abundance and catch forecast, and fisheries management. A catch curve-based method for estimating time-based Z and its change trend from catch per unit effort (CPUE) data of multiple cohorts is developed. Unlike the traditional catch-curve method, the method developed here does not need the assumption of constant Z throughout the time, but the Z values in n continuous years are assumed constant, and then the Z values in different n continuous years are estimated using the age-based CPUE data within these years. The results of the simulation analyses show that the trends of the estimated time-based Z are consistent with the trends of the true Z, and the estimated rates of change from this approach are close to the true change rates (the relative differences between the change rates of the estimated Z and the true Z are smaller than 10%). Variations of both Z and recruitment can affect the estimates of Z value and the trend of Z. The most appropriate value of n can be different given the effects of different factors. Therefore, the appropriate value of n for different fisheries should be determined through a simulation analysis as we demonstrated in this study. Further analyses suggested that selectivity and age estimation are also two factors that can affect the estimated Z values if there is error in either of them, but the estimated change rates of Z are still close to the true change rates. We also applied this approach to the Atlantic cod (Gadus morhua) fishery of eastern Newfoundland and Labrador from 1983 to 1997, and obtained reasonable estimates of time-based Z.
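
    For contrast with the time-based method, a minimal sketch of the traditional catch-curve estimate it generalizes, using illustrative CPUE-at-age data:

    ```python
    import numpy as np

    # Classical catch curve: with constant Z, ln(CPUE) declines linearly with
    # age on the descending limb, and Z is the negative of the fitted slope.
    age = np.array([3, 4, 5, 6, 7, 8])  # fully recruited ages (illustrative)
    cpue = np.array([120.0, 75.0, 48.0, 30.0, 20.0, 12.5])

    slope, _ = np.polyfit(age, np.log(cpue), 1)
    Z_hat = -slope
    print(f"estimated total mortality Z = {Z_hat:.2f} per year")
    ```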

  15. Modeling the environmental suitability of anthrax in Ghana and estimating populations at risk: Implications for vaccination and control.

    PubMed

    Kracalik, Ian T; Kenu, Ernest; Ayamdooh, Evans Nsoh; Allegye-Cudjoe, Emmanuel; Polkuu, Paul Nokuma; Frimpong, Joseph Asamoah; Nyarko, Kofi Mensah; Bower, William A; Traxler, Rita; Blackburn, Jason K

    2017-10-01

    Anthrax is hyper-endemic in West Africa. Despite the effectiveness of livestock vaccines in controlling anthrax, underreporting, logistics, and limited resources make implementing vaccination campaigns difficult. To better understand the geographic limits of anthrax, elucidate environmental factors related to its occurrence, and identify human and livestock populations at risk, we developed predictive models of the environmental suitability of anthrax in Ghana. We obtained data on the location and date of livestock anthrax from veterinary and outbreak response records in Ghana during 2005-2016, as well as livestock vaccination registers and population estimates of characteristically high-risk groups. To predict the environmental suitability of anthrax, we used an ensemble of random forest (RF) models built using a combination of climatic and environmental factors. From 2005 through the first six months of 2016, there were 67 anthrax outbreaks (851 cases) in livestock; outbreaks showed a seasonal peak during February through April and primarily involved cattle. There was a median of 19,709 vaccine doses [range: 0-175 thousand] administered annually. Results from the RF model suggest a marked ecological divide separating the broad areas of environmental suitability in northern Ghana from the southern part of the country. Increasing alkaline soil pH was associated with a higher probability of anthrax occurrence. We estimated 2.2 (95% CI: 2.0, 2.5) million livestock and 805 (95% CI: 519, 890) thousand low income rural livestock keepers were located in anthrax risk areas. Based on our estimates, the current anthrax vaccination efforts in Ghana cover a fraction of the livestock potentially at risk, thus control efforts should be focused on improving vaccine coverage among high risk groups.
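
    A minimal sketch of an environmental-suitability workflow in the spirit of the paper's random-forest ensemble, using synthetic covariates (the study's actual predictors include climate variables and soil pH):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Fit a random forest to presence/absence labels and map suitability as the
    # predicted occurrence probability. All data here are synthetic.
    rng = np.random.default_rng(4)
    n = 1000
    soil_ph = rng.uniform(4.5, 8.5, n)
    precip = rng.uniform(600.0, 1400.0, n)   # mm/yr, stand-in climate covariate
    ndvi = rng.uniform(0.2, 0.8, n)

    # Synthetic "truth": occurrence more likely where soil is alkaline,
    # mirroring the soil pH association the paper reports.
    p_occur = 1.0 / (1.0 + np.exp(-(soil_ph - 7.0) * 2.0))
    occurred = rng.random(n) < p_occur

    X = np.column_stack([soil_ph, precip, ndvi])
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, occurred)

    suitability = rf.predict_proba(X)[:, 1]   # per-location suitability surface
    print("covariate importances (pH, precip, NDVI):",
          np.round(rf.feature_importances_, 2))
    ```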

  16. Modeling the environmental suitability of anthrax in Ghana and estimating populations at risk: Implications for vaccination and control

    PubMed Central

    Allegye-Cudjoe, Emmanuel; Polkuu, Paul Nokuma; Frimpong, Joseph Asamoah; Nyarko, Kofi Mensah; Bower, William A.; Traxler, Rita

    2017-01-01

    Anthrax is hyper-endemic in West Africa. Despite the effectiveness of livestock vaccines in controlling anthrax, underreporting, logistics, and limited resources make implementing vaccination campaigns difficult. To better understand the geographic limits of anthrax, elucidate environmental factors related to its occurrence, and identify human and livestock populations at risk, we developed predictive models of the environmental suitability of anthrax in Ghana. We obtained data on the location and date of livestock anthrax from veterinary and outbreak response records in Ghana during 2005–2016, as well as livestock vaccination registers and population estimates of characteristically high-risk groups. To predict the environmental suitability of anthrax, we used an ensemble of random forest (RF) models built using a combination of climatic and environmental factors. From 2005 through the first six months of 2016, there were 67 anthrax outbreaks (851 cases) in livestock; outbreaks showed a seasonal peak during February through April and primarily involved cattle. There was a median of 19,709 vaccine doses [range: 0–175 thousand] administered annually. Results from the RF model suggest a marked ecological divide separating the broad areas of environmental suitability in northern Ghana from the southern part of the country. Increasing alkaline soil pH was associated with a higher probability of anthrax occurrence. We estimated 2.2 (95% CI: 2.0, 2.5) million livestock and 805 (95% CI: 519, 890) thousand low income rural livestock keepers were located in anthrax risk areas. Based on our estimates, the current anthrax vaccination efforts in Ghana cover a fraction of the livestock potentially at risk, thus control efforts should be focused on improving vaccine coverage among high risk groups. PMID:29028799

  17. Methodology Development for the Reconstruction of the ESA Huygens Probe Entry and Descent Trajectory

    NASA Astrophysics Data System (ADS)

    Kazeminejad, B.

    2005-01-01

    The European Space Agency's (ESA) Huygens probe performed a successful entry and descent into Titan's atmosphere on January 14, 2005, and landed safely on the satellite's surface. A methodology was developed, implemented, and tested to reconstruct the Huygens probe trajectory from its various science and engineering measurements, which were performed during the probe's entry and descent to the surface of Titan, Saturn's largest moon. The probe trajectory reconstruction is an essential effort that has to be done as early as possible in the post-flight data analysis phase as it guarantees a correct and consistent interpretation of all the experiment data and furthermore provides a reference set of data for "ground-truthing" orbiter remote sensing measurements. The entry trajectory is reconstructed from the measured probe aerodynamic drag force, which also provides a means to derive the upper atmospheric properties like density, pressure, and temperature. The descent phase reconstruction is based upon a combination of various atmospheric measurements such as pressure, temperature, composition, speed of sound, and wind speed. A significant amount of effort was spent to outline and implement a least-squares trajectory estimation algorithm that provides a means to match the entry and descent trajectory portions in case of discontinuity. An extensive test campaign of the algorithm is presented which used the Huygens Synthetic Dataset (HSDS) developed by the Huygens Project Scientist Team at ESA/ESTEC as a test bed. This dataset comprises the simulated sensor output (and the corresponding measurement noise and uncertainty) of all the relevant probe instruments. The test campaign clearly showed that the proposed methodology is capable of utilizing all the relevant probe data, and will provide the best estimate of the probe trajectory once real instrument measurements from the actual probe mission are available. As a further test case using actual flight data the NASA Mars Pathfinder entry and descent trajectory and the spacecraft attitude was reconstructed from the 3-axis accelerometer measurements which are archived on the Planetary Data System. The results are consistent with previously published reconstruction efforts.
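
    A minimal sketch of the entry-phase density retrieval described above, inverting the drag equation rho = 2 m a / (C_D A v^2); the probe parameters and measurements are assumed values, not Huygens flight data:

    ```python
    import numpy as np

    # Along an entry trajectory, measured aerodynamic deceleration gives
    # atmospheric density through the drag equation. All values are assumed.
    m = 320.0    # probe mass [kg] (assumed)
    C_D = 1.5    # drag coefficient (assumed)
    A = 5.7      # reference area [m^2] (assumed)

    a_drag = np.array([0.5, 2.0, 8.0, 12.0])        # measured deceleration [m/s^2]
    v = np.array([6000.0, 5800.0, 5200.0, 4500.0])  # reconstructed speed [m/s]

    rho = 2.0 * m * a_drag / (C_D * A * v ** 2)
    print("retrieved densities [kg/m^3]:", rho)
    ```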

  18. Nuclear thermal propulsion engine system design analysis code development

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.; Ivanenok, Joseph F.

    1992-01-01

    A Nuclear Thermal Propulsion (NTP) Engine System Design Analysis Code has recently been developed to characterize key NTP engine system design features. Such a versatile, standalone NTP system performance and engine design code is required to support ongoing and future engine system and vehicle design efforts associated with proposed Space Exploration Initiative (SEI) missions of interest. Key areas of interest in the engine system modeling effort were the reactor, shielding, and inclusion of an engine multi-redundant propellant pump feed system design option. A solid-core nuclear thermal reactor and internal shielding code model was developed to estimate the reactor's thermal-hydraulic and physical parameters based on a prescribed thermal output which was integrated into a state-of-the-art engine system design model. The reactor code module has the capability to model graphite, composite, or carbide fuels. Key output from the model consists of reactor parameters such as thermal power, pressure drop, thermal profile, and heat generation in cooled structures (reflector, shield, and core supports), as well as the engine system parameters such as weight, dimensions, pressures, temperatures, mass flows, and performance. The model's overall analysis methodology and its key assumptions and capabilities are summarized in this paper.

  19. H-P adaptive methods for finite element analysis of aerothermal loads in high-speed flows

    NASA Technical Reports Server (NTRS)

    Chang, H. J.; Bass, J. M.; Tworzydlo, W.; Oden, J. T.

    1993-01-01

    The commitment to develop the National Aerospace Plane and Maneuvering Reentry Vehicles has generated resurgent interest in the technology required to design structures for hypersonic flight. The principal objective of this research and development effort has been to formulate and implement a new class of computational methodologies for accurately predicting fine scale phenomena associated with this class of problems. The initial focus of this effort was to develop optimal h-refinement and p-enrichment adaptive finite element methods which utilize a posteriori estimates of the local errors to drive the adaptive methodology. Over the past year this work has specifically focused on two issues which are related to overall performance of a flow solver. These issues include the formulation and implementation (in two dimensions) of an implicit/explicit flow solver compatible with the hp-adaptive methodology, and the design and implementation of a computational algorithm for automatically selecting optimal directions in which to enrich the mesh. These concepts and algorithms have been implemented in a two-dimensional finite element code and used to solve three hypersonic flow benchmark problems (Holden Mach 14.1, Edney shock on shock interaction Mach 8.03, and the viscous backstep Mach 4.08).
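
    A minimal sketch of error-indicator-driven marking, the generic step that lets a posteriori error estimates drive h-refinement or p-enrichment (a maximum-based strategy, not necessarily the authors' criterion):

    ```python
    import numpy as np

    # Mark elements whose a posteriori error indicator exceeds a fraction of
    # the largest indicator; marked elements are then refined or enriched.
    eta = np.array([0.02, 0.15, 0.90, 0.40, 0.05, 0.60])  # per-element indicators (assumed)
    THETA = 0.5                                           # marking fraction (assumed)

    marked = np.flatnonzero(eta > THETA * eta.max())
    print("elements marked for refinement:", marked)
    ```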

  20. Gender equality and human rights approaches to female genital mutilation: a review of international human rights norms and standards.

    PubMed

    Khosla, Rajat; Banerjee, Joya; Chou, Doris; Say, Lale; Fried, Susana T

    2017-05-12

    Two hundred million girls and women in the world are estimated to have undergone female genital mutilation (FGM), and another 15 million girls are at risk of experiencing it by 2020 in high prevalence countries (UNICEF. Female genital mutilation/cutting: a global concern. 2016). Despite decades of concerted efforts to eradicate or abandon the practice, and the increased need for clear guidance on the treatment and care of women who have undergone FGM, present efforts have not yet been able to effectively curb the number of women and girls subjected to this practice (UNICEF. Female genital mutilation/cutting: a statistical overview and exploration of the dynamics of change. 2013), nor are they sufficient to respond to the health needs of millions of women and girls living with FGM. International efforts to address FGM have thus far focused primarily on preventing the practice, with less attention to treating associated health complications, caring for survivors, and engaging health care providers as key stakeholders. Recognizing this imperative, WHO developed guidelines on management of health complications of FGM. In this paper, based on foundational research for the development of WHO's guidelines, we situate the practice of FGM as a rights violation in the context of international and national policy and efforts, and explore the role of health providers in upholding health-related human rights of women and girls who are survivors, or who are at risk. Findings are based on a literature review of relevant international human rights treaties and UN Treaty Monitoring Bodies.
