Sample records for estimated time needed

  1. Pricing Secrets Revealed: An Insider's Perspective on How Custom Courses Are Priced.

    ERIC Educational Resources Information Center

    Hartnett, John

    2002-01-01

    Describes one vendor's methods for pricing the development of online learning courses. Highlights include estimating the time needed; estimating the size of the course by counting the number of potential screens; estimating time spent by learners; comparing similar projects; and estimating time needed by each member of the project team. (LRW)

  2. Guillain-Barré Syndrome and Healthcare Needs during Zika Virus Transmission, Puerto Rico, 2016.

    PubMed

    Dirlikov, Emilio; Kniss, Krista; Major, Chelsea; Thomas, Dana; Virgen, Cesar A; Mayshack, Marrielle; Asher, Jason; Mier-Y-Teran-Romero, Luis; Salinas, Jorge L; Pastula, Daniel M; Sharp, Tyler M; Sejvar, James; Johansson, Michael A; Rivera-Garcia, Brenda

    2017-01-01

    To assist with public health preparedness activities, we estimated the number of expected cases of Zika virus in Puerto Rico and associated healthcare needs. Estimated annual incidence is 3.2-5.1 times the baseline, and long-term care needs are predicted to be 3-5 times greater than in years with no Zika virus.

  3. A Cost Estimation Tool for Charter Schools

    ERIC Educational Resources Information Center

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  4. A cellular automata approach to estimate incident-related travel time on Interstate 66 in near real time : final contract report.

    DOT National Transportation Integrated Search

    2010-03-01

    Incidents account for a large portion of all congestion and a need clearly exists for tools to predict and estimate incident effects. This study examined (1) congestion back propagation to estimate the length of the queue and travel time from upstrea...

  5. On the estimation of intracluster correlation for time-to-event outcomes in cluster randomized trials.

    PubMed

    Kalia, Sumeet; Klar, Neil; Donner, Allan

    2016-12-30

    Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred for estimating the ICC when the frequency of administrative censoring is minimal. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.
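
    To make the estimand concrete, the sketch below computes a one-way ANOVA estimate of the ICC on simulated clustered log event times. It deliberately ignores censoring, since handling censored observations is precisely the difficulty the paper studies; the cluster counts, variance components, and log-normal data model are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    k, m = 30, 20                                  # clusters, members per cluster
    cluster_effect = rng.normal(0.0, 0.4, size=k)  # shared log-frailty per cluster
    log_t = cluster_effect[:, None] + rng.normal(0.0, 0.8, size=(k, m))  # log event times

    grand = log_t.mean()
    cluster_means = log_t.mean(axis=1)
    msb = m * np.sum((cluster_means - grand) ** 2) / (k - 1)             # between-cluster MS
    msw = np.sum((log_t - cluster_means[:, None]) ** 2) / (k * (m - 1))  # within-cluster MS
    icc = (msb - msw) / (msb + (m - 1) * msw)                            # ANOVA estimator
    print(f"ICC estimate: {icc:.3f}  (true value 0.4**2 / (0.4**2 + 0.8**2) = 0.2)")
    ```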

  6. AIRSLUG: A fortran program for the computation of type curves to estimate transmissivity and storativity from prematurely terminated air-pressurized slug tests

    USGS Publications Warehouse

    Greene, E.A.; Shapiro, A.M.

    1998-01-01

    The Fortran code AIRSLUG can be used to generate the type curves needed to analyze the recovery data from prematurely terminated air-pressurized slug tests. These type curves, when used with a graphical software package, enable the engineer or scientist to analyze field tests to estimate transmissivity and storativity. Prematurely terminating the slug test can significantly reduce the overall time needed to conduct the test, especially at low-permeability sites, thus saving time and money.

  7. A modified cluster-sampling method for post-disaster rapid assessment of needs.

    PubMed Central

    Malilay, J.; Flanders, W. D.; Brogan, D.

    1996-01-01

    The cluster-sampling method can be used to conduct rapid assessment of health and other needs in communities affected by natural disasters. It is modelled on WHO's Expanded Programme on Immunization method of estimating immunization coverage, but has been modified to provide (1) estimates of the population remaining in an area, and (2) estimates of the number of people in the post-disaster area with specific needs. This approach differs from that used previously in other disasters where rapid needs assessments only estimated the proportion of the population with specific needs. We propose a modified n x k survey design to estimate the remaining population, severity of damage, the proportion and number of people with specific needs, the number of damaged or destroyed and remaining housing units, and the changes in these estimates over a period of time as part of the survey. PMID:8823962

  8. Freeway travel time estimation using existing fixed traffic sensors : phase 2.

    DOT National Transportation Integrated Search

    2015-03-01

    Travel time, one of the most important freeway performance metrics, can be easily estimated using the : data collected from fixed traffic sensors, avoiding the need to install additional travel time data collectors. : This project is aimed at fully u...

  9. Strategic Methodologies in Public Health Cost Analyses.

    PubMed

    Whittington, Melanie; Atherly, Adam; VanRaemdonck, Lisa; Lampe, Sarah

    The National Research Agenda for Public Health Services and Systems Research states the need for research to determine the cost of delivering public health services in order to assist the public health system in communicating financial needs to decision makers, partners, and health reform leaders. The objective of this analysis is to compare 2 cost estimation methodologies, public health manager estimates of employee time spent and activity logs completed by public health workers, to understand to what degree manager surveys could be used in lieu of more time-consuming and burdensome activity logs. Employees recorded their time spent on communicable disease surveillance for a 2-week period using an activity log. Managers then estimated time spent by each employee on a manager survey. Robust and ordinary least squares regression was used to measure the agreement between the time estimated by the manager and the time recorded by the employee. The 2 outcomes for this study included time recorded by the employee on the activity log and time estimated by the manager on the manager survey. This study was conducted in local health departments in Colorado. Forty-one Colorado local health departments (82%) agreed to participate. Seven of the 8 models showed that managers underestimate their employees' time, especially for activities on which an employee spent little time. Manager surveys can best estimate time for time-intensive activities, such as total time spent on a core service or broad public health activity, and yet are less precise when estimating discrete activities. When Public Health Services and Systems Research researchers and health departments are conducting studies to determine the cost of public health services, there are many situations in which managers can closely approximate the time required and produce a relatively precise approximation of cost without as much time investment by practitioners.

  10. Why are they late? Timing abilities and executive control among students with learning disabilities.

    PubMed

    Grinblat, Nufar; Rosenblum, Sara

    2016-12-01

    While a deficient ability to perform daily tasks on time has been reported among students with learning disabilities (LD), the underlying mechanism behind their 'being late' is still unclear. This study aimed to evaluate the organization in time, time estimation abilities, actual performance time pertaining to specific daily activities, as well as the executive functions of students with LD in comparison to those of controls, and to assess the relationships between these domains among each group. The participants were 27 students with LD, aged 20-30, and 32 gender and age-matched controls who completed the Time Organization and Participation Scale (TOPS) and the Behavioral Rating Inventory of Executive Function-Adult version (BRIEF-A). In addition, their ability to estimate the time needed to complete the task of preparing a cup of coffee as well as their actual performance time were evaluated. The results indicated that in comparison to controls, students with LD showed significantly inferior organization in time (TOPS) and executive function abilities (BRIEF-A). Furthermore, their time estimation abilities were significantly inferior and they required significantly more time to prepare a cup of coffee. Regression analysis identified the variables that predicted organization in time and task performance time among each group. The significance of the results for both theoretical and clinical implications is discussed. What this paper adds: This study examines the underlying mechanism of the phenomenon of being late among students with LD. Following a recent call for using ecologically valid assessments, the functional daily ability of students with LD to prepare a cup of coffee and to organize time was investigated. Furthermore, their time estimation and executive control abilities were examined as a possible underlying mechanism for their lateness. Although previous studies have indicated executive control deficits among students with LD, to our knowledge, this is the first analysis of the relationships between their executive control and time estimation deficits and their influence upon their daily function and organization in time abilities. Our findings demonstrate that students with LD need more time in order to execute simple daily activities, such as preparing a cup of coffee. Deficient working memory, retrospective time estimation ability and inhibition predicted their performance time and organization in time abilities. Therefore, this paper sheds light on the mechanism behind daily performance in time among students with LD and emphasizes the need for future development of focused intervention programs to meet their unique needs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. 43 CFR 11.73 - Quantification phase-resource recoverability analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... analysis. (a) Requirement. The time needed for the injured resources to recover to the state that the... been acquired to baseline levels shall be estimated. The time estimated for recovery or any lesser period of time as determined in the Assessment Plan must be used as the recovery period for purposes of...

  12. Estimating pumping time and ground-water withdrawals using energy- consumption data

    USGS Publications Warehouse

    Hurr, R.T.; Litke, D.W.

    1989-01-01

    Evaluation of the hydrology of an aquifer requires knowledge about the volume of groundwater in storage and also about the volume of groundwater withdrawals. Totalizer flow meters may be installed at pumping plants to measure withdrawals; however, it generally is impractical to equip all pumping plants in an area with meters. A viable alternative is the use of rate-time methods. Rate-time methods may be used at individual pumping plants to decrease the data collection necessary for determining withdrawals. At sites where pumping-time measurement devices are not installed, pumping time may be determined on the basis of energy consumption and power demand. At pumping plants where energy consumption is metered, data acquired by reading of meters is used to estimate pumping time. Care needs to be taken to read these meters correctly. At pumping plants powered by electricity, the calculations need to be modified if transformers are present. At pumping plants powered by natural gas, the effects of the pressure-correction factor need to be included in the calculations. At pumping plants powered by gasoline, diesel oil, or liquid petroleum gas, the geometry of storage tanks needs to be analyzed as part of the calculations. The relation between power demand and pumping rate at a pumping plant can be described through the use of the power-consumption coefficient. Where equipment and hydrologic conditions are stable, this coefficient can be applied to total energy consumption at a site to estimate total groundwater withdrawals. Random sampling of power consumption coefficients can be used to estimate area-wide groundwater withdrawal. (USGS)
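
    A minimal sketch of the rate-time arithmetic this abstract describes: pumping time from metered energy use and power demand, and withdrawals from a power-consumption coefficient (volume pumped per unit of energy). The function names, units, and every numeric value are hypothetical, not taken from the USGS report.

    ```python
    def pumping_time_hours(energy_kwh: float, power_demand_kw: float) -> float:
        """Pumping time implied by metered energy use and the plant's power demand."""
        return energy_kwh / power_demand_kw

    def withdrawal_acre_ft(energy_kwh: float, power_consumption_coeff: float) -> float:
        """Withdrawal from a power-consumption coefficient (volume pumped per kWh)."""
        return energy_kwh * power_consumption_coeff

    # Example: 12,000 kWh metered at a plant drawing 40 kW implies 300 h of pumping.
    print(pumping_time_hours(12_000, 40))      # 300.0 hours
    print(withdrawal_acre_ft(12_000, 0.005))   # 60.0 acre-ft (coefficient is invented)
    ```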

  13. Shift level analysis of cable yarder availability, utilization, and productive time

    Treesearch

    James R. Sherar; Chris B. LeDoux

    1989-01-01

    Decision makers, loggers, managers, and planners need to understand and have methods for estimating utilization and productive time of cable logging systems. In making an accurate prediction of how much area and volume a machine will log per unit time and the associated cable yarding costs, a reliable estimate of the availability, utilization, and productive time of...

  14. On Location Estimation Technique Based on the Time of Flight in Low-power Wireless Systems

    NASA Astrophysics Data System (ADS)

    Botta, Miroslav; Simek, Milan; Krajsa, Ondrej; Cervenka, Vladimir; Pal, Tamas

    2015-04-01

    This study deals with the distance estimation problem in the low-power wireless systems commonly used for sensor networking and for interconnecting the Internet of Things. Because there is growing demand to locate or track such sensor entities, the radio-signal time-of-flight principle is evaluated from both the theoretical and the practical side of application research. Since these sensor devices mainly target low-power appliances, every aspect of regular sensor operation needs to be optimized. For the distance estimation we benefit from IEEE 802.15.4a technology, which offers precise ranging capabilities: no additional hardware is needed for the ranging task, and all fundamental measurements are acquired with 15.4a standard-compliant hardware in a real environment. The proposed work examines the problems and solutions encountered in implementing distance estimation algorithms on WSN devices. The main contribution of the article is this real-testbed evaluation of the ranging technology.

  15. Temporal interpolation alters motion in fMRI scans: Magnitudes and consequences for artifact detection.

    PubMed

    Power, Jonathan D; Plitt, Mark; Kundu, Prantik; Bandettini, Peter A; Martin, Alex

    2017-01-01

    Head motion can be estimated at any point of fMRI image processing. Processing steps involving temporal interpolation (e.g., slice time correction or outlier replacement) often precede motion estimation in the literature. From first principles it can be anticipated that temporal interpolation will alter head motion in a scan. Here we demonstrate this effect and its consequences in five large fMRI datasets. Estimated head motion was reduced by 10-50% or more following temporal interpolation, and reductions were often visible to the naked eye. Such reductions make the data seem to be of improved quality. Such reductions also degrade the sensitivity of analyses aimed at detecting motion-related artifact and can cause a dataset with artifact to falsely appear artifact-free. These reduced motion estimates will be particularly problematic for studies needing estimates of motion in time, such as studies of dynamics. Based on these findings, it is sensible to obtain motion estimates prior to any image processing (regardless of subsequent processing steps and the actual timing of motion correction procedures, which need not be changed). We also find that outlier replacement procedures change signals almost entirely during times of motion and therefore have notable similarities to motion-targeting censoring strategies (which withhold or replace signals entirely during times of motion).

  16. Temporal interpolation alters motion in fMRI scans: Magnitudes and consequences for artifact detection

    PubMed Central

    Power, Jonathan D.; Plitt, Mark; Kundu, Prantik; Bandettini, Peter A.; Martin, Alex

    2017-01-01

    Head motion can be estimated at any point of fMRI image processing. Processing steps involving temporal interpolation (e.g., slice time correction or outlier replacement) often precede motion estimation in the literature. From first principles it can be anticipated that temporal interpolation will alter head motion in a scan. Here we demonstrate this effect and its consequences in five large fMRI datasets. Estimated head motion was reduced by 10–50% or more following temporal interpolation, and reductions were often visible to the naked eye. Such reductions make the data seem to be of improved quality. Such reductions also degrade the sensitivity of analyses aimed at detecting motion-related artifact and can cause a dataset with artifact to falsely appear artifact-free. These reduced motion estimates will be particularly problematic for studies needing estimates of motion in time, such as studies of dynamics. Based on these findings, it is sensible to obtain motion estimates prior to any image processing (regardless of subsequent processing steps and the actual timing of motion correction procedures, which need not be changed). We also find that outlier replacement procedures change signals almost entirely during times of motion and therefore have notable similarities to motion-targeting censoring strategies (which withhold or replace signals entirely during times of motion). PMID:28880888

  17. Can real time location system technology (RTLS) provide useful estimates of time use by nursing personnel?

    PubMed

    Jones, Terry L; Schlegel, Cara

    2014-02-01

    Accurate, precise, unbiased, reliable, and cost-effective estimates of nursing time use are needed to ensure safe staffing levels. Direct observation of nurses is costly, and conventional surrogate measures have limitations. To test the potential of electronic capture of time and motion through real time location systems (RTLS), a pilot study was conducted to assess efficacy (method agreement) of RTLS time use; inter-rater reliability of RTLS time-use estimates; and associated costs. Method agreement was high (mean absolute difference = 28 seconds); inter-rater reliability was high (ICC = 0.81-0.95; mean absolute difference = 2 seconds); and costs for obtaining RTLS time-use estimates on a single nursing unit exceeded $25,000. Continued experimentation with RTLS to obtain time-use estimates for nursing staff is warranted. © 2013 Wiley Periodicals, Inc.

  18. Estimating Treatment and Treatment Times for Special and Nonspecial Patients in Hospital Ambulatory Dental Clinics.

    ERIC Educational Resources Information Center

    Rosenberg, Dara J.; And Others

    1986-01-01

    A study compared the treatments and the amount of time needed for treatment of the dental needs of developmentally disabled, severely compromised, and moderately compromised patients with those of nondisabled patients in a hospital ambulatory dental clinic. (MSE)

  19. CO2 storage capacity estimation: Issues and development of standards

    USGS Publications Warehouse

    Bradshaw, J.; Bachu, S.; Bonijoly, D.; Burruss, R.; Holloway, S.; Christensen, N.P.; Mathiassen, O.M.

    2007-01-01

    As geoscientists pursue the promise that geological storage of CO2 holds for making deep cuts in greenhouse gas emissions, governments around the world depend on reliable estimates of CO2 storage capacity and insightful indications of the viability of geological storage in their respective jurisdictions. Similarly, industry needs reliable estimates for business decisions regarding site selection and development. If such estimates are unreliable, and decisions are made based on poor advice, then valuable resources and time could be wasted. Policies that have been put in place to address CO2 emissions could be jeopardised. Estimates need to clearly state the limitations that existed (data, time, knowledge) at the time of making the assessment and indicate the purpose and future use to which the estimates should be applied. A set of guidelines for estimation of storage capacity will greatly assist future deliberations by government and industry on the appropriateness of geological storage of CO2 in different geological settings and political jurisdictions. This work has been initiated under the auspices of the Carbon Sequestration Leadership Forum (www.cslforum.org), and it is intended that it will be an ongoing taskforce to further examine issues associated with storage capacity estimation. Crown Copyright © 2007.

  20. [Simulation model for estimating the cancer care infrastructure required by the public health system].

    PubMed

    Gomes Junior, Saint Clair Santos; Almeida, Rosimary Terezinha

    2009-02-01

    To develop a simulation model using public data to estimate the cancer care infrastructure required by the public health system in the state of São Paulo, Brazil. Public data from the Unified Health System database regarding cancer surgery, chemotherapy, and radiation therapy, from January 2002-January 2004, were used to estimate the number of cancer cases in the state. The percentages recorded for each therapy in the Hospital Cancer Registry of Brazil were combined with the data collected from the database to estimate the need for services. Mixture models were used to identify subgroups of cancer cases with regard to the length of time that chemotherapy and radiation therapy were required. A simulation model was used to estimate the infrastructure required taking these parameters into account. The model indicated the need for surgery in 52.5% of the cases, radiation therapy in 42.7%, and chemotherapy in 48.5%. The mixture models identified two subgroups for radiation therapy and four subgroups for chemotherapy with regard to mean usage time for each. These parameters allowed the following estimated infrastructure needs to be made: 147 operating rooms, 2 653 operating beds, 297 chemotherapy chairs, and 102 radiation therapy devices. These estimates suggest the need for a 1.2-fold increase in the number of chemotherapy services and a 2.4-fold increase in the number of radiation therapy services when compared with the parameters currently used by the public health system. A simulation model, such as the one used in the present study, permits better distribution of health care resources because it is based on specific, local needs.
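
    The core capacity arithmetic behind a simulation model of this kind is: required units = annual demand hours / productive hours per unit per year. A toy version for chemotherapy chairs follows; apart from the 48.5% chemotherapy share quoted in the abstract, every number is an invented placeholder, and the actual study uses mixture models of treatment duration rather than a single mean.

    ```python
    import math

    annual_cases = 60_000             # hypothetical cancer caseload
    chemo_share = 0.485               # share needing chemotherapy (from the abstract)
    sessions_per_case = 12            # hypothetical mean course length
    hours_per_session = 2.0           # hypothetical chair time per session
    chair_hours_per_year = 8 * 250    # one chair: 8 h/day, 250 operating days/year

    demand_hours = annual_cases * chemo_share * sessions_per_case * hours_per_session
    chairs = math.ceil(demand_hours / chair_hours_per_year)
    print(f"{chairs} chemotherapy chairs needed")   # 350 with these placeholder inputs
    ```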

  1. Forest carbon estimation using the Forest Vegetation Simulator: Seven things you need to know

    Treesearch

    Coeli M. Hoover; Stephanie A. Rebain

    2011-01-01

    Interest in options for forest-related greenhouse gas mitigation is growing, and so is the need to assess the carbon implications of forest management actions. Generating estimates of key carbon pools can be time consuming and cumbersome, and exploring the carbon consequences of management alternatives is often a complicated task. In response to this, carbon reporting...

  2. Efficient multidimensional regularization for Volterra series estimation

    NASA Astrophysics Data System (ADS)

    Birpoutsoukis, Georgios; Csurcsia, Péter Zoltán; Schoukens, Johan

    2018-05-01

    This paper presents an efficient nonparametric time-domain nonlinear system identification method. It is shown how truncated Volterra series models can be efficiently estimated without the need for long, transient-free measurements. The method is a novel extension of the regularization methods that have been developed for impulse response estimates of linear time-invariant systems. To avoid the excessive memory needs in the case of long measurements or a large number of estimated parameters, a practical gradient-based estimation method is also provided, leading to the same numerical results as the proposed Volterra estimation method. Moreover, the transient effects in the simulated output are removed by a special regularization method based on the novel ideas of transient removal for Linear Time-Varying (LTV) systems. Combining the proposed methodologies, the nonparametric Volterra models of the cascaded water tanks benchmark are presented in this paper. The results for different scenarios varying from a simple Finite Impulse Response (FIR) model to a 3rd degree Volterra series with and without transient removal are compared and studied. It is clear that the obtained models capture the system dynamics when tested on a validation dataset, and their performance is comparable with the white-box (physical) models.

  3. Estimation of Time-Varying Pilot Model Parameters

    NASA Technical Reports Server (NTRS)

    Zaal, Peter M. T.; Sweet, Barbara T.

    2011-01-01

    Human control behavior is rarely completely stationary over time due to fatigue or loss of attention. In addition, there are many control tasks for which human operators need to adapt their control strategy to vehicle dynamics that vary in time. In previous studies on the identification of time-varying pilot control behavior wavelets were used to estimate the time-varying frequency response functions. However, the estimation of time-varying pilot model parameters was not considered. Estimating these parameters can be a valuable tool for the quantification of different aspects of human time-varying manual control. This paper presents two methods for the estimation of time-varying pilot model parameters, a two-step method using wavelets and a windowed maximum likelihood estimation method. The methods are evaluated using simulations of a closed-loop control task with time-varying pilot equalization and vehicle dynamics. Simulations are performed with and without remnant. Both methods give accurate results when no pilot remnant is present. The wavelet transform is very sensitive to measurement noise, resulting in inaccurate parameter estimates when considerable pilot remnant is present. Maximum likelihood estimation is less sensitive to pilot remnant, but cannot detect fast changes in pilot control behavior.

  4. Method for interconverting drying and heating times between round and square cross sections of ponderosa pine

    Treesearch

    William T. Simpson

    2005-01-01

    To use small-diameter trees effectively as square timbers, we need to be able to estimate the amount of time it takes for these timbers to air-dry. Since an experimentally based method for estimating air-drying time of small-diameter logs has been developed, this study looked at a way to relate that method to square timbers. Drying times were determined for a group of round cross-...

  5. The United States Marine Corps in Cyberspace: Every Marine a Cyber Warrior

    DTIC Science & Technology

    2008-01-01

    information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...THIS COLLECTION OF INFORMATION IS ESTIMATED TO AVERAGE 1 HOUR PER RESPONSE, INCLUDING THE TIME FOR REVIEWING INSTRUCTIONS, SEARCHING EXISTING

  6. Problems in the estimation of human exposure to components of acid precipitation precursors.

    PubMed Central

    Ferris, B G; Spengler, J D

    1985-01-01

    Problems associated with estimation of human exposure to ambient air pollutants are discussed. Ideally, we would prefer to have some indication of actual dose. For most pollutants this is not presently feasible. Specific problems discussed are adequacy of outdoor monitors; the need to correct for exposures and time spent indoors; and the need to have particle size distributions described and the chemistry of the particles presented. These indicate the need to develop lightweight, accurate, and reliable personal monitors. PMID:4076094

  7. Traffic evacuation time under nonhomogeneous conditions.

    PubMed

    Fazio, Joseph; Shetkar, Rohan; Mathew, Tom V

    2017-06-01

    During many manmade and natural crises such as terrorist threats, floods, hazardous chemical and gas leaks, emergency personnel need to estimate the time in which people can evacuate from the affected urban area. Knowing an estimated evacuation time for a given crisis, emergency personnel can plan and prepare accordingly with the understanding that the actual evacuation time will take longer. Given the urban area to be evacuated, street widths exiting the area's perimeter, the area's population density, average vehicle occupancy, transport mode share and crawl speed, an estimation of traffic evacuation time can be derived. Peak-hour traffic data collected at three, midblock, Mumbai sites of varying geometric features and traffic composition were used in calibrating a model that estimates peak-hour traffic flow rates. Model validation revealed a correlation coefficient of +0.98 between observed and predicted peak-hour flow rates. A methodology is developed that estimates traffic evacuation time using the model.
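
    A back-of-envelope version of the input-output relation the abstract lists (population, vehicle occupancy, mode share, exit widths) might look like the sketch below. The linear flow model and all parameter values are assumptions; the paper calibrates its own flow-rate model from Mumbai field data.

    ```python
    # Illustrative clearance-time calculation; every number is hypothetical.
    def evacuation_time_hours(population: float,
                              vehicle_occupancy: float,
                              car_mode_share: float,
                              exit_widths_m: list[float],
                              flow_veh_per_hr_per_m: float) -> float:
        """Time for the vehicle queue to clear the perimeter exits."""
        vehicles = population * car_mode_share / vehicle_occupancy
        capacity = sum(exit_widths_m) * flow_veh_per_hr_per_m  # veh/h across all exits
        return vehicles / capacity

    t = evacuation_time_hours(population=50_000, vehicle_occupancy=2.5,
                              car_mode_share=0.4,
                              exit_widths_m=[7.0, 7.0, 10.5],
                              flow_veh_per_hr_per_m=300.0)
    print(f"estimated clearance time: {t:.1f} h")   # ~1.1 h with these inputs
    ```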

  8. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    USGS Publications Warehouse

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  9. Rand National Security Division Annual Report 2006

    DTIC Science & Technology

    2007-01-01

    collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any...this opportunity, they need to control enough personnel and material resources to secure and supply at least the capital. Marshaling Resources to Meet

  10. Meeting the oral health needs of 12-year-olds in China: human resources for oral health.

    PubMed

    Sun, Xiangyu; Bernabé, Eduardo; Liu, Xuenan; Zheng, Shuguo; Gallagher, Jennifer E

    2017-06-20

    An appropriate level of human resources for oral health [HROH] is required to meet the oral health needs of a population and to enable maximum improvement in health outcomes. The aim of this study was to estimate the required HROH to meet the oral health needs of the World Health Organization [WHO] reference group of 12-year-olds in China and consider the implications for education, practice, policy and HROH nationally. We estimated the need for HROH to meet the needs of 12-year-olds based on secondary analysis of the epidemiological and questionnaire data from the 3rd Chinese National Oral Health Survey, including caries experience and periodontal factors (calculus), dentally-related behaviour (frequency of toothbrushing and sugar intake), and social factors (parental education). Children's risk for dental caries was classified in four levels from low (level 1) to high (level 4). We built maximum and minimum intervention models of dental care for each risk level, informed by contemporary evidence-based practice. The needs-led HROH model we used in the present study incorporated need for treatment and risk-based prevention using timings verified by experts in China. These findings were used to estimate HROH for the survey sample, extrapolated to 12-year-olds nationally and the total population, taking account of urban and rural coverage, based on different levels of clinical commitment (60-90%). We found that between 40,139 and 51,906 dental professionals were required to deliver care for 12-year-olds nationally based on 80% clinical commitment. We demonstrated that the majority of need for HROH was in the rural population (72.5%). Over 93% of HROH time was dedicated to prevention within the model. Extrapolating the results to the total population, the estimate for HROH nationally was 3.16-4.09 million to achieve national coverage; however, current HROH are only able to serve an estimated 5% of the population with minimum intervention based on HROH spending 90% of their time providing clinical care. The findings highlight the gap between dental workforce needs and workforce capacity in China. Significant implications for health policy and human resources for oral health in this country with a developing health system are discussed, including the need for public health action.

  11. [Supplementary device for a dynamometer to evaluate and register muscular endurance indices].

    PubMed

    Timoshenko, D A; Bokser, O Ia

    1986-01-01

    In psychophysiological research practice, the muscular endurance index is used to estimate CNS function. The muscular endurance index is defined as the relative time needed to maintain a preset muscular effort. The described device extends the capabilities of a digital dynamometer to allow automatic estimation and recording of the muscular endurance index in real time.

  12. F-35 Risk during Department of Defense Financial Crisis

    DTIC Science & Technology

    2013-03-01

    information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...program, an approach intended to save time and money by launching construction at an early stage and at the same time the aircraft was put through

  13. A channel estimation scheme for MIMO-OFDM systems

    NASA Astrophysics Data System (ADS)

    He, Chunlong; Tian, Chu; Li, Xingquan; Zhang, Ce; Zhang, Shiqi; Liu, Chaowen

    2017-08-01

    In view of the trade-off between the performance of time-domain least squares (LS) channel estimation and its practical implementation complexity, a reduced-complexity pilot-based channel estimation method for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) is obtained. This approach transforms the MIMO-OFDM channel estimation problem into a simple single input single output-orthogonal frequency division multiplexing (SISO-OFDM) channel estimation problem, so no large matrix pseudo-inverse is required, which greatly reduces the complexity of the algorithm. Simulation results show that the bit error rate (BER) performance of the obtained method with time-orthogonal training sequences and linear minimum mean square error (LMMSE) criteria is better than that of the time-domain LS estimator and nearly optimal.
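
    For reference, the per-subcarrier frequency-domain LS estimate that such schemes reduce to is a simple element-wise division, as in this sketch for one transmit-receive antenna pair. With time-orthogonal pilots, each antenna pair of the MIMO link can be handled this way separately; the pilot design, sizes, and noise level here are assumptions.

    ```python
    import numpy as np

    n_sub = 64                                   # subcarriers
    rng = np.random.default_rng(0)
    h_true = (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)) / np.sqrt(2)
    x_pilot = np.exp(1j * 2 * np.pi * rng.random(n_sub))   # unit-modulus pilots

    noise = 0.05 * (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub))
    y = h_true * x_pilot + noise                 # received pilot symbols

    h_ls = y / x_pilot                           # per-subcarrier LS: no matrix inverse
    print("mean squared error:", np.mean(np.abs(h_ls - h_true) ** 2))
    ```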

  14. On-line 3D motion estimation using low resolution MRI

    NASA Astrophysics Data System (ADS)

    Glitzner, M.; de Senneville, B. Denis; Lagendijk, J. J. W.; Raaymakers, B. W.; Crijns, S. P. M.

    2015-08-01

    Image processing such as deformable image registration finds its way into radiotherapy as a means to track non-rigid anatomy. With the advent of magnetic resonance imaging (MRI) guided radiotherapy, intrafraction anatomy snapshots become technically feasible. MRI provides the needed tissue signal for high-fidelity image registration. However, acquisitions, especially in 3D, take a considerable amount of time. Pushing towards real-time adaptive radiotherapy, MRI needs to be accelerated without degrading the quality of information. In this paper, we investigate the impact of image resolution on the quality of motion estimations. Potentially, spatially undersampled images yield comparable motion estimations. At the same time, their acquisition times would reduce greatly due to the sparser sampling. In order to substantiate this hypothesis, exemplary 4D datasets of the abdomen were downsampled gradually. Subsequently, spatiotemporal deformations are extracted consistently using the same motion estimation for each downsampled dataset. Errors between the original and the respectively downsampled version of the dataset are then evaluated. Compared to ground-truth, results show high similarity of deformations estimated from downsampled image data. Using a dataset with (2.5 mm)³ voxel size, deformation fields could be recovered well up to a downsampling factor of 2, i.e. (5 mm)³. In an MRI therapy guidance scenario, imaging speed could accordingly increase approximately fourfold, with acceptable loss of estimated motion quality.

  15. Estimating adolescent sleep need using dose-response modeling.

    PubMed

    Short, Michelle A; Weber, Nathan; Reynolds, Chelsea; Coussens, Scott; Carskadon, Mary A

    2018-04-01

    This study will (1) estimate the nightly sleep need of human adolescents, (2) determine the time course and severity of sleep-related deficits when sleep is reduced below this optimal quantity, and (3) determine whether sleep restriction perturbs the circadian system as well as the sleep homeostat. Thirty-four adolescents aged 15 to 17 years spent 10 days and nine nights in the sleep laboratory. Between two baseline nights and two recovery nights with 10 hours' time in bed (TIB) per night, participants experienced either severe sleep restriction (5-hour TIB), moderate sleep restriction (7.5-hour TIB), or no sleep restriction (10-hour TIB) for five nights. A 10-minute psychomotor vigilance task (PVT; lapse = response after 500 ms) and the Karolinska Sleepiness Scale were administered every 3 hours during wake. Salivary dim-light melatonin onset was calculated at baseline and after four nights of each sleep dose to estimate circadian phase. Dose-dependent deficits to sleep duration, circadian phase timing, lapses of attention, and subjective sleepiness occurred. Less TIB resulted in less sleep, more lapses of attention, greater subjective sleepiness, and larger circadian phase delays. Sleep need estimated from 10-hour TIB sleep opportunities was approximately 9 hours, while modeling PVT lapse data suggested that 9.35 hours of sleep is needed to maintain optimal sustained attention performance. Sleep restriction perturbs homeostatic and circadian systems, leading to dose-dependent deficits to sustained attention and sleepiness. Adolescents require more sleep for optimal functioning than typically obtained.
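
    A toy version of estimating sleep need from dose-response data: fit a saturating curve to mean PVT lapses versus sleep dose, then report the dose at which an extra hour of sleep yields less than some small marginal improvement. The model form, threshold, and data points below are all invented; the study's actual modeling is more elaborate.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit, brentq

    doses = np.array([5.0, 6.0, 7.0, 7.5, 8.0, 9.0, 10.0])   # hours of sleep (invented)
    lapses = np.array([11.2, 8.0, 5.9, 5.1, 4.4, 3.6, 3.3])  # mean PVT lapses (invented)

    def model(d, a, b, c):
        return a + b * np.exp(-c * d)          # lapses decay toward asymptote a

    (a, b, c), _ = curve_fit(model, doses, lapses, p0=[3.0, 200.0, 0.6])

    # "Need" = dose where one extra hour of sleep removes < 0.25 lapses.
    need = brentq(lambda d: b * c * np.exp(-c * d) - 0.25, 5.0, 14.0)
    print(f"estimated sleep need: {need:.2f} h (asymptote {a:.1f} lapses)")
    ```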

  16. Assessment of Voting Assistance Programs for Calendar Year 2011

    DTIC Science & Technology

    2012-03-30

    is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of...emphasis. We reviewed the Service IG reports and certain supporting data, as needed; met with senior IG representatives from the Army, Navy, Air Force

  17. Estimating aboveground biomass of mariola (Parthenium incanum) from plant dimensions

    Treesearch

    Carlos Villalobos

    2007-01-01

    The distribution and abundance of plant biomass in space and time are important properties of rangeland ecosystem. Land managers and researchers require reliable shrub weight estimates to evaluate site productivity, food abundance, treatment effects, and stocking rates. Rapid, nondestructive methods are needed to estimate shrub biomass in semi-arid ecosystems. Shrub...

  18. A Scalable, Reconfigurable, and Dependable Time-Triggered Architecture

    DTIC Science & Technology

    2003-07-01

    burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching...existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of...information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this

  19. Estimating the Value of Life, Injury, and Travel Time Saved Using a Stated Preference Framework.

    PubMed

    Niroomand, Naghmeh; Jenkins, Glenn P

    2016-06-01

    The incidence of fatality over the period 2010-2014 from automobile accidents in North Cyprus is 2.75 times greater than the average for the EU. With the prospect of North Cyprus entering the EU, many investments will need to be undertaken to improve road safety in order to reach EU benchmarks. The objective of this study is to provide local estimates of the value of a statistical life and injury along with the value of time savings. These are among the parameter values needed for the evaluation of the change in the expected incidence of automotive accidents and time savings brought about by such projects. In this study we conducted a stated choice experiment to identify the preferences and tradeoffs of automobile drivers in North Cyprus for improved travel times, travel costs, and safety. The choice of route was examined using mixed logit models to obtain the marginal utilities associated with each attribute of the routes that consumers choose. These estimates were used to assess the individuals' willingness to pay (WTP) to avoid fatalities and injuries and to save travel time. We then used the results to obtain community-wide estimates of the value of a statistical life (VSL) saved, the value of injury (VI) prevented, and the value per hour of travel time saved. The estimates for the VSL range from €315,293 to €1,117,856 and the estimates of VI from € 5,603 to € 28,186. These values are consistent, after adjusting for differences in incomes, with the median results of similar studies done for EU countries. Copyright © 2016 Elsevier Ltd. All rights reserved.
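
    The translation from stated-choice coefficients to money values is a ratio of marginal utilities: willingness to pay for a unit change in an attribute is its coefficient divided by the cost coefficient. The sketch below illustrates the arithmetic with invented coefficients, not values estimated by the study.

    ```python
    # Hypothetical mixed-logit coefficients (all invented for illustration).
    beta_cost = -0.020   # utility per euro of route cost
    beta_time = -0.012   # utility per minute of travel time
    beta_risk = -0.2     # utility per expected fatality per 100,000 trips

    value_of_time = beta_time / beta_cost          # euro per minute saved
    vsl = (beta_risk / beta_cost) * 100_000        # euro per statistical life

    print(f"value of travel time: {value_of_time:.2f} EUR/min")
    print(f"value of a statistical life: {vsl:,.0f} EUR")   # 1,000,000 EUR here
    ```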

  20. Transmission overhaul estimates for partial and full replacement at repair

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1991-01-01

    Timely transmission overhauls increase in-flight service reliability beyond the calculated design reliabilities of the individual aircraft transmission components. Although necessary for aircraft safety, transmission overhauls contribute significantly to aircraft expense. Predictions of a transmission's maintenance needs at the design stage should enable the development of more cost effective and reliable transmissions in the future. The frequency of overhaul is estimated, along with the number of transmissions or components needed to support the overhaul schedule. Two methods based on the two-parameter Weibull statistical distribution for component life are used to estimate the time between transmission overhauls. These methods predict transmission lives for maintenance schedules that repair the transmission with a complete system replacement or repair only failed components of the transmission. An example illustrates the methods.
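
    A minimal sketch of how a two-parameter Weibull life model yields an overhaul interval: invert the survival function R(t) = exp(-(t/eta)^beta) at a target reliability. The shape, scale, and reliability target below are hypothetical; the report develops two full maintenance-schedule methods rather than this single-component calculation.

    ```python
    import math

    def overhaul_interval(beta: float, eta: float, reliability: float) -> float:
        """Time at which Weibull survival R(t) = exp(-(t/eta)**beta) drops to `reliability`."""
        return eta * (-math.log(reliability)) ** (1.0 / beta)

    # Example: wear-out behavior (beta > 1), characteristic life 5,000 h, target R = 0.95.
    t95 = overhaul_interval(beta=2.0, eta=5000.0, reliability=0.95)
    print(f"overhaul before {t95:.0f} flight hours to keep R >= 0.95")   # ~1132 h
    ```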

  1. Improving the Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility.

  2. 76 FR 17151 - Agency Information Collection Activities: Proposed Collection; Comments Requested

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-28

    ... Communities Together (PACT) 360 Needs Assessment Survey. The Department of Justice (DOJ) Office of Community... (PACT) 360 Needs Assessment Survey. (3) Agency form number, if any, and the applicable component of the... estimated public burden or associated response time, suggestions, or need a copy of the proposed information...

  3. An estimation of dental treatment needs in two groups of refugees in Sweden.

    PubMed

    Zimmerman, M; Bornstein, R; Martinsson, T

    1990-06-01

    The aim of this study was to estimate dental treatment need in groups of Chilean and Polish refugees in Sweden. Of the Nordic countries, Sweden accepts the greatest number of refugees. An average of 5000 refugees arrived annually in 1981-85, increasing to 15,000 during 1986-87. Refugees and their families now comprise 93% of non-Nordic immigration. In 1981-83 a sample of 193 Chilean and 92 Polish refugees in the county of Stockholm was selected for this study. Dental treatment needs were calculated in accordance with CPITN and the working study of Swedish dentistry, which formed the basis for the Swedish scale of dental fees for the National Dental Insurance Scheme. The estimated mean treatment time (+/- SD) in the Chilean sample was 6.9 +/- 2.3 h and in the Polish group 8.4 +/- 3.0; in comparison with estimated treatment needs in a Swedish reference sample, both groups would be classified as extreme risk groups. There was no correlation between the number of months in Sweden and the estimated treatment needs. The results indicate a cumulative, unmet need for dental care in these groups. Barriers to ensuring adequate health care for immigrants persist; special outreach programmes, conducted by dental health personnel, may be an effective means of introducing immigrants to the Swedish dental care system.

  4. How many physicians will be needed to provide medical care for older persons? Physician manpower needs for the twenty-first century.

    PubMed

    Reuben, D B; Zwanziger, J; Bradley, T B; Fink, A; Hirsch, S H; Williams, A P; Solomon, D H; Beck, J C

    1993-04-01

    To estimate the number of full-time-equivalent (FTE) physicians and geriatricians needed to provide medical care in the years 2000 to 2030, we developed utilization-based models of need for non-surgical physicians and need for geriatricians. Based on projected utilization, the number of FTE physicians required to care for the elderly will increase two- or threefold over the next 40 years. Alternate economic scenarios have very little effect on estimates of FTE physicians needed but exert large effects on the projected number of FTE geriatricians needed. We conclude that during the years 2000 to 2030, population growth will be the major factor determining the number of physicians needed to provide medical care; economic forces will have a greater influence on the number of geriatricians needed.

  5. Tactical decision making for selective expansion of operating room resources incorporating financial criteria and uncertainty in subspecialties' future workloads.

    PubMed

    Dexter, Franklin; Ledolter, Johannes; Wachtel, Ruth E

    2005-05-01

    We considered the allocation of operating room (OR) time at facilities where the strategic decision had been made to increase the number of ORs. Allocation occurs in two stages: a long-term tactical stage followed by short-term operational stage. Tactical decisions, approximately 1 yr in advance, determine what specialized equipment and expertise will be needed. Tactical decisions are based on estimates of future OR workload for each subspecialty or surgeon. We show that groups of surgeons can be excluded from consideration at this tactical stage (e.g., surgeons who need intensive care beds or those with below average contribution margins per OR hour). Lower and upper limits are estimated for the future demand of OR time by the remaining surgeons. Thus, initial OR allocations can be accomplished with only partial information on future OR workload. Once the new ORs open, operational decision-making based on OR efficiency is used to fill the OR time and adjust staffing. Surgeons who were not allocated additional time at the tactical stage are provided increased OR time through operational adjustments based on their actual workload. In a case study from a tertiary hospital, future demand estimates were needed for only 15% of surgeons, illustrating the practicality of these methods for use in tactical OR allocation decisions.

  6. Discrete-time neural network for fast solving large linear L1 estimation problems and its application to image restoration.

    PubMed

    Xia, Youshen; Sun, Changyin; Zheng, Wei Xing

    2012-05-01

    There is growing interest in solving linear L1 estimation problems for sparsity of the solution and robustness against non-Gaussian noise. This paper proposes a discrete-time neural network that can solve large linear L1 estimation problems quickly. The proposed neural network has a fixed computational step length and is proved to be globally convergent to an optimal solution. Then, the proposed neural network is efficiently applied to image restoration. Numerical results show that the proposed neural network is not only efficient in solving degenerate problems resulting from the nonunique solutions of the linear L1 estimation problems but also needs much less computational time than the related algorithms in solving both linear L1 estimation and image restoration problems.
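
    As a point of reference for the problem being solved (not the paper's neural network), a standard iteratively reweighted least-squares baseline for min_x ||Ax - b||_1 looks like this; the damping constant and test data are assumptions.

    ```python
    import numpy as np

    def l1_regression(A: np.ndarray, b: np.ndarray,
                      iters: int = 50, eps: float = 1e-8) -> np.ndarray:
        """IRLS for min_x ||Ax - b||_1: solve a reweighted L2 problem per iteration."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]        # ordinary L2 start
        for _ in range(iters):
            r = A @ x - b
            w = 1.0 / np.maximum(np.abs(r), eps)        # downweight large residuals
            Aw = A * w[:, None]                         # rows scaled by weights
            x = np.linalg.solve(A.T @ Aw, Aw.T @ b)     # weighted normal equations
        return x

    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 5))
    x_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    b = A @ x_true + rng.laplace(scale=0.3, size=200)   # heavy-tailed noise
    print(np.round(l1_regression(A, b), 2))             # close to x_true
    ```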

  7. Development of regional stump-to-mill logging cost estimators

    Treesearch

    Chris B. LeDoux; John E. Baumgras

    1989-01-01

    Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...

  8. Parameter Estimates in Differential Equation Models for Chemical Kinetics

    ERIC Educational Resources Information Center

    Winkel, Brian

    2011-01-01

    We discuss the need for devoting time in differential equations courses to modelling and the completion of the modelling process with efforts to estimate the parameters in the models using data. We estimate the parameters present in several differential equation models of chemical reactions of order n, where n = 0, 1, 2, and apply more general…

  9. Pathfinder. Volume 9, Number 2, March/April 2011

    DTIC Science & Technology

    2011-03-01

    vides audio, video, desktop sharing and chat.” The platform offers a real-time, Web-based presentation tool to create information and general...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or

  10. High Resolution Bathymetry Estimation Improvement with Single Image Super-Resolution Algorithm Super-Resolution Forests

    DTIC Science & Technology

    2017-01-26

    Naval Research Laboratory Washington, DC 20375-5320 NRL/MR/5514--17-9692 High Resolution Bathymetry Estimation Improvement with Single Image Super...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate

  11. Reconstructing the hidden states in time course data of stochastic models.

    PubMed

    Zimmer, Christoph

    2015-11-01

    Parameter estimation is central for analyzing models in Systems Biology. The relevance of stochastic modeling in the field is increasing. Therefore, the need for tailored parameter estimation techniques is increasing as well. Challenges for parameter estimation are partial observability, measurement noise, and the computational complexity arising from the dimension of the parameter space. This article extends the multiple shooting for stochastic systems' method, developed for inference in intrinsic stochastic systems. The treatment of extrinsic noise and the estimation of the unobserved states is improved, by taking into account the correlation between unobserved and observed species. This article demonstrates the power of the method on different scenarios of a Lotka-Volterra model, including cases in which the prey population dies out or explodes, and a Calcium oscillation system. Besides showing how the new extension improves the accuracy of the parameter estimates, this article analyzes the accuracy of the state estimates. In contrast to previous approaches, the new approach is well able to estimate states and parameters for all the scenarios. As it does not need stochastic simulations, it is of the same order of speed as conventional least squares parameter estimation methods with respect to computational time. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
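
    To anchor the estimation problem, here is a plain deterministic least-squares fit of Lotka-Volterra parameters from noisy observations. The paper's multiple-shooting method for stochastic systems with hidden states is considerably more involved; initial conditions, parameter values, and noise level here are invented.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def lotka_volterra(t, z, a, b, c, d):
        prey, pred = z
        return [a * prey - b * prey * pred, c * prey * pred - d * pred]

    t_obs = np.linspace(0, 15, 60)
    true_p = (1.0, 0.1, 0.02, 0.5)
    sol = solve_ivp(lotka_volterra, (0, 15), [40, 9], t_eval=t_obs, args=true_p)
    obs = sol.y + np.random.default_rng(2).normal(scale=0.5, size=sol.y.shape)

    def residuals(p):
        fit = solve_ivp(lotka_volterra, (0, 15), [40, 9], t_eval=t_obs, args=tuple(p))
        return (fit.y - obs).ravel()

    est = least_squares(residuals, x0=[0.8, 0.08, 0.03, 0.4], bounds=(0, 5))
    print(np.round(est.x, 3))   # should recover roughly (1.0, 0.1, 0.02, 0.5)
    ```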

  12. Antiretroviral therapy needs: the effect of changing global guidelines.

    PubMed

    Stanecki, Karen; Daher, Juliana; Stover, John; Beusenberg, Michel; Souteyrand, Yves; García Calleja, Jesus M

    2010-12-01

    In 2010 the WHO issued a revision of the guidelines on antiretroviral therapy (ART) for HIV infection in adults and adolescents. The recommendations included earlier diagnosis and treatment of HIV in the interest of a longer and healthier life. The current analysis explores the impact on the estimates of treatment needs of the new criteria for initiating ART compared with the previous guidelines. The analyses are based on the national models of HIV estimates for the years 1990-2009. These models produce time series estimates of ART treatment need and HIV-related mortality. The ART need estimates based on ART eligibility criteria promoted by the 2010 WHO guidelines were compared with the need estimates based on the 2006 WHO guidelines. With the 2010 eligibility criteria, the proportion of people living with HIV currently in need of ART is estimated to increase from 34% to 49%. Globally, the need increases from 11.4 million (10.2-12.5 million) to 16.2 million (14.8-17.1 million). Regional differences include 7.4 million (6.4-8.4 million) to 10.6 million (9.7-11.5 million) in sub-Saharan Africa, 1.6 million (1.3-1.7 million) to 2.4 million (2.1-2.5 million) in Asia and 710 000 (610 000-780 000) to 950 000 (810 000-1.0 million) in Latin America and the Caribbean. When adopting the new recommendations, countries have to adapt their planning process in order to accelerate access to life saving drugs to those in need. These recommendations have a significant impact on resource needs. In addition to improving and prolonging the lives of the infected individuals, it will have the expected benefit of reducing HIV transmission and the future HIV/AIDS burden.

  13. An Analysis of Image Segmentation Time in Beam’s-Eye-View Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chun; Spelbring, D.R.; Chen, George T.Y.

    In this work we tabulate and histogram the image segmentation times for beam’s-eye-view (BEV) treatment planning in our center. The average time needed to generate contours on CT images delineating normal structures and treatment target volumes was calculated using a database containing over 500 patients’ BEV plans. The average number of contours and the total image segmentation time needed for BEV plans were estimated for three common treatment sites: head/neck, lung/chest, and prostate.

  14. A needs-based workforce model to deliver tertiary-level community mental health care for distressed infants, children, and adolescents in South Australia: a mixed-methods study.

    PubMed

    Segal, Leonie; Guy, Sophie; Leach, Matthew; Groves, Aaron; Turnbull, Catherine; Furber, Gareth

    2018-06-01

    High-quality mental health services for infants, children, adolescents, and their families can improve outcomes for children exposed to early trauma. We sought to estimate the workforce needed to deliver tertiary-level community mental health care to all infants, children, adolescents, and their families in need, using a generalisable model applied to South Australia (SA). Workforce estimates were determined using a workforce planning model. Clinical need was established using data from the Longitudinal Study of Australian Children and the Young Minds Matter survey. Care requirements were derived by workshopping clinical pathways with multiprofessional panels and testing the derived estimates through an online survey of clinicians. Prevalence of tertiary-level need, defined by severity and exposure to childhood adversities, was estimated at 5-8% across infancy and childhood, and 16% in mid-adolescence. The derived care pathway entailed reception, triage, and follow-up (mean 3 h per patient), core clinical management (mean 27 h per patient per year), psychiatric oversight (mean 4 h per patient per year), a specialised clinical role (mean 12 h per patient per year), and socioeconomic support (mean 12 h per patient per year). The modelled clinical workforce was 947 full-time equivalents, with a budget of AU$126 million, more than five times the current service level. Our novel needs-based workforce model produced actionable estimates of the community workforce needed to address tertiary-level mental health needs in infants, children, adolescents, and their families in SA. A considerable expansion of the skilled workforce is needed to support young people facing current distress and associated family-based adversities. Because mental illness is implicated in so many burgeoning social ills, addressing this shortfall could have wide-ranging benefits. National Health and Medical Research Council (Australia), Department of Health SA. Copyright © 2018 The Authors. Published by Elsevier Ltd. This is an Open Access article under the CC BY-NC-ND 4.0 license.
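
    The workforce arithmetic implied by the abstract can be sketched directly from the quoted per-patient hours; in the sketch below the caseload and the annual clinical hours per full-time equivalent are invented placeholders, not the study's inputs.

    ```python
    # Toy recomputation of a needs-based workforce estimate from the
    # per-patient hours quoted in the abstract. The caseload and annual
    # clinical hours per FTE are invented placeholders.
    hours_per_patient = 3 + 27 + 4 + 12 + 12  # reception/triage, core clinical,
                                              # psychiatric, specialised, socioeconomic
    patients_in_need = 30_000                 # hypothetical caseload in need
    clinical_hours_per_fte = 1_700            # hypothetical productive hours/year

    fte = patients_in_need * hours_per_patient / clinical_hours_per_fte
    print(f"required clinical FTE: {fte:.0f}")
    ```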

  15. Optimal wavefront estimation of incoherent sources

    NASA Astrophysics Data System (ADS)

    Riggs, A. J. Eldorado; Kasdin, N. Jeremy; Groff, Tyler

    2014-08-01

    Direct imaging is in general necessary to characterize exoplanets and disks. A coronagraph is an instrument used to create a dim (high-contrast) region in a star's PSF where faint companions can be detected. All coronagraphic high-contrast imaging systems use one or more deformable mirrors (DMs) to correct quasi-static aberrations and recover contrast in the focal plane. Simulations show that existing wavefront control algorithms can correct for diffracted starlight in just a few iterations, but in practice tens or hundreds of control iterations are needed to achieve high contrast. The discrepancy largely arises from the fact that simulations have perfect knowledge of the wavefront and DM actuation. Thus, wavefront correction algorithms are currently limited by the quality and speed of wavefront estimates. Exposures in space will take orders of magnitude more time than any calculations, so a nonlinear estimation method that needs fewer images but more computational time would be advantageous. In addition, current wavefront correction routines seek only to reduce diffracted starlight. Here we present nonlinear estimation algorithms that include optimal estimation of sources incoherent with a star such as exoplanets and debris disks.

  16. Identifying demand for health resources using waiting times information.

    PubMed

    Blundell, R; Windmeijer, F

    2000-09-01

    In this paper the differences in average waiting times are utilized to identify the determinants of demand for health services. The equilibrium waiting time framework is used, but the full equilibrium assumption is relaxed by selecting areas with low waiting times and by estimating a (semi-)parametric selection model. Determinants of supply are used as instruments for the endogeneity of waiting times. A model for the demand for acute services at the ward level in the UK is estimated. The model estimates, and their implications for health service allocations in the UK, are contrasted against more standard allocation models. The present results show that it is critically important to account for rationing by waiting times when identifying needs from care utilization data. Copyright 2000 John Wiley & Sons, Ltd.

  17. Parametric estimation for reinforced concrete relief shelter for Aceh cases

    NASA Astrophysics Data System (ADS)

    Atthaillah; Saputra, Eri; Iqbal, Muhammad

    2018-05-01

    This paper is a work in progress (WIP) toward a rapid parametric framework for estimating the materials of post-disaster permanent shelters. The intended shelters are of reinforced concrete construction with brick walls. Inevitably, in post-disaster cases, design variations are needed to suit victims' conditions, and it is practically impossible to satisfy every beneficiary with a satisfactory design using conventional methods. This study offers a parametric framework to overcome the slow construction-materials estimation associated with design variations. The work integrates the parametric tool Grasshopper to establish algorithms that simultaneously model, visualize, calculate, and write the calculated data to a spreadsheet in real time. Some customized Grasshopper components were created using GHPython scripting for a more optimized algorithm. The result of this study is a partial framework that successfully performs modeling, visualization, calculation, and writing of the calculated data simultaneously, so design alterations do not escalate the time needed for modeling, visualization, and material estimation. Future development of the parametric framework will be made open source.
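
    As a rough illustration of the quantity take-off logic such a framework scripts, here is a stand-alone sketch in plain Python rather than a GHPython component; the brick dimensions, joint allowance, wall list, and output file name are all assumptions.

    ```python
    # Stand-alone sketch of parametric quantity take-off: derive material
    # counts from wall dimensions and write them to a spreadsheet file.
    # Brick size, mortar joint, and wall dimensions are assumptions.
    import csv

    def bricks_needed(wall_len_m, wall_h_m, brick=(0.23, 0.075), joint=0.01):
        face_area = (brick[0] + joint) * (brick[1] + joint)  # one brick + mortar
        return int(wall_len_m * wall_h_m / face_area)

    walls = [("front", 6.0, 3.0), ("side", 4.0, 3.0)]  # hypothetical design
    with open("materials.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["wall", "bricks"])
        for name, length, height in walls:
            writer.writerow([name, bricks_needed(length, height)])
    ```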

  18. Adaptive Training in an Unmanned Aerial Vehicle: Examination of Several Candidate Real-time Metrics

    DTIC Science & Technology

    2010-01-01

  19. Real-time airborne gamma-ray background estimation using NASVD with MLE and radiation transport for calibration

    NASA Astrophysics Data System (ADS)

    Kulisek, J. A.; Schweppe, J. E.; Stave, S. C.; Bernacki, B. E.; Jordan, D. V.; Stewart, T. N.; Seifert, C. E.; Kernan, W. J.

    2015-06-01

    Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this challenge, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements without the need for human analyst intervention. The method can be calibrated using radiation transport simulations along with data from previous flights over areas for which the isotopic composition need not be known. Over the examined measured and simulated data sets, the method generated accurate background estimates even in the presence of a strong 60Co source. The potential to track large and abrupt changes in background spectral shape and magnitude was demonstrated. The method can be implemented fairly easily in most modern computing languages and environments.
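
    The low-rank spectral smoothing at the core of NASVD (noise-adjusted singular value decomposition) can be sketched as below; the MLE weighting and the radiation-transport calibration described in the abstract are not reproduced here, and the counts are synthetic.

    ```python
    # Sketch of NASVD-style smoothing: Poisson-normalize the spectra,
    # take an SVD, and rebuild each spectrum from the leading components.
    import numpy as np

    def nasvd_smooth(spectra, n_components=3):
        """spectra: (n_measurements, n_channels) array of gamma-ray counts."""
        mean_spec = spectra.mean(axis=0)
        scale = np.sqrt(np.maximum(mean_spec, 1e-12))  # Poisson noise scaling
        U, s, Vt = np.linalg.svd(spectra / scale, full_matrices=False)
        k = n_components
        smoothed = (U[:, :k] * s[:k]) @ Vt[:k]         # rank-k reconstruction
        return smoothed * scale

    counts = np.random.poisson(50, size=(200, 256)).astype(float)  # fake data
    background = nasvd_smooth(counts)
    ```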

  20. ANN based Real-Time Estimation of Power Generation of Different PV Module Types

    NASA Astrophysics Data System (ADS)

    Syafaruddin; Karatepe, Engin; Hiyama, Takashi

    Distributed generation is expected to become more important in the future generation system, and utilities need solutions that help manage resources more efficiently. Effective smart grid solutions already use real-time data to refine operations and pinpoint inefficiencies while maintaining secure and reliable operating conditions. This paper proposes the application of an Artificial Neural Network (ANN) for the real-time estimation of the maximum power generation of PV modules of different technologies. An intelligent technique is required in this case because the relationship between the maximum power of PV modules and the open-circuit voltage and temperature is nonlinear and cannot easily be expressed analytically for each technology. The proposed ANN method uses open-circuit voltage and cell temperature as input signals, instead of irradiance and ambient temperature, to estimate the maximum power generation of PV modules. It is important for the utility to be able to perform this estimation for optimal operating points and for diagnostic purposes, since the results may be an early indicator of a need for maintenance and support optimal energy management. The proposed method is verified through a real-time simulator driven by daily irradiance and cell-temperature changes.
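
    A hedged sketch of the mapping the paper trains, using scikit-learn's MLPRegressor as a stand-in ANN from open-circuit voltage and cell temperature to maximum power; the synthetic data generator below is a placeholder, not a real PV module model.

    ```python
    # Stand-in ANN regression: (open-circuit voltage, cell temperature)
    # -> maximum power. The data generator is a fake, mildly nonlinear map.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    voc = rng.uniform(30, 45, 500)          # open-circuit voltage [V]
    temp = rng.uniform(10, 70, 500)         # cell temperature [C]
    pmax = 5.0 * voc - 0.02 * voc * temp + rng.normal(0, 2, 500)

    X = np.column_stack([voc, temp])
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000).fit(X, pmax)
    print(model.predict([[38.0, 45.0]]))    # estimated Pmax for one sample
    ```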

  1. Methods for estimating confidence intervals in interrupted time series analyses of health interventions.

    PubMed

    Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis

    2009-02-01

    Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used multivariate delta and bootstrapping methods (BMs) to construct CIs around relative changes in level and trend, and around absolute changes in outcome based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for interacting medications with warfarin on the rate of prescriptions per 10,000 warfarin users per month. Both the multivariate delta method (MDM) and the BM produced similar results. BM is preferred for calculating CIs of relative changes in outcomes of time series studies, because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when sample size is small.
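
    A minimal sketch of the bootstrap side of the comparison, building a CI for the relative level change from a segmented regression; the series, change point, and residual-resampling scheme are illustrative, and the autocorrelation correction used in the paper is omitted for brevity.

    ```python
    # Residual bootstrap CI for the relative level change in a segmented
    # (interrupted time series) regression. Data and change point are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(48)
    post = (t >= 24).astype(float)
    y = 100 + 0.5 * t - 15 * post + rng.normal(0, 3, 48)  # synthetic series

    X = np.column_stack([np.ones_like(t), t, post, post * (t - 24)])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]

    def relative_level_change(b, t0=24):
        counterfactual = b[0] + b[1] * t0   # predicted level without intervention
        return b[2] / counterfactual

    boot = []
    resid = y - X @ beta
    for _ in range(2000):
        y_star = X @ beta + rng.choice(resid, size=len(y), replace=True)
        b_star = np.linalg.lstsq(X, y_star, rcond=None)[0]
        boot.append(relative_level_change(b_star))
    print(np.percentile(boot, [2.5, 97.5]))  # 95% bootstrap CI
    ```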

  2. Estimating rainfall time series and model parameter distributions using model data reduction and inversion techniques

    NASA Astrophysics Data System (ADS)

    Wright, Ashley J.; Walker, Jeffrey P.; Pauwels, Valentijn R. N.

    2017-08-01

    Floods are devastating natural hazards. To provide accurate, precise, and timely flood forecasts, there is a need to understand the uncertainties associated within an entire rainfall time series, even when rainfall was not observed. The estimation of an entire rainfall time series and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows for the uncertainty of entire rainfall input time series to be considered when estimating model parameters, and provides the ability to improve rainfall estimates from poorly gauged catchments. Current methods to estimate entire rainfall time series from streamflow records are unable to adequately invert complex nonlinear hydrologic systems. This study aims to explore the use of wavelets in the estimation of rainfall time series from streamflow records. Using the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia, it is shown that model parameter distributions and an entire rainfall time series can be estimated. Including rainfall in the estimation process improves streamflow simulations by a factor of up to 1.78. This is achieved while estimating an entire rainfall time series, inclusive of days when none was observed. It is shown that the choice of wavelet can have a considerable impact on the robustness of the inversion. Combining the use of a likelihood function that considers rainfall and streamflow errors with the use of the DWT as a model data reduction technique allows the joint inference of hydrologic model parameters along with rainfall.
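
    A minimal sketch of DWT-based dimensionality reduction for a rainfall series, assuming the PyWavelets package: decompose, keep only the largest coefficients, and reconstruct. The wavelet, decomposition level, and number of retained coefficients are illustrative choices, not the study's.

    ```python
    # DWT dimensionality reduction: represent a rainfall series by its
    # largest wavelet coefficients only. Requires PyWavelets (pywt).
    import numpy as np
    import pywt

    rain = np.random.gamma(0.3, 8.0, 1024)       # synthetic daily rainfall
    coeffs = pywt.wavedec(rain, "db4", level=5)  # multilevel DWT
    flat, slices = pywt.coeffs_to_array(coeffs)

    keep = 64                                    # retained coefficients
    idx = np.argsort(np.abs(flat))[:-keep]
    flat[idx] = 0.0                              # zero the small ones
    approx = pywt.waverec(
        pywt.array_to_coeffs(flat, slices, output_format="wavedec"), "db4")
    ```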

  3. Reducing the number of reconstructions needed for estimating channelized observer performance

    NASA Astrophysics Data System (ADS)

    Pineda, Angel R.; Miedema, Hope; Brenner, Melissa; Altaf, Sana

    2018-03-01

    A challenge for task-based optimization is the time required for each reconstructed image in applications where reconstructions are time consuming. Our goal is to reduce the number of reconstructions needed to estimate the area under the receiver operating characteristic curve (AUC) of the infinitely-trained optimal channelized linear observer. We explore the use of classifiers which either do not invert the channel covariance matrix or do feature selection. We also study the assumption that multiple low contrast signals in the same image of a non-linear reconstruction do not significantly change the estimate of the AUC. We compared the AUC of several classifiers (Hotelling, logistic regression, logistic regression using Firth bias reduction and the least absolute shrinkage and selection operator (LASSO)) with a small number of observations both for normal simulated data and images from a total variation reconstruction in magnetic resonance imaging (MRI). We used 10 Laguerre-Gauss channels and the Mann-Whitney estimator for AUC. For this data, our results show that at small sample sizes feature selection using the LASSO technique can decrease bias of the AUC estimation with increased variance and that for large sample sizes the difference between these classifiers is small. We also compared the use of multiple signals in a single reconstructed image to reduce the number of reconstructions in a total variation reconstruction for accelerated imaging in MRI. We found that AUC estimation using multiple low contrast signals in the same image resulted in similar AUC estimates as doing a single reconstruction per signal leading to a 13x reduction in the number of reconstructions needed.
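
    The Mann-Whitney AUC estimator mentioned above is compact enough to state in code: the fraction of signal-present/signal-absent score pairs ranked correctly, with ties counted as half. The scores below are synthetic placeholders.

    ```python
    # Mann-Whitney (two-sample U statistic) estimator of AUC.
    import numpy as np

    def mann_whitney_auc(scores_signal, scores_noise):
        s = np.asarray(scores_signal)[:, None]
        n = np.asarray(scores_noise)[None, :]
        return np.mean(s > n) + 0.5 * np.mean(s == n)

    rng = np.random.default_rng(2)
    auc = mann_whitney_auc(rng.normal(1.0, 1, 50), rng.normal(0.0, 1, 50))
    print(f"AUC ~ {auc:.3f}")
    ```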

  4. Estimating psychiatric manpower requirements based on patients' needs.

    PubMed

    Faulkner, L R; Goldman, C R

    1997-05-01

    To provide a better understanding of the complexities of estimating psychiatric manpower requirements, the authors describe several approaches to estimation and present a method based on patients' needs. A five-step method for psychiatric manpower estimation is used, with estimates of data pertinent to each step, to calculate the total psychiatric manpower requirements for the United States. The method is also used to estimate the hours of psychiatric service per patient per year that might be available under current psychiatric practice and under a managed care scenario. Depending on assumptions about data at each step in the method, the total psychiatric manpower requirements for the U.S. population range from 2,989 to 358,696 full-time-equivalent psychiatrists. The number of available hours of psychiatric service per patient per year is 14.1 hours under current psychiatric practice and 2.8 hours under the managed care scenario. The key to psychiatric manpower estimation lies in clarifying the assumptions that underlie the specific method used. Even small differences in assumptions mean large differences in estimates. Any credible manpower estimation process must include discussions and negotiations between psychiatrists, other clinicians, administrators, and patients and families to clarify the treatment needs of patients and the roles, responsibilities, and job description of psychiatrists.

  5. Convenience Sampling of Children Presenting to Hospital-Based Outpatient Clinics to Estimate Childhood Obesity Levels in Local Surroundings.

    PubMed

    Gilliland, Jason; Clark, Andrew F; Kobrzynski, Marta; Filler, Guido

    2015-07-01

    Childhood obesity is a critical public health matter associated with numerous pediatric comorbidities. Local-level data are required to monitor obesity and to help administer prevention efforts when and where they are most needed. We hypothesized that samples of children visiting hospital clinics could provide representative local population estimates of childhood obesity using data from 2007 to 2013. Such data might provide more accurate, timely, and cost-effective obesity estimates than national surveys. Results revealed that our hospital-based sample could not serve as a population surrogate. Further research is needed to confirm this finding.

  6. Enterprise Information Lifecycle Management

    DTIC Science & Technology

    2011-01-01

  7. Benchmarking real-time RGBD odometry for light-duty UAVs

    NASA Astrophysics Data System (ADS)

    Willis, Andrew R.; Sahawneh, Laith R.; Brink, Kevin M.

    2016-06-01

    This article describes the theoretical and implementation challenges associated with generating 3D odometry estimates (delta-pose) from RGBD sensor data in real-time to facilitate navigation in cluttered indoor environments. The underlying odometry algorithm applies to general 6DoF motion; however, the computational platforms, trajectories, and scene content are motivated by their intended use on indoor, light-duty UAVs. Discussion outlines the overall software pipeline for sensor processing and details how algorithm choices for the underlying feature detection and correspondence computation impact the real-time performance and accuracy of the estimated odometry and associated covariance. This article also explores the consistency of odometry covariance estimates and the correlation between successive odometry estimates. The analysis is intended to provide users information needed to better leverage RGBD odometry within the constraints of their systems.

  8. Head movement compensation in real-time magnetoencephalographic recordings.

    PubMed

    Little, Graham; Boe, Shaun; Bardouille, Timothy

    2014-01-01

    Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement, so effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed accurate estimation of current source activity at the source level in real time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps: data acquisition, head position estimation, source localization, and real-time source estimation. This work explains the technical details and validates each of these steps.

  9. Using a Modification of the Capture-Recapture Model To Estimate the Need for Substance Abuse Treatment.

    ERIC Educational Resources Information Center

    Maxwell, Jane Carlisle; Pullum, Thomas W.

    2001-01-01

    Applied the capture-recapture model, through a Poisson regression, to a time series of data on admissions to treatment from 1987 to 1996 to estimate the number of heroin addicts in Texas who are "at risk" for treatment. The entire data set produced estimates that were lower and more plausible than those produced by drawing samples,…

  10. Simulation of TRMM Microwave Imager Brightness Temperature using Precipitation Radar Reflectivity for Convective and Stratiform Rain Areas over Land

    NASA Technical Reports Server (NTRS)

    Prabhakara, C.; Iacovazzi, R., Jr.; Yoo, J.-M.; Lau, William K. M. (Technical Monitor)

    2002-01-01

    Rain is highly variable in space and time. In order to measure rainfall over global land with satellites, we need observations with very high spatial resolution and high frequency in time. On board the Tropical Rainfall Measuring Mission (TRMM) satellite, the Precipitation Radar (PR) and Microwave Imager (TMI) are flown together for the purpose of estimating rain rate. The basic method to estimate rain from PR has been developed over the past several decades. On the other hand, the TMI method of rain estimation is still under development, particularly over land. The objective of this technical memorandum is to develop a theoretical framework that helps relate the observations made by these two instruments. The principal result of this study is that in order to match the PR observations with the TMI observations in convective rain areas, a mixed layer of graupel and supercooled water drops above the freezing level is needed. On the other hand, to match these observations in the stratiform region, a layer of snowflakes with appropriate densities above the freezing level, and a melting layer below the freezing level, are needed. This understanding can lead to a robust rainfall estimation technique from the microwave radiometer observations.

  11. Advances in Time Estimation Methods for Molecular Data.

    PubMed

    Kumar, Sudhir; Hedges, S Blair

    2016-04-01

    Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of the molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of a speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data sets. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Intra-individual variability in day-to-day and month-to-month measurements of physical activity and sedentary behaviour at work and in leisure-time among Danish adults.

    PubMed

    Pedersen, E S L; Danquah, I H; Petersen, C B; Tolstrup, J S

    2016-12-03

    Accelerometers can obtain precise measurements of movements during the day. However, individual activity patterns vary from day to day, and there is limited evidence on the number of measurement days needed to obtain sufficient reliability. The aim of this study was to examine variability in accelerometer-derived data on sedentary behaviour and physical activity at work and in leisure time during weekdays among Danish office employees. We included control participants (n = 135) from the Take a Stand! Intervention, a cluster randomized controlled trial conducted in 19 offices. Sitting time and physical activity were measured using an ActiGraph GT3X+ fixed on the thigh, and data were processed using Acti4 software. Variability was examined for sitting time, standing time, steps, and time spent in moderate-to-vigorous physical activity (MVPA) per day by multilevel mixed linear regression modelling. Results showed that the number of days needed to obtain a reliability of 80% when measuring sitting time was 4.7 days for work and 5.5 days for leisure time. For physical activity at work, 4.0 days and 4.2 days were required to measure steps and MVPA, respectively. During leisure time, more monitoring time was needed to reliably estimate physical activity (6.8 days for steps and 5.8 days for MVPA). The number of measurement days needed to reliably estimate activity patterns was greater for leisure time than for work time. This domain-specific variability is of great importance to researchers and health promotion workers planning to use objective measures of sedentary behaviour and physical activity. Trial registration: ClinicalTrials.gov NCT01996176.
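
    Days-needed figures of this kind are commonly derived from a single-day ICC via the Spearman-Brown prophecy formula; a minimal sketch follows, with an ICC chosen only for illustration (it happens to reproduce a value close to the quoted 4.7 days, but it is not taken from the paper).

    ```python
    # Spearman-Brown prophecy: days of monitoring needed so that the mean
    # over days reaches a target reliability, given a single-day ICC.
    def days_needed(icc_single_day, target_reliability=0.80):
        r = target_reliability
        return r * (1 - icc_single_day) / (icc_single_day * (1 - r))

    print(days_needed(0.46))   # hypothetical ICC -> roughly 4.7 days
    ```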

  13. On-Board Real-Time State and Fault Identification for Rovers

    NASA Technical Reports Server (NTRS)

    Washington, Richard

    2000-01-01

    For extended autonomous operation, rovers must identify potential faults to determine whether execution needs to be halted. At the same time, rovers present particular challenges for state estimation techniques: they are subject to environmental influences that affect sensor readings during normal and anomalous operation, and the sensors fluctuate rapidly both because of noise and because of the dynamics of the rover's interaction with its environment. This paper presents MAKSI, an on-board method for state estimation and fault diagnosis that is particularly appropriate for rovers. The method is based on a combination of continuous state estimation, using Kalman filters, and discrete state estimation, using a Markov-model representation.
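
    A minimal one-dimensional Kalman filter sketch of the continuous-state half of such an estimator; the random-walk process model and the noise variances are illustrative, not MAKSI's actual formulation.

    ```python
    # One predict/update cycle of a scalar Kalman filter with a
    # random-walk process model. q and r are illustrative variances.
    def kalman_step(x, P, z, q=0.01, r=0.5):
        x_pred, P_pred = x, P + q          # predict (random walk)
        K = P_pred / (P_pred + r)          # Kalman gain
        x_new = x_pred + K * (z - x_pred)  # update with measurement z
        return x_new, (1 - K) * P_pred

    x, P = 0.0, 1.0
    for z in [0.2, 0.4, 0.35, 1.8, 0.5]:   # noisy sensor readings
        x, P = kalman_step(x, P, z)
    print(x, P)
    ```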

  14. Using remote sensing to calculate plant available nitrogen needed by crops on swine factory farm sprayfields in North Carolina

    NASA Astrophysics Data System (ADS)

    Christenson, Elizabeth; Serre, Marc

    2015-10-01

    North Carolina (NC) is the second largest producer of hogs in the United States, with Duplin County, NC, having the densest hog population in the world. In NC, liquid swine manure is generally stored in open-air lagoons and sprayed onto sprayfields with sprinkler systems to be used as fertilizer for crops. Swine factory farms, termed concentrated animal feeding operations (CAFOs), are regulated by the Department of Environment and Natural Resources (DENR) on the basis of nutrient management plans (NMPs) with balanced plant available nitrogen (PAN): the estimated PAN in the liquid manure being sprayed must be less than the estimated PAN needed by crops during irrigation. Estimates of the PAN needed by crops depend on crop and soil types. The objective of this research was to develop a new, time-efficient method to identify the PAN needed by crops on Duplin County sprayfields for the years 2010-2014. Using remote sensing data instead of NMP data to identify the PAN needed by crops allowed calendar-year identification of which crops were grown on sprayfields, instead of a five-year range of values. Although permitted data have more detailed crop information than remotely sensed data, identification of the PAN needed by crops using remotely sensed data is more time efficient, internally consistent, and easily publicly accessible, and it can identify annual changes in PAN on sprayfields. Once the PAN needed by crops is known, remote sensing can be used to quantify PAN at other spatial scales, such as sub-watershed levels, and can inform targeted water quality monitoring of swine CAFOs.

  15. Re-analysis of Alaskan benchmark glacier mass-balance data using the index method

    USGS Publications Warehouse

    Van Beusekom, Ashley E.; O'Neel, Shad R.; March, Rod S.; Sass, Louis C.; Cox, Leif H.

    2010-01-01

    At Gulkana and Wolverine Glaciers, designated the Alaskan benchmark glaciers, we re-analyzed and re-computed the mass balance time series from 1966 to 2009 to accomplish our goal of making more robust time series. Each glacier's data record was analyzed with the same methods. For surface processes, we estimated missing information with an improved degree-day model. Degree-day models predict ablation from the sum of daily mean temperatures and an empirical degree-day factor. We modernized the traditional degree-day model and derived new degree-day factors in an effort to match the balance time series more closely. We estimated missing yearly-site data with a new balance gradient method. These efforts showed that an additional step needed to be taken at Wolverine Glacier to adjust for non-representative index sites. As with the previously calculated mass balances, the re-analyzed balances showed a continuing trend of mass loss. We noted that the time series, and thus our estimate of the cumulative mass loss over the period of record, was very sensitive to the data input, and suggest the need to add data-collection sites and modernize our weather stations.
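
    A classical degree-day ablation model of the kind modernized in the report fits in a few lines: melt is a degree-day factor times the sum of positive daily mean temperatures. The factor and the temperatures below are illustrative.

    ```python
    # Degree-day ablation: melt = degree-day factor x positive degree-days.
    # The factor (mm w.e. per degree-day) and temperatures are illustrative.
    def degree_day_ablation(daily_mean_temps_c, ddf_mm_per_degday=4.5):
        pdd = sum(max(t, 0.0) for t in daily_mean_temps_c)  # positive degree-days
        return ddf_mm_per_degday * pdd                      # ablation in mm w.e.

    print(degree_day_ablation([-2.0, 1.5, 3.0, 4.2, 0.5]))
    ```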

  16. 75 FR 61784 - Proposed Collection; Comment Request for Review of a Revised Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-06

    ... response time of ten minutes per form reporting a missing check is estimated; the same amount of time is needed to report the missing checks or electronic funds transfer (EFT) payments using the telephone. The...

  17. Application of independent component analysis for speech-music separation using an efficient score function estimation

    NASA Astrophysics Data System (ADS)

    Pishravian, Arash; Aghabozorgi Sahaf, Masoud Reza

    2012-12-01

    In this paper, speech-music separation using blind source separation is discussed. The separating algorithm is based on mutual information minimization, where the natural gradient algorithm is used for the minimization. This requires estimating the score function from samples of the observation signals (mixtures of speech and music). The accuracy and speed of this estimation affect the quality of the separated signals and the processing time of the algorithm. The score function estimation in the presented algorithm is based on a Gaussian mixture based kernel density estimation method. Experimental results of the presented algorithm on speech-music separation, compared to a separating algorithm based on the minimum mean square error estimator, indicate better performance and shorter processing time.
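
    The natural-gradient, kernel-score algorithm of the paper is not available in standard libraries; as a stand-in, the sketch below separates a synthetic two-channel mixture with scikit-learn's FastICA, a different ICA algorithm, purely to illustrate the blind source separation setup.

    ```python
    # Blind source separation of a two-channel instantaneous mixture
    # using FastICA (a stand-in for the paper's algorithm). The "speech"
    # and "music" signals are synthetic placeholders.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 8000)
    speech = np.sign(np.sin(2 * np.pi * 5 * t)) * rng.uniform(0.5, 1, t.size)
    music = np.sin(2 * np.pi * 440 * t)
    mixing = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown in practice
    mixed = np.column_stack([speech, music]) @ mixing  # observed channels

    sources = FastICA(n_components=2, random_state=0).fit_transform(mixed)
    ```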

  18. Estimating allowable-cut by area-scheduling

    Treesearch

    William B. Leak

    2011-01-01

    Estimation of the regulated allowable-cut is an important step in placing a forest property under management and ensuring a continued supply of timber over time. Regular harvests also provide for the maintenance of needed wildlife habitat. There are two basic approaches: (1) volume, and (2) area/volume regulation, with many variations of each. Some require...

  19. Corner smoothing of 2D milling toolpath using b-spline curve by optimizing the contour error and the feedrate

    NASA Astrophysics Data System (ADS)

    Özcan, Abdullah; Rivière-Lorphèvre, Edouard; Ducobu, François

    2018-05-01

    In part manufacturing, an efficient process should minimize the cycle time needed to reach the prescribed quality on the part. To optimize it, the machining time needs to be as low as possible while the quality still meets the requirements. For a 2D milling toolpath defined by sharp corners, the programmed feedrate differs from the reachable feedrate because of the kinematic limits of the motor drives, and this phenomenon leads to a loss of productivity. Smoothing the toolpath significantly reduces the machining time, but dimensional accuracy should not be neglected. A way to address toolpath optimization in part manufacturing is therefore to take into account both the manufacturing time and the part quality: maximizing the feedrate minimizes the manufacturing time, while the maximum contour error must be kept under a threshold to meet the quality requirements. This paper presents a method to optimize sharp-corner smoothing using b-spline curves by adjusting the control points defining the curve. The objective function used in the optimization is based on the contour error and on the difference between the programmed feedrate and an estimate of the reachable feedrate, where the reachable-feedrate estimate is based on geometrical information. Simulation results are presented in the paper and the machining times are compared in each case.
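
    A minimal sketch of the geometric half of the method: replacing a sharp 2D corner with a clamped quadratic b-spline whose control points an optimizer could then adjust against contour error and feedrate. The corner geometry and knot vector are illustrative choices.

    ```python
    # Smooth a 90-degree corner with a clamped quadratic B-spline and
    # measure the contour error as the distance to the original corner.
    import numpy as np
    from scipy.interpolate import BSpline

    # Control points around a corner at the origin (mm); illustrative.
    ctrl = np.array([[-5.0, 0.0], [-1.0, 0.0], [0.0, 0.0],
                     [0.0, 1.0], [0.0, 5.0]])
    k = 2                                    # quadratic degree
    n = len(ctrl)
    knots = np.concatenate([[0] * k, np.linspace(0, 1, n - k + 1), [1] * k])
    spline = BSpline(knots, ctrl, k)

    u = np.linspace(0, 1, 200)
    path = spline(u)                                      # smoothed toolpath
    corner_error = np.min(np.linalg.norm(path, axis=1))   # distance to corner
    print(f"contour error at the corner: {corner_error:.3f} mm")
    ```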

  20. A New Way to Estimate the Potential Unmet Need for Infertility Services Among Women in the United States

    PubMed Central

    Slauson-Blevins, Kathleen S.; Tiemeyer, Stacy; McQuillan, Julia; Shreffler, Karina M.

    2016-01-01

    Background: Fewer than 50% of women who meet the medical/behavioral criteria for infertility receive medical services. Estimating the number of women who both meet the medical/behavioral criteria for infertility and have pro-conception attitudes allows for better estimates of the potential need and unmet need for infertility services in the United States. Methods: The National Survey of Fertility Barriers was administered by telephone to a probability sample of 4,712 women in the United States. The sample for this analysis was 292 women who reported an experience of infertility within 3 years of the time of the interview. Infertile women were asked if they were trying to conceive at the time of their infertility experience and if they wanted to have a child, to determine who could be considered in need of services. Results: Among U.S. women who have met medical criteria for infertility within the past three years, 15.9% report that they were neither trying to have a child nor wanted to have a child and can be classified as not in need of treatment. Of the 84.1% of infertile women in need of treatment, 58.1% did not even talk to a doctor about ways to become pregnant. Discussion: Even after taking into account that not all infertile women are in need of treatment, there is still a large unmet need for infertility treatment in the United States. Conclusion: Studies of the incidence of infertility should include measures of both trying to have a child and wanting to have a child. PMID:26555685

  1. Sharing simulation-based training courses between institutions: opportunities and challenges.

    PubMed

    Laack, Torrey A; Lones, Ellen A; Schumacher, Donna R; Todd, Frances M; Cook, David A

    2017-01-01

    Sharing simulation-based training (SBT) courses between institutions could reduce the time needed to develop new content, but it also presents challenges. We evaluated the process of sharing SBT courses across institutions in a mixed-methods study estimating the time required and identifying barriers and potential solutions. Two US academic medical institutions explored instructor experiences with the process of sharing four courses (two at each site) using personal interviews and a written survey, and estimated the time needed to develop new content versus implement existing SBT courses. The project team spent approximately 618 h creating a collaboration infrastructure to support course sharing. Sharing two SBT courses was estimated to save 391 h compared with developing two new courses. In the qualitative analysis, participants noted that the primary benefit of course sharing was time savings. Barriers included difficulty finding information and understanding overall course flow. Suggestions for improvement included establishing a standardized template, clearly identifying the target audience, providing a course overview, communicating with someone familiar with the original SBT course, employing an intuitive file-sharing platform, and considering local culture, context, and needs. Sharing SBT courses between institutions is feasible but not without challenges. An initial investment in a sharing infrastructure may facilitate downstream time savings compared with developing content de novo.

  2. A double hit model for the distribution of time to AIDS onset

    NASA Astrophysics Data System (ADS)

    Chillale, Nagaraja Rao

    2013-09-01

    Incubation time is a key epidemiologic descriptor of an infectious disease. In the case of HIV infection it is a random variable, and probably one of the longest such incubation periods. The probability distribution of incubation time is the major determinant of the relation between the incidence of HIV infection and its manifestation as AIDS, and it is also one of the key factors used for accurate estimation of AIDS incidence in a region. The present article (i) briefly reviews the work done, points out uncertainties in the estimation of AIDS onset time, and stresses the need for its precise estimation; (ii) highlights some of the modelling features of the onset distribution, including the immune failure mechanism; and (iii) proposes a 'Double Hit' model for the distribution of time to AIDS onset in the cases of (a) independent and (b) dependent time variables of the two markers, and examines the applicability of a few standard probability models.

  3. Estimating medical practice expenses from administering adult influenza vaccinations.

    PubMed

    Coleman, Margaret S; Fontanesi, John; Meltzer, Martin I; Shefer, Abigail; Fishbein, Daniel B; Bennett, Nancy M; Stryker, David

    2005-01-04

    Potential business losses incurred vaccinating adults against influenza have not been defined because of a lack of estimates for the medical practice costs incurred delivering vaccines. We collected data on vaccination labor time and other associated expenses, and modeled estimates of per-vaccination medical practice business costs associated with delivering adult influenza vaccine in different sized practices. Per-shot costs ranged from USD 13.87 to USD 46.27 (2001 dollars). Compared with the average Medicare payment of USD 11.71, per-shot losses ranged from USD 2.16 to USD 34.56. More research is needed to identify less expensive delivery settings and/or to determine whether third-party payers need to make higher payments for adult vaccinations.

  4. Implementation of a Fuzzy Approach to Improve Time Estimation [Case Study of a Thermal Power Plant]

    NASA Astrophysics Data System (ADS)

    Pradhan, Moumita; Pradhan, Dinesh; Bandyopadhyay, G.

    2010-10-01

    Fuzzy systems have demonstrated their ability to solve different kinds of problems in various application domains, and there is increasing interest in applying fuzzy concepts to improve the tasks of any system. Here a case study of a thermal power plant is considered. The existing time estimates represent the time to complete tasks; applying a fuzzy linear approach, it becomes clear that at each confidence level less time is needed to complete the tasks, and a shorter schedule in turn requires less cost. The objective of this paper is to show how a system becomes more efficient when the fuzzy linear approach is applied, optimizing the time estimates so that all tasks are performed on appropriate schedules. For the case study, the optimistic time (to), pessimistic time (tp), and most likely time (tm) were collected as data from the thermal power plant. These time estimates are used to calculate the expected time (te), which represents the time to complete a particular task with all contingencies considered. Using the project evaluation and review technique (PERT) and the critical path method (CPM), the critical path duration (CPD) of the project is calculated; this indicates a fifty percent probability that the total set of tasks can be completed in fifty days. Using the critical path duration and the standard deviation of the critical path, the total project completion time can then be obtained by applying the normal distribution. Using the trapezoidal rule on the four time estimates (to, tm, tp, te), a defuzzified value of the time estimates is calculated. For the fuzzy ranges, four confidence levels are considered: 0.4, 0.6, 0.8, and 1. From our study, it is seen that time estimates at confidence levels between 0.4 and 0.8 give better results than the other confidence levels.
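
    The PERT quantities the case study relies on are compact enough to state directly; a sketch with illustrative inputs:

    ```python
    # PERT expected time and variance from the three time estimates
    # (optimistic, most likely, pessimistic). Inputs are illustrative,
    # not the thermal power plant's data.
    def pert_expected_time(t_o, t_m, t_p):
        te = (t_o + 4 * t_m + t_p) / 6   # expected time
        var = ((t_p - t_o) / 6) ** 2     # variance of the task duration
        return te, var

    print(pert_expected_time(4, 6, 10))  # -> (6.33..., 1.0)
    ```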

  5. Joint estimation of 2D-DOA and frequency based on space-time matrix and conformal array.

    PubMed

    Wan, Liang-Tian; Liu, Lu-Tao; Si, Wei-Jian; Tian, Zuo-Xi

    2013-01-01

    Each element in a conformal array has a different pattern, which leads to performance deterioration in conventional high-resolution direction-of-arrival (DOA) algorithms. In this paper, a joint frequency and two-dimensional DOA (2D-DOA) estimation algorithm for conformal arrays is proposed. The delay correlation function is used to suppress noise. Both spatial and time sampling are utilized to construct the space-time matrix. Frequency and 2D-DOA estimation are accomplished based on parallel factor (PARAFAC) analysis without spectral peak searching or parameter pairing. The proposed algorithm needs only four guiding elements with precise positions to estimate frequency and 2D-DOA; other instrumental elements can be arranged flexibly on the surface of the carrier. Simulation results demonstrate the effectiveness of the proposed algorithm.

  6. On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2011-01-01

    This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Padé approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the s-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin, since it is dependent on the length of a time window and on parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.

  7. Introduction to State Estimation of High-Rate System Dynamics.

    PubMed

    Hong, Jonathan; Laflamme, Simon; Dodson, Jacob; Joyce, Bryan

    2018-01-13

    Engineering systems experiencing high-rate dynamic events, including airbags, debris detection, and active blast protection systems, could benefit from real-time observability for enhanced performance. However, the task of high-rate state estimation is challenging, in particular for real-time applications where the rate of the observer's convergence needs to be in the microsecond range. This paper identifies the challenges of state estimation of high-rate systems and discusses the fundamental characteristics of high-rate systems. A survey of applications and methods for estimators that have the potential to produce accurate estimations for a complex system experiencing highly dynamic events is presented. It is argued that adaptive observers are important to this research. In particular, adaptive data-driven observers are advantageous due to their adaptability and lack of dependence on the system model.

  8. Using Geographical Information Systems to Identify Populations in Need of Improved Accessibility to Antivenom Treatment for Snakebite Envenoming in Costa Rica

    PubMed Central

    Hansson, Erik; Sasa, Mahmood; Mattisson, Kristoffer; Robles, Arodys; Gutiérrez, José María

    2013-01-01

    Introduction: Snakebite accidents are an important health problem in rural areas of tropical countries worldwide, including Costa Rica, where most bites are caused by the pit-viper Bothrops asper. The treatment of these potentially fatal accidents is based on the timely administration of specific antivenom. In many regions of the world, insufficient health care systems and lack of antivenom in remote and poor areas where snakebites are common mean that efficient treatment is unavailable for many snakebite victims, leading to unnecessary mortality and morbidity. In this study, geographical information systems (GIS) were used to identify populations in Costa Rica with a need of improved access to antivenom treatment: those living in areas with a high risk of snakebites and a long time to reach antivenom treatment. Method/Principal Findings: Populations living in areas with high risk of snakebites were identified using two approaches: one based on the district-level reported incidence, and another based on mapping environmental factors favoring B. asper presence. Time to reach treatment using ambulance was estimated using cost surface analysis, thereby enabling adjustment of transportation speed by road availability and quality, topography, and land use. By mapping populations at high risk of snakebites and the estimated time to treatment, populations with a need of improved treatment access were identified. Conclusion/Significance: This study demonstrates the usefulness of GIS for improving treatment of snakebites. By mapping reported incidence, risk factors, location of existing treatment resources, and the time estimated to reach these for at-risk populations, rational allocation of treatment resources is facilitated. PMID:23383352

  9. Using geographical information systems to identify populations in need of improved accessibility to antivenom treatment for snakebite envenoming in Costa Rica.

    PubMed

    Hansson, Erik; Sasa, Mahmood; Mattisson, Kristoffer; Robles, Arodys; Gutiérrez, José María

    2013-01-01

    Snakebite accidents are an important health problem in rural areas of tropical countries worldwide, including Costa Rica, where most bites are caused by the pit-viper Bothrops asper. The treatment of these potentially fatal accidents is based on the timely administration of specific antivenom. In many regions of the world, insufficient health care systems and lack of antivenom in remote and poor areas where snakebites are common mean that efficient treatment is unavailable for many snakebite victims, leading to unnecessary mortality and morbidity. In this study, geographical information systems (GIS) were used to identify populations in Costa Rica with a need of improved access to antivenom treatment: those living in areas with a high risk of snakebites and a long time to reach antivenom treatment. Populations living in areas with high risk of snakebites were identified using two approaches: one based on the district-level reported incidence, and another based on mapping environmental factors favoring B. asper presence. Time to reach treatment using ambulance was estimated using cost surface analysis, thereby enabling adjustment of transportation speed by road availability and quality, topography, and land use. By mapping populations at high risk of snakebites and the estimated time to treatment, populations with a need of improved treatment access were identified. This study demonstrates the usefulness of GIS for improving treatment of snakebites. By mapping reported incidence, risk factors, location of existing treatment resources, and the time estimated to reach these for at-risk populations, rational allocation of treatment resources is facilitated.
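
    A cost-surface accumulation of the kind both records describe can be sketched with scikit-image's minimum-cost-path tools, assuming that package: each cell holds a traversal cost and cumulative travel cost is grown outward from the facilities. The cost grid and facility locations below are placeholders.

    ```python
    # Cost-surface travel-time sketch: accumulate minimum travel cost
    # from facility cells across a per-cell cost grid (placeholder data).
    import numpy as np
    from skimage.graph import MCP_Geometric

    costs = np.random.default_rng(4).uniform(1, 10, size=(100, 100))
    facilities = [(10, 10), (80, 60)]      # grid cells with antivenom stocks

    mcp = MCP_Geometric(costs)             # costs in traversal minutes per cell
    cum_minutes, _ = mcp.find_costs(starts=facilities)  # travel time per cell
    ```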

  10. General constraints on sampling wildlife on FIA plots

    USGS Publications Warehouse

    Bailey, L.L.; Sauer, J.R.; Nichols, J.D.; Geissler, P.H.; McRoberts, Ronald E.; Reams, Gregory A.; Van Deusen, Paul C.; McWilliams, William H.; Cieszewski, Chris J.

    2005-01-01

    This paper reviews the constraints on sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species richness, abundance, and patch occupancy. All methods incorporate two essential sources of variation: detectability and spatial variation. FIA sampling imposes specific space and time criteria that may need to be adjusted to meet local wildlife objectives.

  11. Time use patterns among women with rheumatoid arthritis: association with functional limitations and psychological status.

    PubMed

    Katz, P; Morris, A

    2007-03-01

    This study assessed time use patterns among 375 women with rheumatoid arthritis (RA). We hypothesized that (i) as functional limitations increased, time use imbalances would occur (i.e. time needed for obligatory activities would conflict with time needed for productive and free-time activities) and (ii) time use imbalances would be associated with psychological distress. Time use estimates were obtained from written questionnaires; other study data were collected from annual telephone interviews. Activities were categorized as obligatory, committed or discretionary, as defined by Verbrugge. Time use estimates were aggregated to define number of obligatory (e.g. self-care) activities requiring >2 h/day and a number of committed and discretionary activities in which no time was spent each day. After adjusting for age, education, marital status and pain severity, women with more functional limitations were significantly more likely to spend >2 h/day in obligatory activities. As functional limitations increased, the proportion spending no time in each committed activity and many discretionary activities increased. Spending >2 h/day in obligatory activities was not significantly associated with poor psychological status, but spending no time in a greater number of committed and discretionary activities was associated with lower life satisfaction and higher levels of depressive symptoms. Having more severe functional limitations appears to shift time use patterns towards more time spent in obligatory activities and less time spent in committed and discretionary activities. These imbalances in time use were associated with psychological distress, highlighting the need for women with RA to maintain important productive, social and discretionary activities.

  12. 76 FR 61622 - Potential Closing of Morses Line Border Crossing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-05

    ... travelers would need to travel to an alternate crossing which could cost them both time and money. CBP does... measured the distance and estimated time for each combination assuming they could not travel through Morses Line. By comparing the distance and travel time for the fastest route to those for the fastest route...

  13. Perceived Time Progression and Vigilance: Implications for Workload, Stress, and Cerebral Hemodynamics

    DTIC Science & Technology

    2013-04-01

    ...diagnostics indicate the need for immediate action (Sheridan, 1970, 1980). Consequently, vigilance has a critical impact in a wide range of automated... activating system) needed for continued alertness. Consequently, lethargy increases in observers and signal detection is reduced. However, recent findings...

  14. Sasebo, A Case Study in Optimizing Official Vehicles

    DTIC Science & Technology

    2012-12-01

    ...that were estimated to reduce federal budget deficits by a total of at least $2.1 trillion over the 2012–2021 period… At least another $1.2 trillion...

  15. Estimation of aquifer radionuclide concentrations by postprocessing of conservative tracer model results

    NASA Astrophysics Data System (ADS)

    Gedeon, M.; Vandersteen, K.; Rogiers, B.

    2012-04-01

    Radionuclide concentrations in aquifers are an important indicator in estimating the impact of a planned surface disposal facility for low- and medium-level short-lived radioactive waste in Belgium, developed by the Belgian Agency for Radioactive Waste and Enriched Fissile Materials (ONDRAF/NIRAS), which also coordinates and leads the corresponding research. Estimating aquifer concentrations for individual radionuclides is a computational challenge because (a) different retardation values apply to different hydrogeologic units and (b) sequential decay reactions among radionuclides with various sorption characteristics cause long computational times until a steady state is reached. The presented work proposes a methodology that substantially reduces the computational effort by postprocessing the results of a prior non-reactive tracer simulation. These advective transport results represent the steady-state concentration-to-source-flux ratio and the breakthrough time at each model cell. The two variables are then used to estimate individual radionuclide concentrations by (a) scaling the steady-state concentrations to the source fluxes of individual radionuclides, (b) applying radioactive decay and ingrowth in a decay chain, (c) scaling the travel time by the retardation factor, and (d) applying linear sorption. While all steps except (b) require solving simple linear equations, applying the ingrowth of individual radionuclides in decay chains requires solving the differential Bateman equation. This equation needs to be solved once, for unit radionuclide activity, at all arrival times found in the numerical grid; the ratios between the parent nuclide activity and the progeny activities are then used in the postprocessing. Results are presented for discrete points, and examples of radioactive plume maps are given. These results compare well with the results of a full numerical simulation including the respective chemical reaction processes. Although the proposed method is a fast way to estimate radionuclide concentrations without performing time-consuming simulations, its applicability has limits: the radionuclide source needs to be assumed constant during the period over which the model reaches a steady state. Otherwise, the source variability of individual radionuclides needs to be modelled with a numerical simulation; however, such a situation only concerns the period until steady state is reached, and such a simulation takes a relatively short time. The proposed method enables an effective estimation of individual radionuclide concentrations in the frame of the performance assessment of a radioactive waste disposal. Reducing the calculation time to a minimum enables sensitivity and uncertainty analyses, testing of alternative models, etc., thus enhancing the overall quality of the modelling analysis.
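
    For reference, the Bateman solution that the postprocessing step solves, in its standard form for a chain N_1 -> N_2 -> ... -> N_n with decay constants lambda_i and activity initially present only in the parent:

    ```latex
    N_n(t) = N_1(0) \left( \prod_{i=1}^{n-1} \lambda_i \right)
             \sum_{i=1}^{n} \frac{e^{-\lambda_i t}}
             {\prod_{\substack{j=1 \\ j \neq i}}^{n} \left( \lambda_j - \lambda_i \right)}
    ```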

  16. Number needed to treat is incorrect without proper time-related considerations.

    PubMed

    Suissa, Daniel; Brassard, Paul; Smiechowski, Brielan; Suissa, Samy

    2012-01-01

    The number needed to treat (NNT) is a simple measure of a treatment's impact, increasingly reported in randomized trials and observational studies. Its calculation in studies involving varying follow-up times or recurrent outcomes has been criticized. We discuss the NNT in these contexts, illustrating with several published studies. The computation of the NNT is founded on the cumulative incidence of the outcome. Instead, several published studies use simple proportions that do not account for varying follow-up times, or use incidence rates per person-time. We show that these approaches can lead to erroneous values of the NNT and misleading interpretations. For example, after converting the incidence rate to a cumulative incidence, we show that a trial reporting a NNT of 4 "to prevent one exacerbation in 1 year" should have reported a NNT of 9. A survey of all papers reporting the NNT, published in four major medical journals in 2009, found that 6 of the 10 papers involving varying follow-up times did not correctly estimate the NNT. As the "number needed to treat" becomes increasingly used in complex studies and in the comparative effectiveness of therapies, its accurate estimation and interpretation become crucial to avoid erroneous clinical and public health decisions. Copyright © 2012 Elsevier Inc. All rights reserved.
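
    A small worked sketch of the conversion: under a constant event rate, the cumulative incidence over follow-up t is 1 - exp(-rate*t), and the NNT is the reciprocal of the difference in cumulative incidences, not of the rate difference. The rates below are hypothetical values chosen only to reproduce a 4-versus-9 style contrast; they are not the trial's data.

    ```python
    import math

    def cumulative_incidence(rate, t):
        """Constant incidence rate (events per person-year) -> cumulative
        incidence over follow-up t years, assuming exponential event times."""
        return 1.0 - math.exp(-rate * t)

    def nnt_from_rates(rate_control, rate_treated, t):
        """NNT based on cumulative incidences at time t, not on raw rates."""
        return 1.0 / (cumulative_incidence(rate_control, t)
                      - cumulative_incidence(rate_treated, t))

    # Hypothetical rates per person-year (not the paper's data):
    naive_nnt = 1.0 / (0.90 - 0.65)             # rate difference misread as a risk difference -> 4
    proper_nnt = nnt_from_rates(0.90, 0.65, 1)  # 1-year cumulative incidences -> about 9
    ```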

  17. 76 FR 12999 - Submission for OMB Review; Comment Request for Review of a Revised Information Collection: (OMB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ...,600 are reported by telephone. A response time of ten minutes per form reporting a missing check is estimated; the same amount of time is needed to report the missing checks or electronic funds transfer (EFT...

  18. Comparison of Natural Gas Storage Estimates from the EIA and AGA

    EIA Publications

    1997-01-01

    The Energy Information Administration (EIA) has been publishing monthly storage information for years. In order to address the need for more timely information, in 1994 the American Gas Association (AGA) began publishing weekly storage levels. Both the EIA and the AGA series provide estimates of the total working gas in storage, but use significantly different methodologies.

  19. Political Revolution And Social Communication Technologies

    DTIC Science & Technology

    2017-12-01

    this collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data...sources, gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden...estimate or any other aspect of this collection of information , including suggestions for reducing this burden, to Washington headquarters Services

  20. Navy Multiband Terminal (NMT)

    DTIC Science & Technology

    2013-12-01

    instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send...0188 Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing...comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to

  1. PROPERTY APPRAISAL PROVIDES CONTROL, INSURANCE BASIS, AND VALUE ESTIMATE.

    ERIC Educational Resources Information Center

    THOMSON, JACK

    A COMPLETE PROPERTY APPRAISAL SERVES AS A BASIS FOR CONTROL, INSURANCE AND VALUE ESTIMATE. A PROFESSIONAL APPRAISAL FIRM SHOULD PERFORM THIS FUNCTION BECAUSE (1) IT IS FAMILIAR WITH PROPER METHODS, (2) IT CAN PREPARE THE REPORT WITH MINIMUM CONFUSION AND INTERRUPTION OF THE COLLEGE OPERATION, (3) USE OF ITS PRICING LIBRARY REDUCES TIME NEEDED AND…

  2. Experience with basal area estimation by prisms in lodgepole pine.

    Treesearch

    James M. Trappe

    1957-01-01

    Estimation of basal area by prisms offers intriguing possibilities for reducing time and effort in making stand inventories. Increased inventory efficiency is a particular need in stands that are relatively low in value due to small stems, predominance of low value species or heavy defect. In the Pacific Northwest, lodgepole pine characteristically forms dense low-...

  3. Special Operations Forces Interagency Counterterrorism Reference Manual

    DTIC Science & Technology

    2009-03-01

    information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...Presidential Review Directives and Presidential Decision Directives (Clinton administration) and National Security Study Directives and National

  4. Trends in Worker Hearing Loss by Industry Sector, 1981–2010

    PubMed Central

    Masterson, Elizabeth A.; Deddens, James A.; Themann, Christa L.; Bertke, Stephen; Calvert, Geoffrey M.

    2015-01-01

    Background The purpose of this study was to estimate the incidence and prevalence of hearing loss for noise-exposed U.S. workers by industry sector and 5-year time period, covering 30 years. Methods Audiograms for 1.8 million workers from 1981–2010 were examined. Incidence and prevalence were estimated by industry sector and time period. The adjusted risk of incident hearing loss within each time period and industry sector as compared with a reference time period was also estimated. Results The adjusted risk for incident hearing loss decreased over time when all industry sectors were combined. However, the risk remained high for workers in Healthcare and Social Assistance, and the prevalence was consistently high for Mining and Construction workers. Conclusions While progress has been made in reducing the risk of incident hearing loss within most industry sectors, additional efforts are needed within Mining, Construction and Healthcare and Social Assistance. PMID:25690583

  5. Trends in worker hearing loss by industry sector, 1981-2010.

    PubMed

    Masterson, Elizabeth A; Deddens, James A; Themann, Christa L; Bertke, Stephen; Calvert, Geoffrey M

    2015-04-01

    The purpose of this study was to estimate the incidence and prevalence of hearing loss for noise-exposed U.S. workers by industry sector and 5-year time period, covering 30 years. Audiograms for 1.8 million workers from 1981-2010 were examined. Incidence and prevalence were estimated by industry sector and time period. The adjusted risk of incident hearing loss within each time period and industry sector as compared with a reference time period was also estimated. The adjusted risk for incident hearing loss decreased over time when all industry sectors were combined. However, the risk remained high for workers in Healthcare and Social Assistance, and the prevalence was consistently high for Mining and Construction workers. While progress has been made in reducing the risk of incident hearing loss within most industry sectors, additional efforts are needed within Mining, Construction and Healthcare and Social Assistance. © 2015 Wiley Periodicals, Inc.

  6. Using Mathematical Transmission Modelling to Investigate Drivers of Respiratory Syncytial Virus Seasonality in Children in the Philippines

    PubMed Central

    Paynter, Stuart; Yakob, Laith; Simões, Eric A. F.; Lucero, Marilla G.; Tallo, Veronica; Nohynek, Hanna; Ware, Robert S.; Weinstein, Philip; Williams, Gail; Sly, Peter D.

    2014-01-01

    We used a mathematical transmission model to estimate when ecological drivers of respiratory syncytial virus (RSV) transmissibility would need to act in order to produce the observed seasonality of RSV in the Philippines. We estimated that a seasonal peak in transmissibility would need to occur approximately 51 days prior to the observed peak in RSV cases (range 49 to 67 days). We then compared this estimated seasonal pattern of transmissibility to the seasonal patterns of possible ecological drivers of transmissibility: rainfall, humidity and temperature patterns, nutritional status, and school holidays. The timing of the seasonal patterns of nutritional status and rainfall were both consistent with the estimated seasonal pattern of transmissibility and these are both plausible drivers of the seasonality of RSV in this setting. PMID:24587222

  7. Rainfall estimation for real time flood monitoring using geostationary meteorological satellite data

    NASA Astrophysics Data System (ADS)

    Veerakachen, Watcharee; Raksapatcharawong, Mongkol

    2015-09-01

    Rainfall estimation by geostationary meteorological satellite data provides good spatial and temporal resolutions. This is advantageous for real time flood monitoring and warning systems. However, a rainfall estimation algorithm developed in one region needs to be adjusted for another climatic region. This work proposes computationally-efficient rainfall estimation algorithms based on an Infrared Threshold Rainfall (ITR) method calibrated with regional ground truth. Hourly rain gauge data collected from 70 stations around the Chao-Phraya river basin were used for calibration and validation of the algorithms. The algorithm inputs were derived from FY-2E satellite observations consisting of infrared and water vapor imagery. The results were compared with the Global Satellite Mapping of Precipitation (GSMaP) near real time product (GSMaP_NRT) using the probability of detection (POD), root mean square error (RMSE) and linear correlation coefficient (CC) as performance indices. Comparison with the GSMaP_NRT product for real time monitoring purpose shows that hourly rain estimates from the proposed algorithm with the error adjustment technique (ITR_EA) offers higher POD and approximately the same RMSE and CC with less data latency.
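
    A sketch of the three verification indices used above; the 0.1 mm/h rain/no-rain threshold for POD is an assumption, not a value from the paper.

    ```python
    import numpy as np

    def verification_metrics(est, obs, rain_threshold=0.1):
        """POD, RMSE and linear correlation for hourly rain estimates vs gauges."""
        est, obs = np.asarray(est, float), np.asarray(obs, float)
        hits = np.sum((est >= rain_threshold) & (obs >= rain_threshold))
        misses = np.sum((est < rain_threshold) & (obs >= rain_threshold))
        pod = hits / (hits + misses)                 # probability of detection
        rmse = np.sqrt(np.mean((est - obs) ** 2))    # root mean square error
        cc = np.corrcoef(est, obs)[0, 1]             # linear correlation coefficient
        return pod, rmse, cc
    ```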

  8. Intensive Medical Nutrition Therapy: Methods to Improve Nutrition Provision in the Critical Care Setting

    PubMed Central

    Sheean, Patricia M.; Peterson, Sarah J.; Zhao, Weihan; Gurka, David P.; Braunschweig, Carol A.

    2013-01-01

    Patients requiring mechanical ventilation in an intensive care unit commonly fail to attain enteral nutrition (EN) infusion goals. We conducted a cohort study to quantify and compare the percentage of energy and protein received between standard care (n=24) and intensive medical nutrition therapy (MNT) (n=25) participants; to assess whether the percentage of energy and protein received varied by nutritional status; and to identify barriers to EN provision. Intensive MNT entailed providing energy at 150% of estimated needs, using only 2.0 kcal/cc enteral formula and 24-hour infusions. Estimated energy and protein needs were calculated using 30 kcal/kg and 1.2 g protein/kg actual or obesity-adjusted admission body weight. Subjective global assessment was completed to ascertain admission intensive care unit nutritional status. Descriptive statistics and survival analyses were conducted to examine time until attaining 100% of feeding targets. Patients had similar estimated energy and protein needs, and 51% were both admitted with respiratory failure and classified as normally nourished (n=25/49). Intensive MNT recipients achieved a greater percentage of daily estimated energy and protein needs than standard care recipients (1,198±493 vs 475±480 kcal, respectively, P<0.0001; and 53±25 vs 29±32 g, respectively, P=0.007) despite longer intensive care unit stays. Cox proportional hazards models showed that intensive MNT patients were 6.5 (95% confidence interval 2.1 to 29.0) and 3.6 (95% confidence interval 1.2 to 15.9) times more likely to achieve 100% of estimated energy and protein needs, respectively, controlling for confounders. Malnourished patients (n=13) received significantly less energy (P=0.003) and protein (P=0.004) compared with normally nourished (n=11) patients receiving standard care. Nutritional status did not affect feeding intakes in the intensive MNT group. Clinical management, lack of physician orders, and gastrointestinal issues involving ileus, gastrointestinal hemorrhage, and EN delivery were the most frequent clinical impediments to EN provision. It was concluded that intensive MNT could achieve higher volumes of EN infusion, regardless of nutritional status. Future studies are needed to advance this methodology and to assess its influence on outcomes. PMID:22579721
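
    The feeding targets in the study reduce to simple arithmetic; a sketch, assuming (per the abstract's wording) that only the energy target, not protein, is scaled to 150% in the intensive arm:

    ```python
    def estimated_needs(weight_kg, intensive=False):
        """Daily targets per the study: 30 kcal/kg and 1.2 g protein/kg of actual
        or obesity-adjusted admission weight; intensive MNT feeds energy at 150%
        of the estimate."""
        energy_kcal = 30.0 * weight_kg * (1.5 if intensive else 1.0)
        protein_g = 1.2 * weight_kg
        return energy_kcal, protein_g

    def infusion_rate_cc_per_h(energy_kcal, density_kcal_per_cc=2.0, hours=24):
        """cc/h of a 2.0 kcal/cc formula infused continuously over 24 h."""
        return energy_kcal / density_kcal_per_cc / hours

    # e.g. a 70-kg patient: 2100 kcal standard vs 3150 kcal intensive,
    # the latter delivered at about 66 cc/h of 2.0 kcal/cc formula.
    ```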

  9. Estimation of confidence limits for descriptive indexes derived from autoregressive analysis of time series: Methods and application to heart rate variability.

    PubMed

    Beda, Alessandro; Simpson, David M; Faes, Luca

    2017-01-01

    The growing interest in personalized medicine requires making inferences from descriptive indexes estimated from individual recordings of physiological signals, with statistical analyses focused on individual differences between/within subjects, rather than comparing supposedly homogeneous cohorts. To this end, methods to compute confidence limits of individual estimates of descriptive indexes are needed. This study introduces numerical methods to compute such confidence limits and perform statistical comparisons between indexes derived from autoregressive (AR) modeling of individual time series. Analytical approaches are generally not viable, because the indexes are usually nonlinear functions of the AR parameters. We exploit Monte Carlo (MC) and Bootstrap (BS) methods to reproduce the sampling distribution of the AR parameters and indexes computed from them. Here, these methods are implemented for spectral and information-theoretic indexes of heart-rate variability (HRV) estimated from AR models of heart-period time series. First, the MC and BS methods are tested in a wide range of synthetic HRV time series, showing good agreement with a gold-standard approach (i.e. multiple realizations of the "true" process driving the simulation). Then, real HRV time series measured from volunteers performing cognitive tasks are considered, documenting (i) the strong variability of confidence limits' width across recordings, (ii) the diversity of individual responses to the same task, and (iii) frequent disagreement between the cohort-average response and that of many individuals. We conclude that MC and BS methods are robust in estimating confidence limits of these AR-based indexes and are thus recommended for short-term HRV analysis. Moreover, the strong inter-individual differences in the response to tasks shown by AR-based indexes evidence the need for individual-by-individual assessments of HRV features. Given their generality, MC and BS methods are promising for applications in biomedical signal processing and beyond, providing a powerful new tool for assessing the confidence limits of indexes estimated from individual recordings.
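
    A compact sketch of the BS (residual-resampling bootstrap) variant for one AR-based index. The AR order, the single-frequency "HF power" index, and all tuning values are placeholders rather than the paper's choices; the MC variant would instead simulate new series from the fitted model with Gaussian innovations.

    ```python
    import numpy as np

    def fit_ar(x, p):
        """Least-squares fit of an AR(p) model; returns coefficients and residuals."""
        x = np.asarray(x, float)
        X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
        y = x[p:]
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        return a, y - X @ a

    def ar_index(a, resid, f=0.25):
        """Placeholder index: AR spectral density at frequency f (cycles/beat)."""
        e = np.exp(-2j * np.pi * f * np.arange(1, len(a) + 1))
        return np.var(resid) / np.abs(1.0 - np.sum(a * e)) ** 2

    def bootstrap_ci(x, p=8, n_boot=500, alpha=0.05, seed=0):
        """Percentile confidence limits for the index via residual resampling."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x, float)
        a, resid = fit_ar(x, p)
        vals = []
        for _ in range(n_boot):
            e = rng.choice(resid, size=len(x), replace=True)
            xb = np.empty(len(x))
            xb[:p] = x[:p]
            for t in range(p, len(x)):
                xb[t] = a @ xb[t - p : t][::-1] + e[t]   # regenerate the series
            vals.append(ar_index(*fit_ar(xb, p)))
        lo, hi = np.quantile(vals, [alpha / 2.0, 1.0 - alpha / 2.0])
        return ar_index(a, resid), (lo, hi)
    ```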

  10. Estimation of confidence limits for descriptive indexes derived from autoregressive analysis of time series: Methods and application to heart rate variability

    PubMed Central

    2017-01-01

    The growing interest in personalized medicine requires making inferences from descriptive indexes estimated from individual recordings of physiological signals, with statistical analyses focused on individual differences between/within subjects, rather than comparing supposedly homogeneous cohorts. To this end, methods to compute confidence limits of individual estimates of descriptive indexes are needed. This study introduces numerical methods to compute such confidence limits and perform statistical comparisons between indexes derived from autoregressive (AR) modeling of individual time series. Analytical approaches are generally not viable, because the indexes are usually nonlinear functions of the AR parameters. We exploit Monte Carlo (MC) and Bootstrap (BS) methods to reproduce the sampling distribution of the AR parameters and indexes computed from them. Here, these methods are implemented for spectral and information-theoretic indexes of heart-rate variability (HRV) estimated from AR models of heart-period time series. First, the MC and BS methods are tested in a wide range of synthetic HRV time series, showing good agreement with a gold-standard approach (i.e. multiple realizations of the "true" process driving the simulation). Then, real HRV time series measured from volunteers performing cognitive tasks are considered, documenting (i) the strong variability of confidence limits' width across recordings, (ii) the diversity of individual responses to the same task, and (iii) frequent disagreement between the cohort-average response and that of many individuals. We conclude that MC and BS methods are robust in estimating confidence limits of these AR-based indexes and are thus recommended for short-term HRV analysis. Moreover, the strong inter-individual differences in the response to tasks shown by AR-based indexes evidence the need for individual-by-individual assessments of HRV features. Given their generality, MC and BS methods are promising for applications in biomedical signal processing and beyond, providing a powerful new tool for assessing the confidence limits of indexes estimated from individual recordings. PMID:28968394

  11. Estimating Oxygen Needs for Childhood Pneumonia in Developing Country Health Systems: A New Model for Expecting the Unexpected

    PubMed Central

    Bradley, Beverly D.; Howie, Stephen R. C.; Chan, Timothy C. Y.; Cheng, Yu-Ling

    2014-01-01

    Background Planning for the reliable and cost-effective supply of a health service commodity such as medical oxygen requires an understanding of the dynamic need or ‘demand’ for the commodity over time. In developing country health systems, however, collecting longitudinal clinical data for forecasting purposes is very difficult. Furthermore, approaches to estimating demand for supplies based on annual averages can underestimate demand some of the time by missing temporal variability. Methods A discrete event simulation model was developed to estimate variable demand for a health service commodity using the important example of medical oxygen for childhood pneumonia. The model is based on five key factors affecting oxygen demand: annual pneumonia admission rate, hypoxaemia prevalence, degree of seasonality, treatment duration, and oxygen flow rate. These parameters were varied over a wide range of values to generate simulation results for different settings. Total oxygen volume, peak patient load, and hours spent above average-based demand estimates were computed for both low and high seasons. Findings Oxygen demand estimates based on annual average values of demand factors can often severely underestimate actual demand. For scenarios with high hypoxaemia prevalence and degree of seasonality, demand can exceed average levels up to 68% of the time. Even for typical scenarios, demand may exceed three times the average level for several hours per day. Peak patient load is sensitive to hypoxaemia prevalence, whereas time spent at such peak loads is strongly influenced by degree of seasonality. Conclusion A theoretical study is presented whereby a simulation approach to estimating oxygen demand is used to better capture temporal variability compared to standard average-based approaches. This approach provides better grounds for health service planning, including decision-making around technologies for oxygen delivery. Beyond oxygen, this approach is widely applicable to other areas of resource and technology planning in developing country health systems. PMID:24587089
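
    A daily-resolution sketch of the simulation idea: seasonally varying Poisson admissions, a hypoxaemic fraction, and random treatment durations, from which peak load and time-above-average can be read off. All parameter values and distributional choices below are illustrative, not the paper's.

    ```python
    import numpy as np

    def simulate_oxygen_demand(days=365, annual_admissions=1000, hypox_prev=0.15,
                               seasonality=0.5, mean_stay_days=3.0, flow_lpm=1.0,
                               seed=1):
        """Sketch: seasonal Poisson admissions; each hypoxaemic child occupies an
        oxygen source for a random stay at a fixed flow rate."""
        rng = np.random.default_rng(seed)
        base = annual_admissions / 365.0
        t = np.arange(days)
        rate = base * (1.0 + seasonality * np.sin(2.0 * np.pi * t / 365.0))
        load = np.zeros(days)                              # concurrent patients on oxygen
        for day in range(days):
            n_hypox = rng.binomial(rng.poisson(rate[day]), hypox_prev)
            for _ in range(n_hypox):
                stay = max(1, int(round(rng.exponential(mean_stay_days))))
                load[day : day + stay] += 1                # numpy clips past the end
        demand = load * flow_lpm * 60.0 * 24.0             # litres of oxygen per day
        return {
            "peak_patient_load": int(load.max()),
            "fraction_of_days_above_average": float(np.mean(demand > demand.mean())),
            "total_volume_litres": float(demand.sum()),
        }
    ```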

  12. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  13. Introduction to State Estimation of High-Rate System Dynamics

    PubMed Central

    Dodson, Jacob; Joyce, Bryan

    2018-01-01

    Engineering systems experiencing high-rate dynamic events, including airbags, debris detection, and active blast protection systems, could benefit from real-time observability for enhanced performance. However, the task of high-rate state estimation is challenging, in particular for real-time applications where the rate of the observer’s convergence needs to be in the microsecond range. This paper identifies the challenges of state estimation of high-rate systems and discusses the fundamental characteristics of high-rate systems. A survey of applications and methods for estimators that have the potential to produce accurate estimations for a complex system experiencing highly dynamic events is presented. It is argued that adaptive observers are important to this research. In particular, adaptive data-driven observers are advantageous due to their adaptability and lack of dependence on the system model. PMID:29342855

  14. Synchronization for Optical PPM with Inter-Symbol Guard Times

    NASA Astrophysics Data System (ADS)

    Rogalin, R.; Srinivasan, M.

    2017-05-01

    Deep space optical communications promises orders of magnitude growth in communication capacity, supporting high data rate applications such as video streaming and high-bandwidth science instruments. Pulse position modulation is the modulation format of choice for deep space applications, and by inserting inter-symbol guard times between the symbols, the signal carries the timing information needed by the demodulator. Accurately extracting this timing information is crucial to demodulating and decoding this signal. In this article, we propose a number of timing and frequency estimation schemes for this modulation format, and in particular highlight a low complexity maximum likelihood timing estimator that significantly outperforms the prior art in this domain. This method does not require an explicit synchronization sequence, freeing up channel resources for data transmission.

  15. Accelerometer-based wireless body area network to estimate intensity of therapy in post-acute rehabilitation

    PubMed Central

    Choquette, Stéphane; Hamel, Mathieu; Boissy, Patrick

    2008-01-01

    Background It has been suggested that there is a dose-response relationship between the amount of therapy and functional recovery in post-acute rehabilitation care. To this day, only the total time of therapy has been investigated as a potential determinant of this dose-response relationship because of methodological and measurement challenges. The primary objective of this study was to compare time and motion measures during real life physical therapy with estimates of active time (i.e. the time during which a patient is active physically) obtained with a wireless body area network (WBAN) of 3D accelerometer modules positioned at the hip, wrist and ankle. The secondary objective was to assess the differences in estimates of active time when using a single accelerometer module positioned at the hip. Methods Five patients (77.4 ± 5.2 y) with 4 different admission diagnoses (stroke, lower limb fracture, amputation and immobilization syndrome) were recruited in a post-acute rehabilitation center and observed during their physical therapy sessions throughout their stay. Active time was recorded by a trained observer using a continuous time and motion analysis program running on a Tablet-PC. Two WBAN configurations were used: 1) three accelerometer modules located at the hip, wrist and ankle (M3) and 2) one accelerometer located at the hip (M1). Acceleration signals from the WBANs were synchronized with the observations. Estimates of active time were computed based on the temporal density of the acceleration signals. Results A total of 62 physical therapy sessions were observed. Strong associations were found between WBANs estimates of active time and time and motion measures of active time. For the combined sessions, the intraclass correlation coefficient (ICC) was 0.93 (P ≤ 0.001) for M3 and 0.79 (P ≤ 0.001) for M1. The mean percentage of differences between observation measures and estimates from the WBAN of active time was -8.7% ± 2.0% using data from M3 and -16.4% ± 10.4% using data from M1. Conclusion WBANs estimates of active time compare favorably with results from observation-based time and motion measures. While the investigation on the association between active time and outcomes of rehabilitation needs to be studied in a larger scale study, the use of an accelerometer-based WBAN to measure active time is a promising approach that offers a better overall precision than methods relying on work sampling. Depending on the accuracy needed, the use of a single accelerometer module positioned on the hip may still be an interesting alternative to using multiple modules. PMID:18764954
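
    The "temporal density" idea can be sketched as windowed activity detection on the acceleration magnitude; the sampling rate, window length, and threshold below are assumptions, not the values used with the WBAN. For the M3 configuration, per-module estimates would then be fused, for instance by marking a window active if any module is active.

    ```python
    import numpy as np

    def active_time_seconds(acc, fs=50.0, win_s=1.0, thresh_g=0.05):
        """acc: (n, 3) accelerometer samples in g. A window counts as 'active'
        when the magnitude fluctuates by more than a small threshold."""
        mag = np.linalg.norm(np.asarray(acc, float), axis=1)
        win = int(fs * win_s)
        n_win = len(mag) // win
        windows = mag[: n_win * win].reshape(n_win, win)
        active = windows.std(axis=1) > thresh_g          # movement -> variability
        return float(active.sum()) * win_s
    ```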

  16. Modeling environmental noise exceedances using non-homogeneous Poisson processes.

    PubMed

    Guarnaccia, Claudio; Quartieri, Joseph; Barrios, Juan M; Rodrigues, Eliane R

    2014-10-01

    In this work a non-homogeneous Poisson model is considered to study noise exposure. The Poisson process, counting the number of times that a sound level surpasses a threshold, is used to estimate the probability that a population is exposed to high levels of noise a certain number of times in a given time interval. The rate function of the Poisson process is assumed to be of a Weibull type. The presented model is applied to community noise data from Messina, Sicily (Italy). Four sets of data are used to estimate the parameters involved in the model. After the estimation and tuning are made, a way of estimating the probability that an environmental noise threshold is exceeded a certain number of times in a given time interval is presented. This estimation can be very useful in the study of noise exposure of a population and also to predict, given the current behavior of the data, the probability of occurrence of high levels of noise in the near future. One of the most important features of the model is that it implicitly takes into account different noise sources, which need to be treated separately when using usual models.
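
    With a Weibull-type rate, the cumulative intensity has a closed form, so the exceedance probability in any interval is a one-line Poisson computation. Lambda(t) = (t/sigma)**beta is a common parameterization and an assumption here, not necessarily the paper's exact form.

    ```python
    import math

    def weibull_mean_measure(t, sigma, beta):
        """Cumulative intensity Lambda(t) = (t / sigma) ** beta of the NHPP."""
        return (t / sigma) ** beta

    def prob_k_exceedances(k, t1, t2, sigma, beta):
        """P{ N(t1, t2] = k }: Poisson with mean Lambda(t2) - Lambda(t1)."""
        m = weibull_mean_measure(t2, sigma, beta) - weibull_mean_measure(t1, sigma, beta)
        return math.exp(-m) * m ** k / math.factorial(k)

    # e.g. probability the noise threshold is exceeded 3 times in days 10-20:
    p = prob_k_exceedances(3, 10.0, 20.0, sigma=5.0, beta=1.2)
    ```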

  17. Bearings Only Air-to-Air Ranging

    DTIC Science & Technology

    1988-07-25

    directly in front of the observer when first detected, more time will be needed for a good estimate. A sound plan then is for the observer, having...altitude angle to provide an estimate of the z component. Moving targets commonly require some 60 seconds for good estimates of target location and...fixed target case, where a good strategy for the observer can be determined a priori, highly effective maneuvers for the observer in the case of a moving

  18. Fast estimation of space-robots inertia parameters: A modular mathematical formulation

    NASA Astrophysics Data System (ADS)

    Nabavi Chashmi, Seyed Yaser; Malaek, Seyed Mohammad-Bagher

    2016-10-01

    This work aims to propose a new technique that considerably improves the time and precision needed to identify "Inertia Parameters (IPs)" of a typical Autonomous Space-Robot (ASR). Operations might include capturing an unknown Target Space-Object (TSO), "active space-debris removal" or "automated in-orbit assemblies". In these operations, generating precise successive commands is essential to the success of the mission. We show how a generalized, repeatable estimation process could play an effective role in managing the operation. With the help of the well-known Force-Based approach, a new "modular formulation" has been developed to simultaneously identify IPs of an ASR while it captures a TSO. The idea is to reorganize the equations with the associated IPs into a "Modular Set" of matrices instead of a single matrix representing the overall system dynamics. The devised Modular Matrix Set then facilitates the estimation process. It provides a conjugate linear model in the mass and inertia terms. The new formulation is, therefore, well-suited for "simultaneous estimation processes" using recursive algorithms like RLS. Further enhancements would be needed for cases where the effect of the center of mass location becomes important. Extensive case studies reveal that the estimation time is drastically reduced, which in turn paves the way to acquire better results.
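
    A generic recursive-least-squares sketch of the kind of estimator the formulation is built for: theta stacks the unknown mass/inertia terms and phi_t is the regressor assembled from measured motion. This is a plain RLS with forgetting, not the paper's modular matrix set; the modular formulation's payoff is that the regressor can be assembled block by block rather than from one monolithic dynamics matrix.

    ```python
    import numpy as np

    class RLS:
        """Recursive least squares for y_t = phi_t @ theta + noise."""
        def __init__(self, n, lam=0.99, p0=1e3):
            self.theta = np.zeros(n)       # parameter estimate (mass/inertia terms)
            self.P = np.eye(n) * p0        # covariance of the estimate
            self.lam = lam                 # forgetting factor

        def update(self, phi, y):
            phi = np.asarray(phi, float)
            Pphi = self.P @ phi
            k = Pphi / (self.lam + phi @ Pphi)          # gain
            self.theta += k * (y - phi @ self.theta)    # innovation correction
            self.P = (self.P - np.outer(k, Pphi)) / self.lam
            return self.theta
    ```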

  19. Modeling Estimated Personnel Needs for a Potential Foot and Mouth Disease Outbreak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmons, K; Hullinger, P

    2008-01-29

    Foot and Mouth disease (FMD) is a highly infectious and contagious viral disease affecting cloven-hoofed livestock that was last detected in the United States (US) in 1929. The prevalence of FMD in other countries, as well as the current potential for this virus to be used as a form of agroterrorism, has made preparations for a potential FMD outbreak a national priority. To assist in the evaluation of national preparedness, all 50 states were surveyed via e-mail, telephone and web search to obtain emergency response plans for FMD or for foreign animal diseases in general. Information from 33 states was obtained and analyzed for estimates of personnel resources needed to respond to an outbreak. These estimates were consolidated and enhanced to create a tool that could be used by individual states to better understand the personnel that would be needed to complete various tasks over time during an outbreak response. The estimates were then coupled, post-processing, to the output from FMD outbreaks simulated in California using the Multiscale Epidemiological/Economic Simulation and Analysis (MESA) model at Lawrence Livermore National Laboratory to estimate the personnel resource demands, by task, over the course of an outbreak response.

  20. Space Domain Awareness

    DTIC Science & Technology

    2012-09-01

    the Space Surveillance Network has been tracking orbital objects and maintaining a catalog that allows space operators to safely operate satellites ...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...Distribution Unlimited) backward) in time, but the accuracy degrades as the amount of propagation time increases. Thus, the need to maintain a

  1. A profile of physiotherapy supply in Ireland.

    PubMed

    Eighan, James; Walsh, Brendan; Smith, Samantha; Wren, Maev-Ann; Barron, Steve; Morgenroth, Edgar

    2018-04-13

    The lack of information on public and private physiotherapy supply in Ireland makes current and future resource allocation decisions difficult. This paper estimates the supply of physiotherapists in Ireland and profiles physiotherapists across acute and non-acute sectors, and across public and private practice. It examines geographic variation in physiotherapist supply and the implications of controlling for healthcare need. Physiotherapist headcounts are estimated using Health Service Personnel Census (HSPC) and Irish Society of Chartered Physiotherapists (ISCP) Register data. Headcounts are converted to whole-time equivalents (WTEs) using the HSPC and a survey of ISCP members to account for full- and part-time working practices. Non-acute supply per 10,000 population in each county is estimated to examine geographic inequalities, and the raw population is adjusted in turn for a range of need indicators. An estimated 3172 physiotherapists were practising in Ireland in 2015; 6.8 physiotherapists per 10,000, providing an estimated 2620 WTEs. Females accounted for 74% of supply. Supply was greater in the non-acute sector; 1774 WTEs versus 846 WTEs in the acute sector. Physiotherapists in the acute sector were located mainly in publicly financed institutions (89%) with an even public/private split observed in the non-acute sector. Non-acute physiotherapist supply is unequally distributed across Ireland (Gini coefficient = 0.12; 95% CI 0.08-0.15), and inequalities remain after controlling for variations in healthcare needs across counties. The supply of physiotherapists in Ireland is 30% lower than the EU-28 average. Substantial inequality in the distribution of physiotherapists across counties is observed.
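
    The inequality measure reported above is a standard Gini coefficient over county-level supply per 10,000 population; a sketch (unweighted, which may differ from the paper's exact computation):

    ```python
    import numpy as np

    def gini(x):
        """Gini coefficient of per-capita supply across counties
        (0 = perfectly equal). Uses the sorted-rank formula."""
        x = np.sort(np.asarray(x, float))
        n = len(x)
        ranks = np.arange(1, n + 1)
        return 2.0 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1.0) / n

    # x = WTE physiotherapists per 10,000 (need-adjusted) population, by county
    ```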

  2. A History-based Estimation for LHCb job requirements

    NASA Astrophysics Data System (ADS)

    Rauschmayr, Nathalie

    2015-12-01

    The main goal of a Workload Management System (WMS) is to find and allocate resources for the given tasks. The more and better job information the WMS receives, the easier this task becomes, which directly translates into higher utilization of resources. Traditionally, the information associated with each job, like expected runtime, is defined beforehand by the Production Manager in the best case, and set to fixed arbitrary values by default. In the case of LHCb's Workload Management System, no mechanisms are provided to automate the estimation of job requirements. As a result, much more CPU time is normally requested than actually needed. This presents a particular problem in the context of multicore jobs, since single- and multicore jobs shall share the same resources. Consequently, grid sites need to rely on estimations given by the VOs in order to not decrease the utilization of their worker nodes when making multicore job slots available. The main reason for going to multicore jobs is the reduction of the overall memory footprint. Therefore, it also needs to be studied how the memory consumption of jobs can be estimated. A detailed workload analysis of past LHCb jobs is presented. It includes a study of job features and their correlation with runtime and memory consumption. Following the features, a supervised learning algorithm is developed based on history-based prediction. The aim is to learn over time how job runtime and memory consumption evolve under changes in experiment conditions and software versions. It will be shown that estimation can be notably improved if experiment conditions are taken into account.
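
    A toy sketch of history-based prediction: keep a sliding window of past runtimes per job "key" (e.g., application version plus simulation conditions) and predict with the window mean, so estimates track changing conditions. The key choice, window size, and fallback value are assumptions, not LHCb's actual features.

    ```python
    from collections import defaultdict, deque

    class RuntimePredictor:
        """History-based sketch: per-key sliding window of observed runtimes."""
        def __init__(self, window=50, default_s=3600.0):
            self.hist = defaultdict(lambda: deque(maxlen=window))
            self.default_s = default_s   # used before any history exists

        def predict(self, key):
            h = self.hist[key]
            return sum(h) / len(h) if h else self.default_s

        def observe(self, key, runtime_s):
            self.hist[key].append(runtime_s)

    # p = RuntimePredictor(); p.observe(("Sim09", "2012-cond"), 5200.0)
    # p.predict(("Sim09", "2012-cond"))  -> 5200.0
    ```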

  3. Robust time and frequency domain estimation methods in adaptive control

    NASA Technical Reports Server (NTRS)

    Lamaire, Richard Orville

    1987-01-01

    A robust identification method was developed for use in an adaptive control system. The type of estimator is called the robust estimator, since it is robust to the effects of both unmodeled dynamics and an unmeasurable disturbance. The development of the robust estimator was motivated by a need to provide guarantees in the identification part of an adaptive controller. To enable the design of a robust control system, a nominal model as well as a frequency-domain bounding function on the modeling uncertainty associated with this nominal model must be provided. Two estimation methods are presented for finding parameter estimates, and, hence, a nominal model. One of these methods is based on the well developed field of time-domain parameter estimation. In a second method of finding parameter estimates, a type of weighted least-squares fitting to a frequency-domain estimated model is used. The frequency-domain estimator is shown to perform better, in general, than the time-domain parameter estimator. In addition, a methodology for finding a frequency-domain bounding function on the disturbance is used to compute a frequency-domain bounding function on the additive modeling error due to the effects of the disturbance and the use of finite-length data. The performance of the robust estimator in both open-loop and closed-loop situations is examined through the use of simulations.

  4. Improved theory of time domain reflectometry with variable coaxial cable length for electrical conductivity measurements

    USDA-ARS?s Scientific Manuscript database

    Although empirical models have been developed previously, a mechanistic model is needed for estimating electrical conductivity (EC) using time domain reflectometry (TDR) with variable lengths of coaxial cable. The goals of this study are to: (1) derive a mechanistic model based on multisection tra...

  5. Estimating the time evolution of NMR systems via a quantum-speed-limit-like expression

    NASA Astrophysics Data System (ADS)

    Villamizar, D. V.; Duzzioni, E. I.; Leal, A. C. S.; Auccaise, R.

    2018-05-01

    Finding the solutions of the equations that describe the dynamics of a given physical system is crucial in order to obtain important information about its evolution. However, by using estimation theory, it is possible to obtain, under certain limitations, some information on its dynamics. The quantum-speed-limit (QSL) theory was originally used to estimate the shortest time in which a Hamiltonian drives an initial state to a final one for a given fidelity. Using the QSL theory in a slightly different way, we are able to estimate the running time of a given quantum process. For that purpose, we impose the saturation of the Anandan-Aharonov bound in a rotating frame of reference where the state of the system travels slower than in the original frame (laboratory frame). Through this procedure it is possible to estimate the actual evolution time in the laboratory frame of reference with good accuracy when compared to previous methods. Our method is tested successfully to predict the time spent in the evolution of nuclear spins 1/2 and 3/2 in NMR systems. We find that the estimated time according to our method is better than previous approaches by up to four orders of magnitude. One disadvantage of our method is that we need to solve a number of transcendental equations, which increases with the system dimension and parameter discretization used to solve such equations numerically.
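
    In the notation commonly used for the Anandan-Aharonov bound (assumed here; the paper's own symbols may differ), the idea can be summarized as follows: the Fubini-Study path length s accumulated by the state fixes the evolution time once the bound is saturated in the rotating frame, where primed quantities are evaluated.

    ```latex
    % Sketch, not the paper's derivation: s is the Fubini-Study path length.
    \begin{aligned}
      s &= \frac{2}{\hbar}\int_{0}^{T}\Delta E(t)\,\mathrm{d}t
         \;\ge\; 2\arccos\left|\langle\psi(0)\vert\psi(T)\rangle\right| ,\\[4pt]
      T_{\text{est}} &= \frac{\hbar\, s'}{2\,\overline{\Delta E}'}
      \qquad\text{(bound saturated in the rotating frame; primed quantities).}
    \end{aligned}
    ```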

  6. Identifying Optimal Temporal Scale for the Correlation of AOD and Ground Measurements of PM2.5 to Improve the Modeling Performance in a Real-Time Air Quality Estimation System

    NASA Technical Reports Server (NTRS)

    Li,Hui; Faruque, Fazlay; Williams, Worth; Al-Hamdan, Mohammad; Luvall, Jeffrey; Crosson, William; Rickman, Douglas; Limaye, Ashutosh

    2008-01-01

    Aerosol optical depth (AOD), derived from satellite measurements using the Moderate Resolution Imaging Spectroradiometer (MODIS), offers indirect estimates of particle matter. Research shows a significant positive correlation between satellite-based measurements of AOD and ground-based measurements of particulate matter with aerodynamic diameter less than or equal to 2.5 micrometers (PM2.5). In addition, satellite observations have also shown great promise in improving estimates of the PM2.5 air quality surface. Research shows that correlations between AOD and ground PM2.5 are affected by a combination of many factors, such as inherent characteristics of satellite observations, terrain, cloud cover, height of the mixing layer, and weather conditions, and thus might vary widely in different regions, different seasons, and even on different days at the same location. Analysis of correlating AOD with ground-measured PM2.5 on a day-to-day basis suggests that the temporal scale (the number of most recent days used for a given run day) for their correlations needs to be considered to improve air quality surface estimates, especially when satellite observations are used in a real-time pollution system. A second reason is that correlation coefficients between AOD and ground PM2.5 cannot be predetermined and need to be calculated for each day's run in a real-time system, because the coefficients can vary over space and time. Few studies have been conducted to explore the optimal way to apply AOD data to improve model accuracies of PM2.5 surface estimation in a real-time air quality system. This paper discusses the best temporal scale for calculating the correlation of AOD and ground particle matter data to improve the results of pollution models in a real-time system.
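
    A sketch of the temporal-scale search for a real-time run: correlate AOD with PM2.5 over the k most recent days for each candidate k and keep the best. The column names and window range are assumptions, not the paper's configuration.

    ```python
    import pandas as pd

    def best_temporal_scale(daily, max_days=30, min_days=3):
        """daily: DataFrame indexed by day with columns 'aod' and 'pm25'
        (names assumed). Returns the window length k whose most recent k
        days give the highest AOD-PM2.5 correlation, and that correlation."""
        best_k, best_r = None, float("-inf")
        for k in range(min_days, max_days + 1):
            r = daily["aod"].tail(k).corr(daily["pm25"].tail(k))
            if pd.notna(r) and r > best_r:
                best_k, best_r = k, r
        return best_k, best_r
    ```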

  7. Determining Source Attenuation History to Support Closure by Natural Attenuation

    DTIC Science & Technology

    2013-11-01

    of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden estimate or any other...aspect of this collection of information , including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for

  8. Microstructure Analyses of Detonation Diamond Nanoparticles

    DTIC Science & Technology

    2012-05-01

    burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing...data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information Send comments regarding this...burden estimate or any other aspect of this collection of information, including suggestions for reducing the burden to Department of Defense

  9. Alaska Native Parkinson’s Disease Registry

    DTIC Science & Technology

    2008-11-01

    OMB No. 0704-0188 Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for...reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of...information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this

  10. Communication Breakdown: DHS Operations During a Cyber Attack

    DTIC Science & Technology

    2010-12-01

    is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of...Presidential Directive, Malware, National Exercise, Quadrennial Homeland Security Review, Trusted Internet Connections, Zero-Day Exploits

  11. MABLE Final Report

    DTIC Science & Technology

    2011-11-30

    modulates or controls the state of Y). The process of identifying these relationships is analogous to what statisticians do during exploratory data ...for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or

  12. Development of an Air-Deployable Ocean Profiler

    DTIC Science & Technology

    2009-01-01

    select the most appropriate technology for each component; sanity check that the selected technologies can meet the design goals; and detailed...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate

  13. Adaptive Campaigning Applied: Australian Army Operations in Iraq and Afghanistan

    DTIC Science & Technology

    2011-05-01

    of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing this collection of information . Send comments regarding this burden estimate or any...other aspect of this collection of information , including suggestions for reducing this burden to Washington Headquarters Services, Directorate for

  14. Optimizing Human Input in Social Network Analysis

    DTIC Science & Technology

    2018-01-23

    of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any...other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for

  15. The Reality Of The Homeland Security Enterprise Information Sharing Environment

    DTIC Science & Technology

    2017-12-01

    THE HOMELAND SECURITY ENTERPRISE INFORMATION SHARING ENVIRONMENT by Michael E. Brown December 2017 Thesis Advisors: Erik Dahl Robert...collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden estimate or

  16. Microscopic approaches to quantum nonequilibrium thermodynamics and information

    DTIC Science & Technology

    2018-02-09

    Microscopic approaches to quantum non-equilibrium thermodynamics and information The views, opinions and/or findings contained in this report are... information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other

  17. Two Invariants of Human-Swarm Interaction

    DTIC Science & Technology

    2018-01-16

    for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data...sources, gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden...estimate or any other aspect of this collection of information , including suggestions for reducing the burden, to Department of Defense, Washington

  18. Coupling Considerations in Assembly Language. Revision 1

    DTIC Science & Technology

    2018-02-13

    reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching...existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments...regarding this burden estimate or any other aspect of this collection of information , including suggestions for reducing the burden, to the Department of

  19. Inclusion of Disaster Resiliency in City/Neighborhood Comprehensive Plans

    DTIC Science & Technology

    2017-09-01

    collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden estimate or...any other aspect of this collection of information , including suggestions for reducing this burden, to Washington headquarters Services, Directorate

  20. Unitary Transformations in 3D Vector Representation of Qutrit States

    DTIC Science & Technology

    2018-03-12

    Representation of Qutrit States Vinod K Mishra Computational and Information Sciences Directorate, ARL Approved for public... information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection information . Send comments regarding this burden estimate or any other aspect

  1. Joint Direct Attack Munition (JDAM)

    DTIC Science & Technology

    2013-12-01

    instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send...0704-0188 Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing...comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to

  2. Activity Recognition for Agent Teams

    DTIC Science & Technology

    2007-07-01

    Uncertainty in Artificial Intelligence (UAI), 1994. [47] S. Intille and A. Bobick. Visual tracking using closed-worlds. Technical Report 294, MIT Media Lab...information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other

  3. The Effects of Physical Impairment on Shooting Performance

    DTIC Science & Technology

    2012-08-01

    Anthropometry Anthropometric data were collected from each participant. Summary anthropometric statistics are shown in table 1. Table 1...information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection information. Send comments regarding this burden estimate or any other aspect of

  4. Small-area estimation of forest attributes within fire boundaries

    Treesearch

    T. Frescino; G. Moisen; K. Adachi; J. Breidt

    2014-01-01

    Wildfires are gaining more attention every year as they burn more frequently, more intensely, and across larger landscapes. Generating timely estimates of forest resources within fire perimeters is important for land managers to quickly determine the impact of fires on U.S. forests. The U.S. Forest Service’s Forest Inventory and Analysis (FIA) program needs tools to...

  5. Bio-inspired vision based robot control using featureless estimations of time-to-contact.

    PubMed

    Zhang, Haijie; Zhao, Jianguo

    2017-01-31

    Marvelous vision-based dynamic behaviors of insects and birds, such as perching, landing, and obstacle avoidance, have inspired scientists to propose the idea of time-to-contact, defined as the time for a moving observer to contact an object or surface if the current velocity is maintained. Since time-to-contact can be estimated directly from consecutive images with only a vision sensor, it is widely used by a variety of robots to fulfill tasks such as obstacle avoidance, docking, chasing, perching, and landing. However, most existing methods for estimating the time-to-contact need to extract and track features during the control process, which is time-consuming and cannot be applied to robots with limited computation power. In this paper, we adopt a featureless estimation method, extend this method to more general settings with angular velocities, and improve the estimation results using Kalman filtering. Further, we design an error-based controller with a gain-scheduling strategy to control the motion of mobile robots. Experiments for both estimation and control are conducted using a customized mobile robot platform with low-cost embedded systems. Onboard experimental results demonstrate the effectiveness of the proposed approach, with the robot being controlled to successfully dock in front of a vertical wall. The estimation and control methods presented in this paper can be applied to computation-constrained miniature robots for agile locomotion such as landing, docking, or navigation.
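
    A sketch of the two ingredients: a direct (featureless) time-to-contact estimate from image gradients, valid for pure approach along the optical axis (the paper's formulation additionally handles angular velocity), and a scalar Kalman filter exploiting the fact that TTC shrinks by dt each frame at constant speed. The noise variances are assumptions.

    ```python
    import numpy as np

    def ttc_featureless(prev, curr, dt=1.0):
        """Brightness constancy under pure approach gives G/tau + E_t = 0 per
        pixel, with G = x*Ix + y*Iy the radial gradient about the image center;
        least squares over all pixels yields tau = -sum(G^2) / sum(G*E_t)."""
        prev, curr = prev.astype(float), curr.astype(float)
        Iy, Ix = np.gradient(curr)
        Et = (curr - prev) / dt
        h, w = curr.shape
        y, x = np.mgrid[0:h, 0:w]
        G = (x - w / 2.0) * Ix + (y - h / 2.0) * Iy
        return -np.sum(G * G) / np.sum(G * Et)

    class TTCKalman:
        """Scalar Kalman filter smoothing noisy per-frame TTC measurements."""
        def __init__(self, q=0.5, r=4.0):
            self.x, self.p, self.q, self.r = None, 1.0, q, r

        def update(self, z, dt=1.0):
            if self.x is None:
                self.x = z
                return self.x
            self.x -= dt                     # predict: TTC shrinks by dt
            self.p += self.q
            k = self.p / (self.p + self.r)   # Kalman gain
            self.x += k * (z - self.x)       # correct with the new measurement
            self.p *= 1.0 - k
            return self.x
    ```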

  6. Needs Assessment for Education and Training in Welding on Maui.

    ERIC Educational Resources Information Center

    Pezzoli, Jean A.

    In fall 1996, Hawaii's Maui Community College undertook a study to determine the demand for welders and welding education over the next 5 years and to estimate the characteristics of such training in terms of time of offering and courses needed. Questionnaires were mailed to a sample of 282 welding and related businesses in Maui, requesting…

  7. 38 CFR 21.8070 - Basic duration of a vocational training program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... vocational rehabilitation, the CP or VRC will estimate the time the child needs to complete a vocational... training period the eligible child needs, the CP or VRC must determine that: (1) The proposed vocational.... In calculating the proposed program's length, the CP or VRC will follow the procedures in § 21.8074(a...

  8. Methods of adjusting the stable estimates of fertility for the effects of mortality decline.

    PubMed

    Abou-Gamrah, H

    1976-03-01

    Summary The paper shows how stable population methods, based on the age structure and the rate of increase, may be used to estimate the demographic measures of a quasi-stable population. After a discussion of known methods for adjusting the stable estimates to allow for the effects of mortality decline two new methods are presented, the application of which requires less information. The first method does not need any supplementary information, and the second method requires an estimate of the difference between the last two five-year intercensal rates of increase, i.e. five times the annual change of the rate of increase during the last ten years. For these new methods we do not need to know the onset year of mortality decline as in the Coale-Demeny method, or a long series of rates of increase as in Zachariah's method.

  9. Methods for Estimating Magnitude and Frequency of Floods in Rural Basins in the Southeastern United States: South Carolina

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2009-01-01

    For more than 50 years, the U.S. Geological Survey (USGS) has been developing regional regression equations that can be used to estimate flood magnitude and frequency at ungaged sites. Flood magnitude relates to the volume of flow that occurs over some period of time and usually is presented in cubic feet per second. Flood frequency relates to the probability of occurrence of a flood; that is, on average, what is the likelihood that a flood with a specified magnitude will occur in any given year (1 percent chance, 10 percent chance, 50 percent chance, and so on). Such flood estimates are needed for the efficient design of bridges, highway embankments, levees, and other structures near streams. In addition, these estimates are needed for the effective planning and management of land and water resources, to protect lives and property in flood-prone areas, and to determine flood-insurance rates.

  10. Bayesian estimates of the incidence of rare cancers in Europe.

    PubMed

    Botta, Laura; Capocaccia, Riccardo; Trama, Annalisa; Herrmann, Christian; Salmerón, Diego; De Angelis, Roberta; Mallone, Sandra; Bidoli, Ettore; Marcos-Gragera, Rafael; Dudek-Godeau, Dorota; Gatta, Gemma; Cleries, Ramon

    2018-04-21

    The RARECAREnet project has updated the estimates of the burden of the 198 rare cancers in each European country. Suspecting that scant data could affect the reliability of statistical analysis, we employed a Bayesian approach to estimate the incidence of these cancers. We analyzed about 2,000,000 rare cancers diagnosed in 2000-2007 provided by 83 population-based cancer registries from 27 European countries. We considered European incidence rates (IRs), calculated over all the data available in RARECAREnet, as a valid a priori to merge with country-specific observed data. Therefore we provided (1) Bayesian estimates of IRs and the yearly numbers of cases of rare cancers in each country; (2) the expected time (T) in years needed to observe one new case; and (3) practical criteria to decide when to use the Bayesian approach. Bayesian and classical estimates did not differ much; substantial differences (>10%) ranged from 77 rare cancers in Iceland to 14 in England. The smaller the population the larger the number of rare cancers needing a Bayesian approach. Bayesian estimates were useful for cancers with fewer than 150 observed cases in a country during the study period; this occurred mostly when the population of the country is small. For the first time the Bayesian estimates of IRs and the yearly expected numbers of cases for each rare cancer in each individual European country were calculated. Moreover, the indicator T is useful to convey incidence estimates for exceptionally rare cancers and in small countries; it far exceeds the professional lifespan of a medical doctor. Copyright © 2018 Elsevier Ltd. All rights reserved.
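
    A sketch of the two quantities: a Gamma-Poisson shrinkage estimate that pulls a country's sparse count toward the Europe-wide rate, and the indicator T as the reciprocal of the expected yearly case count. The prior-strength parameter and the example figures are assumptions; the paper builds its prior from the pooled RARECAREnet data.

    ```python
    def bayes_rate(obs_cases, person_years, prior_rate, prior_strength_py):
        """Gamma-Poisson shrinkage: posterior-mean incidence rate combining a
        country's observed cases with a Europe-wide prior rate whose weight is
        expressed in equivalent person-years (a modelling assumption)."""
        return (obs_cases + prior_rate * prior_strength_py) / \
               (person_years + prior_strength_py)

    def expected_years_to_one_case(rate_per_100k, population):
        """T: expected time (years) until one new case at the estimated rate."""
        return 1.0 / (rate_per_100k / 1e5 * population)

    # e.g. 12 cases over 8 years in a country of 300,000, prior 0.4 per 100k:
    rate = bayes_rate(12, 8 * 300_000, 0.4 / 1e5, prior_strength_py=1_000_000)
    T = expected_years_to_one_case(rate * 1e5, 300_000)
    ```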

  11. Estimating millet production for famine early warning: An application of crop simulation modelling using satellite and ground-based data in Burkina Faso

    USGS Publications Warehouse

    Thornton, P. K.; Bowen, W. T.; Ravelo, A.C.; Wilkens, P. W.; Farmer, G.; Brock, J.; Brink, J. E.

    1997-01-01

    Early warning of impending poor crop harvests in highly variable environments can allow policy makers the time they need to take appropriate action to ameliorate the effects of regional food shortages on vulnerable rural and urban populations. Crop production estimates for the current season can be obtained using crop simulation models and remotely sensed estimates of rainfall in real time, embedded in a geographic information system that allows simple analysis of simulation results. A prototype yield estimation system was developed for the thirty provinces of Burkina Faso. It is based on CERES-Millet, a crop simulation model of the growth and development of millet (Pennisetum spp.). The prototype was used to estimate millet production in contrasting seasons and to derive production anomaly estimates for the 1986 season. Provincial yields simulated halfway through the growing season were generally within 15% of their final (end-of-season) values. Although more work is required to produce an operational early warning system of reasonable credibility, the methodology has considerable potential for providing timely estimates of regional production of the major food crops in countries of sub-Saharan Africa.

  12. Monaural room acoustic parameters from music and speech.

    PubMed

    Kendrick, Paul; Cox, Trevor J; Li, Francis F; Zhang, Yonggang; Chambers, Jonathon A

    2008-07-01

    This paper compares two methods for extracting room acoustic parameters from reverberated speech and music. An approach which uses statistical machine learning, previously developed for speech, is extended to work with music. For speech, reverberation time estimations are within a perceptual difference limen of the true value. For music, virtually all early decay time estimations are within a difference limen of the true value. The estimation accuracy is not good enough in other cases due to differences between the simulated data set used to develop the empirical model and real rooms. The second method carries out a maximum likelihood estimation on decay phases at the end of notes or speech utterances. This paper extends the method to estimate parameters relating to the balance of early and late energies in the impulse response. For reverberation time and speech, the method provides estimations which are within the perceptual difference limen of the true value. For other parameters such as clarity, the estimations are not sufficiently accurate due to the natural reverberance of the excitation signals. Speech is a better test signal than music because of the greater periods of silence in the signal, although music is needed for low frequency measurement.
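    Neither of the paper's blind methods is reproduced here, but the quantity they target is the classic one; a minimal sketch of reverberation time from a measured impulse response via Schroeder integration (textbook approach, synthetic data) is:

```python
import numpy as np

def rt_from_impulse_response(h, fs):
    """Reverberation time (s) from an impulse response via Schroeder decay."""
    energy = np.cumsum(h[::-1] ** 2)[::-1]            # backward energy integral
    decay_db = 10.0 * np.log10(energy / energy[0])
    t = np.arange(len(h)) / fs
    mask = (decay_db <= -5.0) & (decay_db >= -25.0)   # T20 evaluation range
    slope, _ = np.polyfit(t[mask], decay_db[mask], 1)
    return -60.0 / slope                              # time to fall 60 dB

fs = 16000
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
h = np.exp(-3.45 * t) * rng.normal(size=t.size)   # synthetic decaying response
print(rt_from_impulse_response(h, fs))            # close to 2.0 s
```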

  13. Radiation dose optimization in the decommissioning plan for Loviisa NPP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmberg, R.; Eurajoki, T.

    1995-03-01

    Finnish rules for nuclear power require a detailed decommissioning plan to be made and kept up to date already during plant operation. The main reasons for this "premature" plan are, firstly, the need to demonstrate the feasibility of decommissioning and, secondly, the need to make realistic cost estimates in order to set aside funds for this future operation. The decommissioning plan for the Loviisa Nuclear Power Plant (NPP) (2×445 MW, PWR) was issued in 1987. It must be updated about every five years. One important aspect of the plan is an estimate of radiation doses to the decommissioning workers. The doses were recently re-estimated because of a need to decrease the total collective dose estimate in the original plan, 23 manSv. In the update, the dose was reduced by one-third, partly due to changes in protection and procedures, in which ALARA considerations were taken into account, and partly because of re-estimation of the doses.

  14. The global blue-sky albedo change between 2000 and 2015 seen from MODIS

    NASA Astrophysics Data System (ADS)

    Chrysoulakis, N.; Mitraka, Z.; Gorelick, N.

    2016-12-01

    The land surface albedo is a critical physical variable, which influences the Earth's climate by affecting the energy budget and its distribution in the Earth-atmosphere system. Blue-sky albedo estimates provide a quantitative means for better constraining global and regional scale climate models. The Moderate Resolution Imaging Spectroradiometer (MODIS) albedo product includes parameters for the estimation of both the directional-hemispherical surface reflectance (black-sky albedo) and the bi-hemispherical surface reflectance (white-sky albedo). This dataset was used here for the blue-sky albedo estimation over the globe on an 8-day basis at 0.5 km spatial resolution for the whole time period covered by MODIS acquisitions (i.e. 2000 until today). To estimate the blue-sky albedo, the fraction of diffuse radiation is needed, which is a function of the Aerosol Optical Thickness (AOT). The required AOT information was acquired from the MODIS AOT product at 1° × 1° spatial resolution. Since the blue-sky albedo depends on the solar zenith angle (SZA), the 8-day mean blue-sky albedo values were computed as averages of the corresponding values for the representative SZAs covering the 24-hour day. The estimated blue-sky albedo time series was analyzed to capture changes during the 15-year period. All computations were performed using Google Earth Engine (GEE). GEE provided access to all the MODIS products needed for the analysis without the need for searching or downloading. Moreover, the combination of MODIS products in both temporal and spatial terms was fast and effective using the GEE API (Application Program Interface). All the products covering the globe for the 15-year time period were processed via a single collection. Most importantly, GEE allowed for including the calculation of SZAs covering the 24-hour day, which improves the quality of the overall product. The 8-day global products of land surface albedo are available through http://www.rslab.gr/downloads.html
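    The blending step itself follows the commonly used approximation (stated generally here, not necessarily the authors' exact implementation): blue-sky albedo is a weighted mean of black-sky and white-sky albedo, with the weight given by the diffuse fraction of downwelling radiation, which in turn depends on AOT and SZA.

```python
# Common blue-sky albedo approximation; the diffuse_fraction value below is
# made up, and in practice would be derived from AOT and solar zenith angle.

def blue_sky_albedo(black_sky, white_sky, diffuse_fraction):
    """Blend direct-beam (black-sky) and diffuse (white-sky) albedo."""
    return (1.0 - diffuse_fraction) * black_sky + diffuse_fraction * white_sky

print(blue_sky_albedo(black_sky=0.14, white_sky=0.17, diffuse_fraction=0.25))
```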

  15. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation

    PubMed Central

    Beijbom, Oscar; Edmunds, Peter J.; Roelfsema, Chris; Smith, Jennifer; Kline, David I.; Neal, Benjamin P.; Dunlap, Matthew J.; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B. Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time-consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey-images captured at four Pacific coral reefs. Inter- and intra-annotator variability among six human experts was quantified and compared to semi- and fully-automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157

  16. Dependence of atmospheric refractive index structure parameter (Cn2) on the residence time and vertical distribution of aerosols.

    PubMed

    Anand, N; Satheesh, S K; Krishna Moorthy, K

    2017-07-15

    Effects of absorbing atmospheric aerosols in modulating the tropospheric refractive index structure parameter (Cn2) are estimated using high resolution radiosonde and multi-satellite data along with a radiative transfer model. We report the influence of variations in residence time and vertical distribution of aerosols in modulating Cn2 and why the aerosol induced atmospheric heating needs to be considered while estimating a free space optical communication link budget. The results show that performance of the link is seriously affected if large concentrations of absorbing aerosols reside for a long time in the atmospheric path.

  17. 76 FR 45799 - Agency Information Collection Activities; Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... the 2005 survey so that the results from it can be used as a baseline for a time-series analysis. ... 15 minutes to complete the pretest, the same time as that needed for the actual survey. The revised estimate further takes into account the presumed added time required to respond to questions unique to the...

  18. Approaches and Data Quality for Global Precipitation Estimation

    NASA Astrophysics Data System (ADS)

    Huffman, G. J.; Bolvin, D. T.; Nelkin, E. J.

    2015-12-01

    The space and time scales on which precipitation varies are small compared to the satellite coverage that we have, so it is necessary to merge "all" of the available satellite estimates. Differing retrieval capabilities from the various satellites require inter-calibration for the satellite estimates, while "morphing", i.e., Lagrangian time interpolation, is used to lengthen the period over which time interpolation is valid. Additionally, estimates from geostationary-Earth-orbit infrared data are plentiful, but of sufficiently lower quality compared to low-Earth-orbit passive microwave estimates that they are only used when needed. Finally, monthly surface precipitation gauge data can be used to reduce bias and improve patterns of occurrence for monthly satellite data, and short-interval satellite estimates can be improved with a simple scaling such that they sum to the monthly satellite-gauge combination. The presentation will briefly consider some of the design decisions for practical computation of the Global Precipitation Measurement (GPM) mission product Integrated Multi-satellitE Retrievals for GPM (IMERG), then examine design choices that maximize value for end users. For example, data fields are provided in the output file that provide insight into the basis for the estimated precipitation, including error, sensor providing the estimate, precipitation phase (solid/liquid), and intermediate precipitation estimates. Another important initiative is successive computations for the same data date/time at longer latencies as additional data are received, which for IMERG is currently done at 6 hours, 16 hours, and 3 months after observation time. Importantly, users require long records for each latency, which runs counter to the data archiving practices at most archive sites. As well, the assignment of Digital Object Identifiers (DOIs) for near-real-time data sets (at 6 and 16 hours for IMERG) is not a settled issue.
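    The gauge-based rescaling mentioned above is conceptually simple; a toy sketch (not the operational IMERG code, with invented numbers) is:

```python
# Toy version of scaling short-interval satellite estimates so that they sum
# to the monthly satellite-gauge combination; not the operational IMERG code.

def rescale_to_monthly(short_interval_precip, monthly_satgauge_total):
    """Apply one multiplicative factor so intervals sum to the monthly total."""
    total = sum(short_interval_precip)
    if total == 0.0:
        return list(short_interval_precip)  # nothing to rescale
    factor = monthly_satgauge_total / total
    return [p * factor for p in short_interval_precip]

intervals = [0.0, 1.2, 0.4, 3.1]            # mm per interval, invented
print(rescale_to_monthly(intervals, 6.0))   # rescaled values sum to 6.0 mm
```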

  19. Mesospheric temperatures estimated from the meteor decay times over King Sejong Station (62.2°S, 58.8°W), Antarctica

    NASA Astrophysics Data System (ADS)

    Kim, J.; Kim, Y.; Jee, G.

    2010-12-01

    A VHF meteor radar has been operated at King Sejong Station (62.2°S, 58.8°W), Antarctica since March 2007 for observations of the neutral winds in the mesosphere and lower thermosphere region. In addition, the radar observation allows us to estimate the neutral temperature from the measured decay times of the meteor echoes by utilizing Hocking's method (Hocking, 1999). For this temperature estimation, the meteor echoes observed from March 2007 to July 2009 were divided, for the first time, into weak and strong echoes depending on the strength of the estimated relative electron line densities. The estimated temperatures were then compared with the temperature measurements from the spectral airglow temperature imager (SATI), which has also been operated at the same location since 2002. The temperatures estimated from strong echoes were significantly lower than those estimated from weak echoes, by about 31 K on average. As was done in most previous studies, we also derived the temperature by using all echoes without dividing them into weak and strong, which produces temperatures about 10 K lower than those from the weak echoes. Among these three estimated temperatures, the one from weak echoes was most similar to the SATI temperature. This result indicates that the strong echoes tend to reduce the estimated temperature and therefore need to be removed in the estimation procedure. We will also present the comparison of the estimated temperature with other measurements, for example from the TIMED/SABER instrument and the NRLMSISE-00 empirical model results, as a further validation.
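    Hocking's method itself is not detailed in the abstract; the standard underdense-echo relation on which such retrievals rest (textbook form, not necessarily this paper's exact processing) links the measured decay time to the ambipolar diffusion coefficient, from which temperature follows via a pressure model:

```python
import math

# Standard underdense meteor echo relation: amplitude decays as
# exp(-16*pi^2*Da*t / wavelength^2), so the half-amplitude decay time tau
# gives the ambipolar diffusion coefficient Da. Values below are invented.

def ambipolar_diffusion(wavelength_m, tau_half_s):
    """Da (m^2/s) from radar wavelength and half-amplitude decay time."""
    return (wavelength_m ** 2) * math.log(2) / (16 * math.pi ** 2 * tau_half_s)

# Example: a ~33 MHz VHF radar (wavelength ~9 m) and a 0.05 s decay time.
Da = ambipolar_diffusion(9.0, 0.05)   # ~7 m^2/s, plausible near 90 km
# Temperature then follows from Da together with a pressure (or density)
# model, since Da scales roughly as T**2 / p near the meteor peak height.
print(Da)
```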

  20. Does Contraceptive Use in the United States Meet Global Goals?

    PubMed

    Frederiksen, Brittni N; Ahrens, Katherine A; Moskosky, Susan; Gavin, Loretta

    2017-12-01

    The United Nations Sustainable Development Goals (SDGs) seek to achieve health equity, and they apply to all countries. SDG contraceptive use estimates for the United States are needed to contextualize U.S. performance in relation to that of other countries. Data from the 2011-2013 and 2013-2015 waves of the National Survey of Family Growth were used to calculate three SDG indicators of contraceptive use for U.S. women aged 15-44: contraceptive prevalence, unmet need for family planning and demand for family planning satisfied by modern methods. These measures were calculated separately for married or cohabiting women and for unmarried, sexually active women; differences by sociodemographic characteristics were assessed using t tests from logistic regression analysis. Estimates for married women were compared with 2010-2015 estimates from 94 other countries, most of which were low- or middle-income. For married or cohabiting women, U.S. estimates for contraceptive prevalence, unmet need and demand satisfied by modern methods were 74%, 9% and 80%, respectively; for unmarried, sexually active women, they were 85%, 11% and 82%, respectively. Estimates varied by sociodemographic characteristics, particularly among married or cohabiting women. Five countries performed better than the United States on contraceptive prevalence, 12 on unmet need and four on both measures; seven performed better on demand satisfied by modern methods. There is a need to continue efforts to expand access to contraceptive care in the United States, and to monitor the SDG indicators so that improvement can be tracked over time. Copyright © 2017 by the Guttmacher Institute.
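    For readers unfamiliar with the indicators, the standard SDG definitions relate as follows; in this sketch the modern-method prevalence is an assumed number, not taken from the article:

```python
# Standard SDG family-planning indicator relationship (sketch). The
# modern-method prevalence below is assumed for illustration only.

def demand_satisfied_modern(pct_modern, pct_any_method, pct_unmet_need):
    """Share of total demand (any-method use + unmet need) met by modern methods."""
    total_demand = pct_any_method + pct_unmet_need
    return 100.0 * pct_modern / total_demand

# With 74% any-method use and 9% unmet need (figures reported above) and an
# assumed 66.5% modern-method use, demand satisfied comes out near 80%.
print(demand_satisfied_modern(66.5, 74.0, 9.0))
```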

  1. The never ending road: improving, adapting and refining a needs-based model to estimate future general practitioner requirements in two Australian states.

    PubMed

    Laurence, Caroline O; Heywood, Troy; Bell, Janice; Atkinson, Kaye; Karnon, Jonathan

    2018-03-27

    Health workforce planning models have been developed to estimate the future health workforce requirements of the population they serve and have been used to inform policy decisions. This study aimed to adapt and further develop a need-based GP workforce simulation model to incorporate the current and estimated geographic distribution of patients and GPs. A need-based simulation model that estimates the supply of GPs and the levels of services required in South Australia (SA) was adapted and applied to the Western Australian (WA) workforce. The main outcome measure was the difference between the number of full-time equivalent (FTE) GPs supplied and required from 2013 to 2033. The base scenario estimated a shortage of GPs in WA from 2019 onwards, reaching 493 FTE GPs in 2033, while for SA, estimates showed an oversupply over the projection period. The WA urban and rural models estimated an urban shortage of GPs over this period. A reduced international medical graduate recruitment scenario resulted in estimated shortfalls of GPs by 2033 for both WA and SA. The WA-specific scenarios of lower population projections and registrar work value resulted in a reduced shortage of FTE GPs in 2033, while unfilled training places increased the shortfall of FTE GPs in 2033. The simulation model incorporates contextual differences in its structure that allow within- and cross-jurisdictional comparisons of workforce estimations. It also provides greater insight into the drivers of supply and demand and the impact of changes in workforce policy, promoting more informed decision-making.

  2. Contract-Based Integration of Cyber-Physical Analyses

    DTIC Science & Technology

    2014-10-14

  3. Mems-Based Waste Vibration and Acoustic Energy Harvesters

    DTIC Science & Technology

    2014-12-01

  4. Coupled Ocean-Atmosphere Dynamics and Predictability of MJO’s

    DTIC Science & Technology

    2012-09-30

    Previous studies analyzed ocean color satellite data and suggested that the primary mechanism of surface chlorophyll modulation by the MJO...

  5. Coupled Ocean-Atmosphere Dynamics and Predictability of MJO’s

    DTIC Science & Technology

    2012-09-30

    Previous studies analyzed ocean color satellite data and suggested that the primary mechanism of surface chlorophyll modulation by the MJO...

  6. An Evaluation of Shipyard Practices and Their Correlation to Ship Costs

    DTIC Science & Technology

    2017-12-01

  7. Characterizing Candidate Oncogenes at 8q21 in Breast Cancer

    DTIC Science & Technology

    2008-03-01

  8. Applications of Electromagnetic Waves to Problems in Nondestructive Testing and Target Identification

    DTIC Science & Technology

    2014-09-09

  9. Defence Science and Technology Strategy. Science and Technology for a Secure Canada

    DTIC Science & Technology

    2006-12-01

  10. Spectral Analysis for DIAL and Lidar Detection of TATP

    DTIC Science & Technology

    2008-08-13

  11. Far Infrared Photonic Crystals Operating in the Reststrahl Region

    DTIC Science & Technology

    2007-08-20

  12. Overcoming Resistance to Trastuzumab in HER2-Amplified Breast Cancers

    DTIC Science & Technology

    2011-08-01

  13. Second-Order Active NLO Chromophores for DNA Based Electro-Optics Materials

    DTIC Science & Technology

    2010-09-21

  14. Post-Remediation Evaluation of EVO Treatment: How Can We Improve Performance

    DTIC Science & Technology

    2017-11-15

  15. Attribution In Influence: Relative Power And The Use Of Attribution

    DTIC Science & Technology

    2017-12-01

  16. Direct Thermodynamic Measurements of the Energetics of Information Processing

    DTIC Science & Technology

    2017-08-08

  17. Role of the U.S. Government in the Cybersecurity of Private Entities

    DTIC Science & Technology

    2017-12-01

  18. Correlation Immunity, Avalanche Features, and Other Cryptographic Properties of Generalized Boolean Functions

    DTIC Science & Technology

    2017-09-01

  19. Bioinspired Surface Treatments for Improved Decontamination: Handling and Decontamination Considerations

    DTIC Science & Technology

    2018-03-16

  20. Workshop on Information Engines at the Frontiers of Nanoscale Thermodynamics

    DTIC Science & Technology

    2017-11-01

  1. Optimizing Sparse Representations of Kinetic Distributions via Information Theory

    DTIC Science & Technology

    2017-07-31

  2. How The Democratization Of Technology Enhances Intelligence-Led Policing And Serves The Community

    DTIC Science & Technology

    2017-12-01

  3. Navy And Marine Corps IT/IS Acquisition: A Way Forward

    DTIC Science & Technology

    2017-12-01

  4. Propagation of Statistical Noise Through a Two-Qubit Maximum Likelihood Tomography

    DTIC Science & Technology

    2018-04-01

    Daniel E Jones, Brian T Kirby, and Michael Brodsky; Computational and Information Sciences Directorate, ARL.

  5. DoD Software Intensive Systems Development: A Hit and Miss Process

    DTIC Science & Technology

    2015-05-01

  6. Epigenetic Regulation of microRNA Expression: Targeting the Triple-Negative Breast Cancer Phenotype

    DTIC Science & Technology

    2011-10-01

  7. The Effect of Modified Eye Position on Shooting Performance

    DTIC Science & Technology

    2011-04-01

    ...participants was 20/20, with one participant aided by corrective contact lenses. Anthropometric data was collected from each...

  8. Human Systems Integration (HSI) in Acquisition. HSI Domain Guide

    DTIC Science & Technology

    2009-08-01

    ...job simulation that includes posture data, force parameters, and anthropometry. Output includes the percentage of men and women who have the strength...

  9. Method to Estimate the Dissolved Air Content in Hydraulic Fluid

    NASA Technical Reports Server (NTRS)

    Hauser, Daniel M.

    2011-01-01

    In order to verify the air content in hydraulic fluid, an instrument was needed to measure the dissolved air content before the fluid was loaded into the system. The instrument also needed to measure the dissolved air content in situ and in real time during the de-aeration process. The current methods used to measure the dissolved air content require the fluid to be drawn from the hydraulic system, and additional offline laboratory processing time is involved. During laboratory processing, there is a potential for contamination to occur, especially when subsaturated fluid is to be analyzed. A new method measures the amount of dissolved air in hydraulic fluid through the use of a dissolved oxygen meter. The device measures the dissolved air content through an in situ, real-time process that requires no additional offline laboratory processing time. The method utilizes an instrument that measures the partial pressure of oxygen in the hydraulic fluid. By using a standardized calculation procedure that relates the oxygen partial pressure to the volume of dissolved air in solution, the dissolved air content is estimated. The technique employs luminescent quenching technology to determine the partial pressure of oxygen in the hydraulic fluid. An estimated Henry's law coefficient for oxygen and nitrogen in hydraulic fluid is calculated using a standard method to estimate the solubility of gases in lubricants. The amount of dissolved oxygen in the hydraulic fluid is estimated using the Henry's solubility coefficient and the measured partial pressure of oxygen in solution. The amount of dissolved nitrogen that is in solution is estimated by assuming that the ratio of dissolved nitrogen to dissolved oxygen is equal to the ratio of the gas solubility of nitrogen to oxygen at atmospheric pressure and temperature. The technique was performed at atmospheric pressure and room temperature. The technique could theoretically be carried out at higher pressures and elevated temperatures.
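    A rough sketch of the calculation chain described above, with purely hypothetical solubility coefficients (the abstract gives none), might look like:

```python
# Hypothetical sketch of estimating dissolved air in hydraulic fluid from a
# measured O2 partial pressure. Coefficient values and units are illustrative
# only; they are not from the article.

def dissolved_air_fraction(p_o2_atm, h_o2, h_n2,
                           p_atm_o2=0.209, p_atm_n2=0.781):
    """Estimate dissolved-air volume fraction in the fluid.
    h_o2, h_n2: assumed solubility coefficients (vol gas / vol fluid / atm).
    N2 is assumed to follow the N2/O2 solubility ratio at atmospheric
    composition, as described in the abstract."""
    v_o2 = h_o2 * p_o2_atm                            # from the measurement
    v_n2 = v_o2 * (h_n2 * p_atm_n2) / (h_o2 * p_atm_o2)
    return v_o2 + v_n2

# Example: measured O2 partial pressure of 0.18 atm, assumed coefficients.
print(dissolved_air_fraction(p_o2_atm=0.18, h_o2=0.30, h_n2=0.15))
```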

  10. For Mole Problems, Call Avogadro: 602-1023.

    ERIC Educational Resources Information Center

    Uthe, R. E.

    2002-01-01

    Describes techniques to help introductory students become familiar with Avogadro's number and mole calculations. Techniques involve estimating numbers of common objects then calculating the length of time needed to count large numbers of them. For example, the immense amount of time required to count a mole of sand grains at one grain per second…

  11. Estimating Alarm Thresholds for Process Monitoring Data under Different Assumptions about the Data Generating Mechanism

    DOE PAGES

    Burr, Tom; Hamada, Michael S.; Howell, John; ...

    2013-01-01

    Process monitoring (PM) for nuclear safeguards sometimes requires estimation of thresholds corresponding to small false alarm rates. Threshold estimation dates to the 1920s with the Shewhart control chart; however, because possible new roles for PM are being evaluated in nuclear safeguards, it is timely to consider modern model selection options in the context of threshold estimation. One of the possible new PM roles involves PM residuals, where a residual is defined as residual = data − prediction. This paper reviews alarm threshold estimation, introduces model selection options, and considers a range of assumptions regarding the data-generating mechanism for PM residuals. Two PM examples from nuclear safeguards are included to motivate the need for alarm threshold estimation. The first example involves mixtures of probability distributions that arise in solution monitoring, which is a common type of PM. The second example involves periodic partial cleanout of in-process inventory, leading to challenging structure in the time series of PM residuals.
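    As a minimal illustration of the threshold-setting problem (not the authors' method), one can compare an empirical quantile with a normal-model quantile on simulated residuals; which is appropriate depends on the assumed data-generating mechanism, which is exactly the paper's point:

```python
import numpy as np
from statistics import NormalDist

# Sketch: set an alarm threshold for PM residuals at a small target
# false-alarm rate, empirically and under a fitted normal model. Real
# residuals (e.g. mixtures) can make the normal quantile badly miscalibrated.

rng = np.random.default_rng(1)
residuals = rng.normal(0.0, 1.0, size=5000)   # stand-in for PM residuals

alpha = 0.001                                  # target false-alarm rate
empirical = float(np.quantile(residuals, 1 - alpha))

mu, sigma = residuals.mean(), residuals.std(ddof=1)
parametric = mu + sigma * NormalDist().inv_cdf(1 - alpha)

print(empirical, parametric)                   # both near 3.09 for N(0, 1)
```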

  12. The effect of concurrent hand movement on estimated time to contact in a prediction motion task.

    PubMed

    Zheng, Ran; Maraj, Brian K V

    2018-04-27

    In many activities, we need to predict the arrival of an occluded object. This action is called prediction motion or motion extrapolation. Previous researchers have found that both eye tracking and the internal clocking model are involved in the prediction motion task. Additionally, it has been reported that concurrent hand movement facilitates the eye tracking of an externally generated target in a tracking task, even if the target is occluded. The present study examined the effect of concurrent hand movement on the estimated time to contact (TTC) in a prediction motion task. We found that different (accurate/inaccurate) concurrent hand movements had opposite effects on eye tracking accuracy and estimated TTC in the prediction motion task. That is, accurate concurrent hand tracking enhanced eye tracking accuracy and tended to increase the precision of estimated TTC, whereas inaccurate concurrent hand tracking decreased eye tracking accuracy and disrupted estimated TTC. However, eye tracking accuracy does not determine the precision of estimated TTC.

  13. Estimating the effect of a rare time-dependent treatment on the recurrent event rate.

    PubMed

    Smith, Abigail R; Zhu, Danting; Goodrich, Nathan P; Merion, Robert M; Schaubel, Douglas E

    2018-05-30

    In many observational studies, the objective is to estimate the effect of treatment or state-change on the recurrent event rate. If treatment is assigned after the start of follow-up, traditional methods (eg, adjustment for baseline-only covariates or fully conditional adjustment for time-dependent covariates) may give biased results. We propose a two-stage modeling approach using the method of sequential stratification to accurately estimate the effect of a time-dependent treatment on the recurrent event rate. At the first stage, we estimate the pretreatment recurrent event trajectory using a proportional rates model censored at the time of treatment. Prognostic scores are estimated from the linear predictor of this model and used to match treated patients to as yet untreated controls based on prognostic score at the time of treatment for the index patient. The final model is stratified on matched sets and compares the posttreatment recurrent event rate to the recurrent event rate of the matched controls. We demonstrate through simulation that bias due to dependent censoring is negligible, provided the treatment frequency is low, and we investigate a threshold at which correction for dependent censoring is needed. The method is applied to liver transplant (LT), where we estimate the effect of development of post-LT End Stage Renal Disease (ESRD) on rate of days hospitalized. Copyright © 2018 John Wiley & Sons, Ltd.

  14. Efficient high-rate satellite clock estimation for PPP ambiguity resolution using carrier-ranges.

    PubMed

    Chen, Hua; Jiang, Weiping; Ge, Maorong; Wickert, Jens; Schuh, Harald

    2014-11-25

    In order to capture the short-term clock variation of GNSS satellites, clock corrections must be estimated and updated at a high rate for Precise Point Positioning (PPP). This estimation is already very time-consuming for the GPS constellation alone, as a great number of ambiguities need to be estimated simultaneously. However, on the one hand better estimates are expected by including more stations, and on the other hand satellites from different GNSS systems must be processed together for a reliable multi-GNSS positioning service. To alleviate the heavy computational burden, epoch-differenced observations are commonly employed, in which the ambiguities are eliminated. Because the epoch-differenced method can only derive temporal clock changes, which have to be aligned to the absolute clocks in a rather complicated way, in this paper an efficient method for high-rate clock estimation is proposed using the concept of "carrier-range", realized by means of PPP with integer ambiguity resolution. Processing procedures for both post-processing and real-time processing are developed. The experimental validation shows that the computation time can be reduced to about one sixth of that of existing methods for post-processing, and to less than 1 s for processing a single epoch of a network of about 200 stations in real-time mode after all ambiguities are fixed. This confirms that the proposed processing strategy will enable high-rate clock estimation for future multi-GNSS networks in post-processing and possibly also in real-time mode.
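    A conceptual sketch of the "carrier-range" idea (mine, not the authors' software): once the integer ambiguity is fixed, the phase observation becomes an unambiguous range, so the epoch clock solution no longer needs to carry ambiguity parameters.

```python
# Highly simplified single-observation sketch of the carrier-range concept.
# Receiver clock, troposphere, and other terms are lumped together; a real
# solver adjusts clocks for a whole network by least squares.

C = 299_792_458.0  # speed of light, m/s

def carrier_range(phase_m, wavelength_m, n_fixed):
    """Carrier-phase observation (metres) minus the fixed ambiguity term."""
    return phase_m - wavelength_m * n_fixed

def satellite_clock(carrier_range_m, geometric_range_m, other_terms_m):
    """Clock correction (seconds) from one unambiguous observation."""
    return (carrier_range_m - geometric_range_m - other_terms_m) / C

# Toy numbers: a fixed ambiguity of 1000 cycles on GPS L1 (~0.19 m wavelength)
# and a 10 m apparent clock offset -> about 33 ns.
cr = carrier_range(phase_m=20_000_010.0 + 0.1903 * 1000,
                   wavelength_m=0.1903, n_fixed=1000)
print(satellite_clock(cr, geometric_range_m=20_000_000.0, other_terms_m=0.0))
```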

  15. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    PubMed

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.

  16. Getting to the point: Rapid point selection and variable density InSAR time series for urban deformation monitoring

    NASA Astrophysics Data System (ADS)

    Spaans, K.; Hooper, A. J.

    2017-12-01

    The short revisit time and high data acquisition rates of current satellites have resulted in increased interest in the development of deformation monitoring and rapid disaster response capability, using InSAR. Fast, efficient data processing methodologies are required to deliver the timely results necessary for this, and also to limit computing resources required to process the large quantities of data being acquired. In contrast to volcano or earthquake applications, urban monitoring requires high resolution processing, in order to differentiate movements between buildings, or between buildings and the surrounding land. Here we present Rapid time series InSAR (RapidSAR), a method that can efficiently update high resolution time series of interferograms, and demonstrate its effectiveness over urban areas. The RapidSAR method estimates the coherence of pixels on an interferogram-by-interferogram basis. This allows for rapid ingestion of newly acquired images without the need to reprocess the earlier acquired part of the time series. The coherence estimate is based on ensembles of neighbouring pixels with similar amplitude behaviour through time, which are identified on an initial set of interferograms, and need be re-evaluated only occasionally. By taking into account scattering properties of points during coherence estimation, a high quality coherence estimate is achieved, allowing point selection at full resolution. The individual point selection maximizes the amount of information that can be extracted from each interferogram, as no selection compromise has to be reached between high and low coherence interferograms. In other words, points do not have to be coherent throughout the time series to contribute to the deformation time series. We demonstrate the effectiveness of our method over urban areas in the UK. We show how the algorithm successfully extracts high density time series from full resolution Sentinel-1 interferograms, and distinguishes clearly between buildings and surrounding vegetation or streets. The fact that new interferograms can be processed separately from the remainder of the time series helps manage the high data volumes, both in space and time, generated by current missions.

  17. Joint Bearing and Range Estimation of Multiple Objects from Time-Frequency Analysis.

    PubMed

    Liu, Jeng-Cheng; Cheng, Yuang-Tung; Hung, Hsien-Sen

    2018-01-19

    Direction-of-arrival (DOA) and range estimation is an important issue of sonar signal processing. In this paper, a novel approach using Hilbert-Huang transform (HHT) is proposed for joint bearing and range estimation of multiple targets based on a uniform linear array (ULA) of hydrophones. The structure of this ULA based on micro-electro-mechanical systems (MEMS) technology, and thus has attractive features of small size, high sensitivity and low cost, and is suitable for Autonomous Underwater Vehicle (AUV) operations. This proposed target localization method has the following advantages: only a single snapshot of data is needed and real-time processing is feasible. The proposed algorithm transforms a very complicated nonlinear estimation problem to a simple nearly linear one via time-frequency distribution (TFD) theory and is verified with HHT. Theoretical discussions of resolution issue are also provided to facilitate the design of a MEMS sensor with high sensitivity. Simulation results are shown to verify the effectiveness of the proposed method.

  18. A computational approach to estimate postmortem interval using opacity development of eye for human subjects.

    PubMed

    Cantürk, İsmail; Özyılmaz, Lale

    2018-07-01

    This paper presents an approach to postmortem interval (PMI) estimation, which is a much-debated and complicated area of forensic science. Most of the methods reported in the literature to determine PMI are not practical because they require skilled personnel and significant amounts of time, and they give unsatisfactory results. Additionally, the error margin of PMI estimation increases proportionally with elapsed time after death. It is crucial to develop practical PMI estimation methods for forensic science. In this study, a computational system is developed to determine the PMI of human subjects by investigating postmortem opacity development of the eye. Relevant features were extracted from the eye images using image processing techniques to reflect gradual opacity development. The features were then investigated to predict the time after death using machine learning methods. The experimental results prove that the development of opacity can be utilized as a practical computational tool to determine PMI for human subjects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Independent evaluation of point source fossil fuel CO2 emissions to better than 10%

    PubMed Central

    Turnbull, Jocelyn Christine; Keller, Elizabeth D.; Norris, Margaret W.; Wiltshire, Rachael M.

    2016-01-01

    Independent estimates of fossil fuel CO2 (CO2ff) emissions are key to ensuring that emission reductions and regulations are effective and provide needed transparency and trust. Point source emissions are a key target because a small number of power plants represent a large portion of total global emissions. Currently, emission rates are known only from self-reported data. Atmospheric observations have the potential to meet the need for independent evaluation, but useful results from this method have been elusive, due to challenges in distinguishing CO2ff emissions from the large and varying CO2 background and in relating atmospheric observations to emission flux rates with high accuracy. Here we use time-integrated observations of the radiocarbon content of CO2 (14CO2) to quantify the recently added CO2ff mole fraction at surface sites surrounding a point source. We demonstrate that both fast-growing plant material (grass) and CO2 collected by absorption into sodium hydroxide solution provide excellent time-integrated records of atmospheric 14CO2. These time-integrated samples allow us to evaluate emissions over a period of days to weeks with only a modest number of measurements. Applying the same time integration in an atmospheric transport model eliminates the need to resolve highly variable short-term turbulence. Together these techniques allow us to independently evaluate point source CO2ff emission rates from atmospheric observations with uncertainties of better than 10%. This uncertainty represents an improvement by a factor of 2 over current bottom-up inventory estimates and previous atmospheric observation estimates and allows reliable independent evaluation of emissions. PMID:27573818

  20. Independent evaluation of point source fossil fuel CO2 emissions to better than 10%.

    PubMed

    Turnbull, Jocelyn Christine; Keller, Elizabeth D; Norris, Margaret W; Wiltshire, Rachael M

    2016-09-13

    Independent estimates of fossil fuel CO2 (CO2ff) emissions are key to ensuring that emission reductions and regulations are effective and provide needed transparency and trust. Point source emissions are a key target because a small number of power plants represent a large portion of total global emissions. Currently, emission rates are known only from self-reported data. Atmospheric observations have the potential to meet the need for independent evaluation, but useful results from this method have been elusive, due to challenges in distinguishing CO2ff emissions from the large and varying CO2 background and in relating atmospheric observations to emission flux rates with high accuracy. Here we use time-integrated observations of the radiocarbon content of CO2 ((14)CO2) to quantify the recently added CO2ff mole fraction at surface sites surrounding a point source. We demonstrate that both fast-growing plant material (grass) and CO2 collected by absorption into sodium hydroxide solution provide excellent time-integrated records of atmospheric (14)CO2. These time-integrated samples allow us to evaluate emissions over a period of days to weeks with only a modest number of measurements. Applying the same time integration in an atmospheric transport model eliminates the need to resolve highly variable short-term turbulence. Together these techniques allow us to independently evaluate point source CO2ff emission rates from atmospheric observations with uncertainties of better than 10%. This uncertainty represents an improvement by a factor of 2 over current bottom-up inventory estimates and previous atmospheric observation estimates and allows reliable independent evaluation of emissions.

  1. Density estimates of monarch butterflies overwintering in central Mexico

    PubMed Central

    Diffendorfer, Jay E.; López-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice X.; Semmens, Darius; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha−1. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha−1 (95% CI [2.4–80.7] million ha−1); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha−1). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations. PMID:28462031
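    The median-versus-mean distinction above follows from the roughly log-normal shape of the mixture; a sketch with invented values (only the 6.9 and 60.9 endpoints echo the published range) is:

```python
import numpy as np

# Illustrative only: for a log-normal distribution the mean exceeds the
# median, which is why the median better summarizes a right-skewed mixture
# of density estimates (millions of butterflies per hectare; values made up).

densities = np.array([6.9, 10.3, 20.1, 28.0, 44.5, 60.9])
log_d = np.log(densities)
mu, sigma = log_d.mean(), log_d.std(ddof=1)

median_est = np.exp(mu)                     # log-normal median
mean_est = np.exp(mu + 0.5 * sigma ** 2)    # log-normal mean (always larger)
print(median_est, mean_est)
```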

  2. Density estimates of monarch butterflies overwintering in central Mexico

    USGS Publications Warehouse

    Thogmartin, Wayne E.; Diffendorfer, James E.; Lopez-Hoffman, Laura; Oberhauser, Karen; Pleasants, John M.; Semmens, Brice X.; Semmens, Darius J.; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha−1. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha−1 (95% CI [2.4–80.7] million ha−1); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha−1). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.

  3. Balancing nurses' workload in hospital wards: study protocol of developing a method to manage workload.

    PubMed

    van den Oetelaar, W F J M; van Stel, H F; van Rhenen, W; Stellato, R K; Grolman, W

    2016-11-10

    Hospitals pursue different goals at the same time: excellent service to their patients, good quality care, operational excellence, retaining employees. This requires a good balance between patient needs and nursing staff. One way to ensure a proper fit between patient needs and nursing staff is to work with a workload management method. In our view, a nursing workload management method needs to have the following characteristics: easy to interpret; limited additional registration; applicable to different types of hospital wards; supported by nurses; covers all activities of nurses and suitable for prospective planning of nursing staff. At present, no such method is available. The research follows several steps to arrive at a workload management method for staff nurses. First, a list of patient characteristics relevant to care time will be composed by performing a Delphi study among staff nurses. Next, a time study of nurses' activities will be carried out. The two can be combined to estimate care time per patient group and the time nurses spend on non-patient-related activities. These two estimates can then be combined and compared with available nursing resources, giving an estimate of nurses' workload. The research will take place in an academic hospital in the Netherlands. Six surgical wards will be included, with capacities of 15-30 beds. The study protocol was submitted to the Medical Ethical Review Board of the University Medical Center (UMC) Utrecht and received positive advice, protocol number 14-165/C. This method will be developed in close cooperation with staff nurses and ward management. The strong involvement of the end users will contribute to broader support of the results. The method we will develop may also be useful for planning purposes; this is a strong advantage compared with existing methods, which tend to focus on retrospective analysis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
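    The planned workload comparison can be caricatured in a few lines; all numbers below are invented, since the study is a protocol and reports no estimates yet:

```python
# Invented illustration of the protocol's workload comparison: estimated care
# time per patient group plus non-patient time, set against available
# nursing hours on a shift. Groups and hours are hypothetical.

care_time_per_group = {"low": 1.5, "medium": 3.0, "high": 5.5}  # hours/patient
patients_on_ward = {"low": 8, "medium": 6, "high": 2}

patient_hours = sum(care_time_per_group[g] * n
                    for g, n in patients_on_ward.items())
non_patient_hours = 6.0        # handovers, admin, etc. (assumed)
available_hours = 4 * 8.0      # 4 nurses on an 8-hour shift

workload_ratio = (patient_hours + non_patient_hours) / available_hours
print(f"workload ratio: {workload_ratio:.2f}")   # >1 suggests understaffing
```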

  4. Timber marking costs in spruce-fir: experience on the Penobscot Experimental Forest

    Treesearch

    Paul E. Sendak

    2002-01-01

    In the application of partial harvests, time needs to be allocated to marking trees to be cut. On the Penobscot Experimental Forest located in Maine, eight major experimental treatments have been applied to northern conifer stands for more than 40 yr. Data recorded at the time of marking were used to estimate the time required to mark trees for harvest. A simple linear...

  5. Short Vigilance Tasks are Hard Work Even If Time Flies

    DTIC Science & Technology

    2016-10-21

    ...actual time. Upon completion of the task, participants filled out questionnaires related to the hedonic and temporal evaluation of the task. Participants...

  6. Real-time reflectometry measurement validation in H-mode regimes for plasma position control.

    PubMed

    Santos, J; Guimarais, L; Manso, M

    2010-10-01

    It has been shown that in H-mode regimes, reflectometry electron density profiles and an estimate for the density at the separatrix can be jointly used to track the separatrix within the precision required for plasma position control on ITER. We present a method to automatically remove, from the position estimation procedure, measurements performed during collapse and recovery phases of edge localized modes (ELMs). Based on the rejection mechanism, the method also produces an estimate confidence value to be fed to the position feedback controller. Preliminary results show that the method improves the real-time experimental separatrix tracking capabilities and has the potential to eliminate the need for an external online source of ELM event signaling during control feedback operation.

  7. Female Genital Mutilation/Cutting in the United States: Updated Estimates of Women and Girls at Risk, 2012

    PubMed Central

    Stupp, Paul; Okoroh, Ekwutosi; Besera, Ghenet; Goodman, David; Danel, Isabella

    2016-01-01

    Objectives In 1996, the U.S. Congress passed legislation making female genital mutilation/cutting (FGM/C) illegal in the United States. CDC published the first estimates of the number of women and girls at risk for FGM/C in 1997. Since 2012, various constituencies have again raised concerns about the practice in the United States. We updated an earlier estimate of the number of women and girls in the United States who were at risk for FGM/C or its consequences. Methods We estimated the number of women and girls who were at risk for undergoing FGM/C or its consequences in 2012 by applying country-specific prevalence of FGM/C to the estimated number of women and girls living in the United States who were born in that country or who lived with a parent born in that country. Results Approximately 513,000 women and girls in the United States were at risk for FGM/C or its consequences in 2012, which was more than three times higher than the earlier estimate, based on 1990 data. The increase in the number of women and girls younger than 18 years of age at risk for FGM/C was more than four times that of previous estimates. Conclusion The estimated increase was wholly a result of rapid growth in the number of immigrants from FGM/C-practicing countries living in the United States and not from increases in FGM/C prevalence in those countries. Scientifically valid information regarding whether women or their daughters have actually undergone FGM/C and related information that can contribute to efforts to prevent the practice in the United States and provide needed health services to women who have undergone FGM/C are needed. PMID:26957669

  8. Female Genital Mutilation/Cutting in the United States: Updated Estimates of Women and Girls at Risk, 2012.

    PubMed

    Goldberg, Howard; Stupp, Paul; Okoroh, Ekwutosi; Besera, Ghenet; Goodman, David; Danel, Isabella

    2016-01-01

    In 1996, the U.S. Congress passed legislation making female genital mutilation/cutting (FGM/C) illegal in the United States. CDC published the first estimates of the number of women and girls at risk for FGM/C in 1997. Since 2012, various constituencies have again raised concerns about the practice in the United States. We updated an earlier estimate of the number of women and girls in the United States who were at risk for FGM/C or its consequences. We estimated the number of women and girls who were at risk for undergoing FGM/C or its consequences in 2012 by applying country-specific prevalence of FGM/C to the estimated number of women and girls living in the United States who were born in that country or who lived with a parent born in that country. Approximately 513,000 women and girls in the United States were at risk for FGM/C or its consequences in 2012, which was more than three times higher than the earlier estimate, based on 1990 data. The increase in the number of women and girls younger than 18 years of age at risk for FGM/C was more than four times that of previous estimates. The estimated increase was wholly a result of rapid growth in the number of immigrants from FGM/C-practicing countries living in the United States and not from increases in FGM/C prevalence in those countries. Scientifically valid information regarding whether women or their daughters have actually undergone FGM/C and related information that can contribute to efforts to prevent the practice in the United States and provide needed health services to women who have undergone FGM/C are needed.

  9. A 3D approximate maximum likelihood solver for localization of fish implanted with acoustic transmitters

    DOE PAGES

    Li, Xinya; Deng, Z. Daniel; ...

    2014-11-27

    Better understanding of fish behavior is vital for recovery of many endangered species including salmon. The Juvenile Salmon Acoustic Telemetry System (JSATS) was developed to observe the out-migratory behavior of juvenile salmonids tagged by surgical implantation of acoustic micro-transmitters and to estimate the survival when passing through dams on the Snake and Columbia Rivers. A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with JSATS acoustic transmitters, to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature.
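    The core idea of a time-difference-of-arrival (TDOA) solver can be sketched with a generic nonlinear least-squares fit. This is an illustration of the technique, not the JSATS solver itself; the array geometry, sound speed, and noise level are invented.

    ```python
    # Generic TDOA localization by nonlinear least squares (illustrative only).
    import numpy as np
    from scipy.optimize import least_squares

    C = 1480.0  # nominal speed of sound in water, m/s
    hydrophones = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0],
                            [0, 0, 30], [50, 50, 30]], dtype=float)
    true_pos = np.array([20.0, 35.0, 10.0])

    # Simulated time differences of arrival relative to hydrophone 0.
    toa = np.linalg.norm(hydrophones - true_pos, axis=1) / C
    tdoa = (toa[1:] - toa[0]) + np.random.default_rng(1).normal(0, 1e-5, len(toa) - 1)

    def residuals(p):
        t = np.linalg.norm(hydrophones - p, axis=1) / C
        return (t[1:] - t[0]) - tdoa

    fit = least_squares(residuals, x0=np.array([25.0, 25.0, 15.0]))
    print("estimated position:", fit.x.round(2))
    ```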

  10. A 3D approximate maximum likelihood solver for localization of fish implanted with acoustic transmitters

    NASA Astrophysics Data System (ADS)

    Li, Xinya; Deng, Z. Daniel; Sun, Yannan; Martinez, Jayson J.; Fu, Tao; McMichael, Geoffrey A.; Carlson, Thomas J.

    2014-11-01

    Better understanding of fish behavior is vital for recovery of many endangered species including salmon. The Juvenile Salmon Acoustic Telemetry System (JSATS) was developed to observe the out-migratory behavior of juvenile salmonids tagged by surgical implantation of acoustic micro-transmitters and to estimate the survival when passing through dams on the Snake and Columbia Rivers. A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with JSATS acoustic transmitters, to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature.

  11. A 3D approximate maximum likelihood solver for localization of fish implanted with acoustic transmitters

    PubMed Central

    Li, Xinya; Deng, Z. Daniel; Sun, Yannan; Martinez, Jayson J.; Fu, Tao; McMichael, Geoffrey A.; Carlson, Thomas J.

    2014-01-01

    Better understanding of fish behavior is vital for recovery of many endangered species including salmon. The Juvenile Salmon Acoustic Telemetry System (JSATS) was developed to observe the out-migratory behavior of juvenile salmonids tagged by surgical implantation of acoustic micro-transmitters and to estimate the survival when passing through dams on the Snake and Columbia Rivers. A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with JSATS acoustic transmitters, to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature. PMID:25427517

  12. A 3D approximate maximum likelihood solver for localization of fish implanted with acoustic transmitters.

    PubMed

    Li, Xinya; Deng, Z Daniel; Sun, Yannan; Martinez, Jayson J; Fu, Tao; McMichael, Geoffrey A; Carlson, Thomas J

    2014-11-27

    Better understanding of fish behavior is vital for recovery of many endangered species including salmon. The Juvenile Salmon Acoustic Telemetry System (JSATS) was developed to observe the out-migratory behavior of juvenile salmonids tagged by surgical implantation of acoustic micro-transmitters and to estimate the survival when passing through dams on the Snake and Columbia Rivers. A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with JSATS acoustic transmitters, to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature.

  13. A 3D approximate maximum likelihood solver for localization of fish implanted with acoustic transmitters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinya; Deng, Z. Daniel

    Better understanding of fish behavior is vital for recovery of many endangered species including salmon. The Juvenile Salmon Acoustic Telemetry System (JSATS) was developed to observe the out-migratory behavior of juvenile salmonids tagged by surgical implantation of acoustic micro-transmitters and to estimate the survival when passing through dams on the Snake and Columbia Rivers. A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with JSATS acoustic transmitters, to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature.

  14. Quantifying Transmission Heterogeneity Using Both Pathogen Phylogenies and Incidence Time Series

    PubMed Central

    Li, Lucy M.; Grassly, Nicholas C.; Fraser, Christophe

    2017-01-01

    Heterogeneity in individual-level transmissibility can be quantified by the dispersion parameter k of the offspring distribution. Quantifying heterogeneity is important as it affects other parameter estimates, it modulates the degree of unpredictability of an epidemic, and it needs to be accounted for in models of infection control. Aggregated data such as incidence time series are often not sufficiently informative to estimate k. Incorporating phylogenetic analysis can help to estimate k concurrently with other epidemiological parameters. We have developed an inference framework that uses particle Markov Chain Monte Carlo to estimate k and other epidemiological parameters using both incidence time series and the pathogen phylogeny. Using the framework to fit a modified compartmental transmission model that includes the parameter k to simulated data, we found that more accurate and less biased estimates of the reproductive number were obtained by combining epidemiological and phylogenetic analyses. However, k was most accurately estimated using pathogen phylogeny alone. Accurately estimating k was necessary for unbiased estimates of the reproductive number, but it did not affect the accuracy of reporting probability and epidemic start date estimates. We further demonstrated that inference was possible in the presence of phylogenetic uncertainty by sampling from the posterior distribution of phylogenies. Finally, we used the inference framework to estimate transmission parameters from epidemiological and genetic data collected during a poliovirus outbreak. Despite the large degree of phylogenetic uncertainty, we demonstrated that incorporating phylogenetic data in parameter inference improved the accuracy and precision of estimates. PMID:28981709
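    As a much-simplified illustration of what the dispersion parameter k captures, the sketch below fits a negative binomial offspring distribution to simulated offspring counts by direct maximum likelihood. The paper's actual framework embeds k in a transmission model fitted by particle MCMC to incidence and phylogenetic data; all values here are invented.

    ```python
    # Toy maximum-likelihood estimate of the offspring-distribution
    # dispersion k (negative binomial with mean R0); data are simulated.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(0)
    R0_true, k_true = 2.0, 0.3
    # nbinom(n=k, p=k/(k+R0)) has mean R0 and dispersion parameter k.
    counts = stats.nbinom.rvs(k_true, k_true / (k_true + R0_true),
                              size=500, random_state=rng)

    def nll(params):
        R0, k = params
        return -stats.nbinom.logpmf(counts, k, k / (k + R0)).sum()

    fit = optimize.minimize(nll, x0=[1.0, 1.0], bounds=[(1e-3, 20), (1e-3, 20)])
    print("R0_hat, k_hat =", fit.x.round(3))
    ```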

  15. Synthesis of Natural Electric and Magnetic Time Series Using Impulse Responses of Inter-station Transfer Functions and a Reference

    NASA Astrophysics Data System (ADS)

    Wang, H.; Cheng, J.

    2017-12-01

    A method to synthesize natural electric and magnetic time series is proposed, whereby the time series of a local site are derived using an impulse response and a reference (STIR). The method is based on the assumptions that the external source magnetic fields are uniform and that the electric and magnetic fields acquired at the surface satisfy a time-independent linear relation in the frequency domain. According to the convolution theorem, we can synthesize natural electric and magnetic time series using the impulse responses of inter-station transfer functions with a reference. Applying this method, two impulse responses need to be estimated: the quasi-MT impulse response tensor and the horizontal magnetic impulse response tensor. These impulse response tensors relate the local horizontal electric and magnetic components, respectively, with the horizontal magnetic components at a reference site. Some clean segments of the time series are selected to estimate the impulse responses using the least-squares (LS) method. STIR is similar to STIN (Wang, 2017), but STIR does not need to estimate the inter-station transfer functions, and the synthesized data are more accurate at high frequencies, where STIN fails when the inter-station transfer functions are severely contaminated. A test with good-quality MT data shows that the synthetic time series are similar to the natural electric and magnetic time series. For a contaminated AMT example, when this method is used to remove noise present at the local site, the scatter of the MT sounding curves is clearly reduced and the data quality is improved. *This work is funded by the National Key R&D Program of China (2017YFC0804105), the National Natural Science Foundation of China (41604064, 51574250), and the State Key Laboratory of Coal Resources and Safe Mining, China University of Mining & Technology (SKLCRSM16DC09).
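    A minimal numerical sketch of the convolution idea, assuming scalar channels and a known short response length (the real method estimates tensor responses between horizontal field components; all signals below are synthetic):

    ```python
    # Estimate a short impulse response relating a reference channel to a
    # local channel by least squares, then synthesize the local series by
    # convolution. Scalar, synthetic stand-ins for the tensor case.
    import numpy as np

    rng = np.random.default_rng(3)
    ref = rng.standard_normal(5000)              # reference channel
    h_true = np.array([0.5, 0.3, -0.2, 0.1])     # unknown impulse response
    local = np.convolve(ref, h_true)[:len(ref)] + 0.01 * rng.standard_normal(5000)

    L = 4  # assumed impulse-response length
    # Convolution design matrix on a "clean segment" (skip wrapped samples).
    X = np.column_stack([np.roll(ref, i) for i in range(L)])[L:]
    h_est = np.linalg.lstsq(X, local[L:], rcond=None)[0]

    synthetic = np.convolve(ref, h_est)[:len(ref)]  # synthesized local series
    print("estimated impulse response:", h_est.round(3))
    ```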

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.

  17. Assessment of the Accountability of Night Vision Devices Provided to the Security Forces of Iraq

    DTIC Science & Technology

    2009-03-17

    ...data in this project. The qualitative data consisted of individual interviews, direct observation, and written documents. Quantitative data...

  18. Acquisition Program Lead Systems Integration/Lead Capabilities Integration Decision Support Methodology and Tool

    DTIC Science & Technology

    2015-04-30

    ...from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...

  19. Launch and Recovery System Literature Review

    DTIC Science & Technology

    2010-12-01

  20. An Approach Using MIP Products for the Development of the Coalition Battle Management Language Standard

    DTIC Science & Technology

    2013-06-01

    ...Control Information Exchange Data Model (JC3IEDM). The Coalition Battle Management Language (CBML) being developed by the Simulation Interoperability...

  1. Preliminary Estimates of 1972-73 Full-Time Instructional Faculty in Institutions of Higher Education. Bulletin. Advanced Statistics for Management. No. 14, March 1, 1973.

    ERIC Educational Resources Information Center

    National Center for Educational Statistics (DHEW/OE), Washington, DC.

    In response to needs expressed by the community of higher education institutions, the National Center for Educational Statistics has produced early estimates of a selected group of mean salaries of instructional faculty in institutions of higher education in 1972-73. The number and salaries of male and female instructional staff by rank are of…

  2. The RADAR Test Methodology: Evaluating a Multi-Task Machine Learning System with Humans in the Loop

    DTIC Science & Technology

    2006-10-01

  3. Deformation Mechanisms and High Strain Rate Properties of Magnesium (Mg) and Mg Alloys

    DTIC Science & Technology

    2012-08-01

    ...conference in June 2010 (2). A comprehensive historical review of the U.S. military applications of Mg alloys has recently been published (3)...

  4. Radiative Transfer in Submerged Macrophyte Canopies

    DTIC Science & Technology

    2001-09-30

  5. Social and Cognitive Functioning as Risk Factors for Suicide: A Historical-Prospective Cohort Study

    DTIC Science & Technology

    2011-04-01

  6. Enabling Software Acquisition Improvement: Government and Industry Software Development Team Acquisition Model

    DTIC Science & Technology

    2010-04-30

    ...previous and current complex SW development efforts, the program offices will have a source of objective lessons learned and metrics that can be applied...

  7. Operative Therapy and the Growth of Breast Cancer Micrometastases: Cause and Effect

    DTIC Science & Technology

    2006-08-01

  8. Factors Impacting Intra-District Collaboration: A Field Study in a Midwest Police Department

    DTIC Science & Technology

    2018-03-01

  9. Blind, Deaf, and Dumb: We Must Be Prepared to Fight for Information

    DTIC Science & Technology

    2017-05-25

  10. China’s War by Other Means: Unveiling China’s Quest for Information Dominance

    DTIC Science & Technology

    2017-06-09

  11. Agent And Component Object Framework For Concept Design Modeling Of Mobile Cyber Physical Systems

    DTIC Science & Technology

    2018-03-01

  12. An Analysis of the Marine Corps Selection Process: Does Increased Competition Lead to Increased Quality

    DTIC Science & Technology

    2018-03-01

  13. Scalable Matrix Algorithms for Interactive Analytics of Very Large Informatics Graphs

    DTIC Science & Technology

    2017-06-14

    ...information networks. Depending on the situation, these larger networks may not fit on a single machine. Although we considered traditional matrix and graph...

  14. V-22 Osprey Joint Services Advanced Vertical Lift Aircraft (V-22)

    DTIC Science & Technology

    2013-12-01

  15. Optimization of an Innovative Biofiltration System as a VOC Control Technology for Aircraft Painting Facilities

    DTIC Science & Technology

    2004-04-20

    ...EUROPE (Leson, 1991). Applications include chemical operations, coffee roasting, composting facilities, chemical storage, cocoa roasting, landfill gas extraction, and film coating...

  16. A Case Study in Transnational Crime: Ukraine and Modern Slavery

    DTIC Science & Technology

    2007-06-01

    ...remained unable to appropriate resources or plan efficiently. The full extent of the decline remains unknown, because statistics were manipulated to hide...

  17. Learning to Leave. The Preeminence of Disengagement in US Military Strategy

    DTIC Science & Technology

    2008-05-01

  18. Multimodal Signal Processing for Personnel Detection and Activity Classification for Indoor Surveillance

    DTIC Science & Technology

    2013-11-15

    ...features and designed a classifier that achieves up to 95% classification accuracy on classifying the occupancy with indoor footstep data. MDL-based...

  19. Geochemical Characterization of Concentrated Gas Hydrate Deposits on the Hikurangi Margin, New Zealand: Preliminary Geochemical Cruise Report

    DTIC Science & Technology

    2008-02-29

  20. A Qualitative Analysis of the Navy’s HSI Billet Structure

    DTIC Science & Technology

    2008-06-01

    ...subspecialty code. The research results support the hypothesis that the work requirements of the July 2007 data set of 4600P-coded billets (billets...

  1. Human Systems Integration (HSI) in Acquisition. Acquisition Phase Guide

    DTIC Science & Technology

    2009-08-01

    ...available Concept of Operations (CONOPS) and other available data; 1.1 Select and review Baseline Comparison System(s) (BCS) documentation; 1.2 Assess...

  2. Applicability of Human Simulation for Enhancing Operations of Dismounted Soldiers

    DTIC Science & Technology

    2010-10-01

    ...consisted of a data capturing phase, in which field trials at a German MOUT training facility were observed, a subsequent data-processing including...

  3. Bayesian Maximum Entropy space/time estimation of surface water chloride in Maryland using river distances.

    PubMed

    Jat, Prahlad; Serre, Marc L

    2016-12-01

    Widespread contamination of surface water chloride is an emerging environmental concern. Consequently, accurate and cost-effective methods are needed to estimate chloride along all river miles of potentially contaminated watersheds. Here we introduce a Bayesian Maximum Entropy (BME) space/time geostatistical estimation framework that uses river distances, and we compare it with Euclidean BME to estimate surface water chloride from 2005 to 2014 in the Gunpowder-Patapsco, Severn, and Patuxent subbasins in Maryland. River BME improves the cross-validation R² by 23.67% over Euclidean BME, and river BME maps are significantly different than Euclidean BME maps, indicating that it is important to use river BME maps to assess water quality impairment. The river BME maps of chloride concentration show wide contamination throughout Baltimore and Columbia-Ellicott cities, the disappearance of a clean buffer separating these two large urban areas, and the emergence of multiple localized pockets of contamination in surrounding areas. The number of impaired river miles increased by 0.55% per year in 2005-2009 and by 1.23% per year in 2011-2014, corresponding to a marked acceleration of the rate of impairment. Our results support the need for control measures and increased monitoring of unassessed river miles. Copyright © 2016. Published by Elsevier Ltd.

  4. Human immunodeficiency virus prevalence, incidence, and residual transmission risk in first-time and repeat blood donations in Zimbabwe: implications on blood safety.

    PubMed

    Mapako, Tonderai; Mvere, David A; Chitiyo, McLeod E; Rusakaniko, Simbarashe; Postma, Maarten J; van Hulst, Marinus

    2013-10-01

    The National Blood Service Zimbabwe human immunodeficiency virus (HIV) risk management strategy includes screening and discarding of first-time donations, which are collected in blood packs without an anticoagulant (dry packs). To evaluate the impact of discarding first-time donations on blood safety, the HIV prevalence, incidence, and residual risk in first-time and repeat donations (wet packs) were compared. Donor data from 2002 to 2010 were retrieved from a centralized national electronic donor database and retrospectively analyzed. The chi-square test was used to compare HIV prevalence, and the relative risk (RR) point estimates and 95% confidence intervals (CIs) are reported. Trend analysis was done using the Cochran-Armitage trend test. HIV residual risk estimates were determined using published residual risk estimation models. Over the 9 years, the overall HIV prevalence estimates were 1.29% (n = 116,058) and 0.42% (n = 434,695) for first-time and repeat donations, respectively. The overall RR was 3.1 (95% CI, 2.9-3.3; p < 0.0001). The overall mean residual transmission risk of HIV window-phase donations was 1:7384 (range, 1:11,308-1:5356) in first-time donors and 1:5496 (range, 1:9943-1:3347) in repeat donors. The significantly higher HIV prevalence estimates recorded in first-time over repeat donations are indicative of the effectiveness of the HIV risk management strategy. However, comparable residual transmission risk estimates in first-time and repeat donors point to the need to further review the risk management strategies. Given the potential wastage of valuable resources, future studies should focus on the cost-effectiveness and utility of screening and discarding first-time donations. © 2013 American Association of Blood Banks.
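    Published residual-risk models of the kind cited above typically reduce to an incidence / window-period calculation; a minimal sketch with invented numbers:

    ```python
    # Back-of-envelope incidence / window-period model for residual risk.
    # Numbers below are invented, not the study's data.
    incidence_per_100k_py = 400.0   # HIV incidence among donors, per 100,000 person-years
    window_days = 22.0              # infectious window period of the screening assay

    risk = (incidence_per_100k_py / 100_000) * (window_days / 365.0)
    print(f"residual risk ~ {risk:.2e} per donation, i.e. about 1 in {1 / risk:,.0f}")
    ```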

  5. The Performance of the Date-Randomization Test in Phylogenetic Analyses of Time-Structured Virus Data.

    PubMed

    Duchêne, Sebastián; Duchêne, David; Holmes, Edward C; Ho, Simon Y W

    2015-07-01

    Rates and timescales of viral evolution can be estimated using phylogenetic analyses of time-structured molecular sequences. This involves the use of molecular-clock methods, calibrated by the sampling times of the viral sequences. However, the spread of these sampling times is not always sufficient to allow the substitution rate to be estimated accurately. We conducted Bayesian phylogenetic analyses of simulated virus data to evaluate the performance of the date-randomization test, which is sometimes used to investigate whether time-structured data sets have temporal signal. An estimate of the substitution rate passes this test if its mean does not fall within the 95% credible intervals of rate estimates obtained using replicate data sets in which the sampling times have been randomized. We find that the test sometimes fails to detect rate estimates from data with no temporal signal. This error can be minimized by using a more conservative criterion, whereby the 95% credible interval of the estimate with correct sampling times should not overlap with those obtained with randomized sampling times. We also investigated the behavior of the test when the sampling times are not uniformly distributed throughout the tree, which sometimes occurs in empirical data sets. The test performs poorly in these circumstances, such that a modification to the randomization scheme is needed. Finally, we illustrate the behavior of the test in analyses of nucleotide sequences of cereal yellow dwarf virus. Our results validate the use of the date-randomization test and allow us to propose guidelines for interpretation of its results. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
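    The two pass/fail criteria discussed above are easy to state programmatically. In this sketch the rate estimates stand in for the output of a Bayesian molecular-clock analysis and are invented:

    ```python
    # Standard vs. conservative criteria for the date-randomization test;
    # the rate estimates (subs/site/year) are invented placeholders.
    observed = {"mean": 1.1e-3, "ci": (0.8e-3, 1.5e-3)}
    randomized_cis = [(0.10e-3, 0.60e-3), (0.05e-3, 0.70e-3), (0.20e-3, 0.75e-3)]

    # Standard criterion: observed mean falls outside every randomized 95% CI.
    standard = all(not (lo <= observed["mean"] <= hi) for lo, hi in randomized_cis)
    # Conservative criterion: observed CI overlaps no randomized CI.
    o_lo, o_hi = observed["ci"]
    conservative = all(o_hi < lo or hi < o_lo for lo, hi in randomized_cis)
    print("passes standard criterion:    ", standard)
    print("passes conservative criterion:", conservative)
    ```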

  6. Heterogeneous nucleation of aspartame from aqueous solutions

    NASA Astrophysics Data System (ADS)

    Kubota, Noriaki; Kinno, Hiroaki; Shimizu, Kenji

    1990-03-01

    Waiting times, the time from the instant of quenching needed for a first nucleus to appear, were measured at constant supercoolings for the primary nucleation of aspartame (α-L-aspartyl-L-phenylalanine methyl ester) from aqueous solutions sealed into glass ampoules (solution volume = 3.16 cm³). Since the waiting time became shorter when the solution was filtered prior to quenching, the nucleation was concluded to be heterogeneously induced. The measured waiting time consists of two parts: the time needed for the nucleus to grow to a detectable size (growth time) and the stochastic time needed for nucleation (true waiting time). The distribution of the true waiting time is well explained by a stochastic model in which nucleation is regarded as occurring heterogeneously, and in a stochastic manner, at two kinds of active sites. The active sites are estimated to be located on foreign particles containing such elements as Si, Al, and Mg. The amount of each element is very small, on the order of ppb (mass basis) of the whole solution. The growth time was correlated with the degree of supercooling.
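    The decomposition into a deterministic growth time plus an exponentially distributed true waiting time can be illustrated with a simple fit; the values below are synthetic, not the aspartame measurements:

    ```python
    # Toy decomposition of measured waiting times: constant growth time plus
    # an exponential "true" waiting time. Data are simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    growth_time_h, mean_true_wait_h = 2.0, 5.0
    measured = growth_time_h + rng.exponential(mean_true_wait_h, size=200)

    # expon.fit returns (loc, scale): loc ~ growth time, scale ~ mean true wait.
    loc, scale = stats.expon.fit(measured)
    print(f"estimated growth time ~ {loc:.2f} h, mean true waiting time ~ {scale:.2f} h")
    ```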

  7. The epoch state navigation filter. [for maximum likelihood estimates of position and velocity vectors

    NASA Technical Reports Server (NTRS)

    Battin, R. H.; Croopnick, S. R.; Edwards, J. A.

    1977-01-01

    The formulation of a recursive maximum likelihood navigation system employing reference position and velocity vectors as state variables is presented. Convenient forms of the required variational equations of motion are developed together with an explicit form of the associated state transition matrix needed to refer measurement data from the measurement time to the epoch time. Computational advantages accrue from this design in that the usual forward extrapolation of the covariance matrix of estimation errors can be avoided without incurring unacceptable system errors. Simulation data for earth orbiting satellites are provided to substantiate this assertion.

  8. Operation of the yield estimation subsystem

    NASA Technical Reports Server (NTRS)

    Mccrary, D. G.; Rogers, J. L.; Hill, J. D. (Principal Investigator)

    1979-01-01

    The organization and products of the yield estimation subsystem (YES) are described with particular emphasis on meteorological data acquisition, yield estimation, crop calendars, weekly weather summaries, and project reports. During the three phases of LACIE, YES demonstrated that it is possible to use the flow of global meteorological data and provide valuable information regarding global wheat production. It was able to establish a capability to collect, in a timely manner, detailed weather data from all regions of the world, and to evaluate and convert that data into information appropriate to the project's needs.

  9. Cubic Foot Volume Tables for Slash Pine Plantations of the Middle Coastal Plain of Georgia and the Carolina Sandhills

    Treesearch

    C.E. McGee; F.A. Bennett

    1959-01-01

    Proper management of any timber species or type requires valid estimates of volume from time to time. Tables 1 and 2 were constructed to meet this need for the expanding area of slash pine plantations in the middle coastal plain of Georgia and the Carolina Sandhills.

  10. Fire behavior simulation in Mediterranean forests using the minimum travel time algorithm

    Treesearch

    Kostas Kalabokidis; Palaiologos Palaiologou; Mark A. Finney

    2014-01-01

    Recent large wildfires in Greece exemplify the need for pre-fire burn probability assessment and possible landscape fire flow estimation to enhance fire planning and resource allocation. The Minimum Travel Time (MTT) algorithm, incorporated as FlamMap's version five module, provides valuable fire behavior functions while enabling multi-core utilization for the...

  11. Cultivating Innovation to Ignite Organizational Transformation

    DTIC Science & Technology

    2004-03-01

    ...over a modem. The review recommends adding a section entitled "incoming students," since the current site design and sitemap make an incoming student hunt through the site. A table of estimated download times (seconds) lists each object type, count, and size in bytes against connection speeds of 14.4, 28.8, 33.6, 56K, 128K, and T1; for example, an 8,600-byte HTML object is listed at 6.27 to 2.46 seconds across the tabulated speeds.

  12. The Army and the Need for an Amphibious Capability

    DTIC Science & Technology

    2015-05-23

    ...prevailing Army-Marine amphibious set-up was unsound because only the Army had both the means and the grasp of the problem to plan, prepare, and...

  13. DOD Manufacturing Arsenals: Actions Needed to Identify and Sustain Critical Capabilities

    DTIC Science & Technology

    2015-11-01

    ...to each develop their own unique method. A senior OSD official described the resulting process as unsound. Each manufacturing arsenal declared what...

  14. Time to Improve U.S. Defense Structure for the Western Hemisphere

    DTIC Science & Technology

    2009-01-01

    ...diverse as the United States, Bolivia, and Saint Kitts and Nevis. Classical military threats that characterized the bipolar world do not...strategy is in the offing, seeking strategic relationships with France, Russia, and other extraregional actors. The United States needs to consider...

  15. Estimating time-based instantaneous total mortality rate based on the age-structured abundance index

    NASA Astrophysics Data System (ADS)

    Wang, Yingbin; Jiao, Yan

    2015-05-01

    The instantaneous total mortality rate (Z) of a fish population is one of the important parameters in fisheries stock assessment. The estimation of Z is crucial to fish population dynamics analysis, abundance and catch forecasting, and fisheries management. A catch curve-based method for estimating time-based Z and its trend of change from catch-per-unit-effort (CPUE) data of multiple cohorts is developed. Unlike the traditional catch-curve method, the method developed here does not assume a constant Z over time; instead, the Z values in n continuous years are assumed constant, and the Z values in different sets of n continuous years are estimated using the age-based CPUE data within those years. The results of the simulation analyses show that the trends of the estimated time-based Z are consistent with the trends of the true Z, and the estimated rates of change from this approach are close to the true change rates (the relative differences between the change rates of the estimated Z and the true Z are smaller than 10%). Variations in both Z and recruitment can affect the estimate of Z and the trend of Z. The most appropriate value of n can differ depending on the effects of different factors; therefore, the appropriate value of n for a given fishery should be determined through a simulation analysis, as demonstrated in this study. Further analyses suggested that selectivity and age estimation are two additional factors that can affect the estimated Z values if either is subject to error, but the estimated change rates of Z remain close to the true change rates. We also applied this approach to the Atlantic cod (Gadus morhua) fishery of eastern Newfoundland and Labrador from 1983 to 1997 and obtained reasonable estimates of time-based Z.
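    Within one window of years where Z is assumed constant, the catch-curve step reduces to regressing log(CPUE) on age for fully recruited ages; a minimal sketch with simulated data:

    ```python
    # Catch-curve idea: Z is the negative slope of log(CPUE) against age
    # for fully recruited ages. Data are simulated, not the cod CPUE series.
    import numpy as np

    rng = np.random.default_rng(11)
    ages = np.arange(3, 11)          # fully recruited ages
    Z_true = 0.6
    cpue = 1000 * np.exp(-Z_true * ages) * rng.lognormal(0, 0.1, len(ages))

    slope, _ = np.polyfit(ages, np.log(cpue), 1)
    print(f"estimated Z = {-slope:.2f} (true {Z_true})")
    ```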

  16. Estimating the coverage of mental health programmes: a systematic review.

    PubMed

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-04-01

    The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and of changes in coverage over time, are needed. We conducted a systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. We identified 15,735 unique records, of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys.
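    The low-cost approach suggested in the conclusion amounts to simple arithmetic: routine programme data for the numerator and a survey-based prevalence for the denominator. A sketch with invented numbers:

    ```python
    # Contact coverage = people in contact with services / people in need.
    # All numbers are invented for illustration.
    population = 1_200_000
    prevalence = 0.04              # from a national prevalence survey
    treated_last_year = 9_600      # from routine programme records

    in_need = population * prevalence
    print(f"contact coverage ~ {treated_last_year / in_need:.1%}")
    ```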

  17. Using regression methods to estimate stream phosphorus loads at the Illinois River, Arkansas

    USGS Publications Warehouse

    Haggard, B.E.; Soerens, T.S.; Green, W.R.; Richards, R.P.

    2003-01-01

    The development of total maximum daily loads (TMDLs) requires evaluating existing constituent loads in streams. Accurate estimates of constituent loads are needed to calibrate watershed and reservoir models for TMDL development. The best approach to estimate constituent loads is high frequency sampling, particularly during storm events, and mass integration of constituents passing a point in a stream. Most often, resources are limited and discrete water quality samples are collected on fixed intervals and sometimes supplemented with directed sampling during storm events. When resources are limited, mass integration is not an accurate means to determine constituent loads and other load estimation techniques such as regression models are used. The objective of this work was to determine a minimum number of water-quality samples needed to provide constituent concentration data adequate to estimate constituent loads at a large stream. Twenty sets of water quality samples with and without supplemental storm samples were randomly selected at various fixed intervals from a database at the Illinois River, northwest Arkansas. The random sets were used to estimate total phosphorus (TP) loads using regression models. The regression-based annual TP loads were compared to the integrated annual TP load estimated using all the data. At a minimum, monthly sampling plus supplemental storm samples (six samples per year) was needed to produce a root mean square error of less than 15%. Water quality samples should be collected at least semi-monthly (every 15 days) in studies less than two years if seasonal time factors are to be used in the regression models. Annual TP loads estimated from independently collected discrete water quality samples further demonstrated the utility of using regression models to estimate annual TP loads in this stream system.
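    A schematic of the regression-model approach, in the spirit of rating-curve load estimators: regress log-load on log-discharge plus seasonal terms, then apply the fitted model to the full discharge record. All coefficients and data below are invented:

    ```python
    # Rating-curve style load regression (schematic):
    #   ln(load) = b0 + b1*ln(Q) + b2*sin(2*pi*t) + b3*cos(2*pi*t)
    import numpy as np

    rng = np.random.default_rng(5)
    t = rng.uniform(0, 1, 24)             # 24 sample dates (decimal years)
    Q = rng.lognormal(3.0, 0.8, 24)       # sampled discharge
    true_b = [0.5, 1.2, 0.3, -0.2]
    lnL = (true_b[0] + true_b[1] * np.log(Q)
           + true_b[2] * np.sin(2 * np.pi * t) + true_b[3] * np.cos(2 * np.pi * t)
           + rng.normal(0, 0.2, 24))

    X = np.column_stack([np.ones_like(t), np.log(Q),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    b, *_ = np.linalg.lstsq(X, lnL, rcond=None)
    print("fitted coefficients:", b.round(2))
    # Annual load: apply the fitted model to the full daily discharge record,
    # back-transform, and sum (a bias correction is normally applied here).
    ```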

  18. Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology

    NASA Technical Reports Server (NTRS)

    Forkel, Matthias; Carvalhais, Nuno; Verbesselt, Jan; Mahecha, Miguel D.; Neigh, Christopher S.R.; Reichstein, Markus

    2013-01-01

    Changing trends in ecosystem productivity can be quantified using satellite observations of Normalized Difference Vegetation Index (NDVI). However, the estimation of trends from NDVI time series differs substantially depending on analyzed satellite dataset, the corresponding spatiotemporal resolution, and the applied statistical method. Here we compare the performance of a wide range of trend estimation methods and demonstrate that performance decreases with increasing inter-annual variability in the NDVI time series. Trend slope estimates based on annual aggregated time series or based on a seasonal-trend model show better performances than methods that remove the seasonal cycle of the time series. A breakpoint detection analysis reveals that an overestimation of breakpoints in NDVI trends can result in wrong or even opposite trend estimates. Based on our results, we give practical recommendations for the application of trend methods on long-term NDVI time series. Particularly, we apply and compare different methods on NDVI time series in Alaska, where both greening and browning trends have been previously observed. Here, the multi-method uncertainty of NDVI trends is quantified through the application of the different trend estimation methods. Our results indicate that greening NDVI trends in Alaska are more spatially and temporally prevalent than browning trends. We also show that detected breakpoints in NDVI trends tend to coincide with large fires. Overall, our analyses demonstrate that seasonal trend methods need to be improved against inter-annual variability to quantify changing trends in ecosystem productivity with higher accuracy.
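    One of the better-performing strategies reported above, annual aggregation before trend estimation, is easy to sketch; the NDVI series below is synthetic:

    ```python
    # Annual aggregation before trend estimation on a synthetic NDVI series.
    import numpy as np

    rng = np.random.default_rng(9)
    years = np.repeat(np.arange(2000, 2013), 12)          # 13 years, monthly
    seasonal = 0.2 * np.sin(2 * np.pi * np.tile(np.arange(12), 13) / 12)
    ndvi = 0.5 + 0.004 * (years - 2000) + seasonal + rng.normal(0, 0.05, len(years))

    annual_mean = np.array([ndvi[years == y].mean() for y in np.unique(years)])
    slope, intercept = np.polyfit(np.unique(years), annual_mean, 1)
    print(f"greening trend ~ {slope:.4f} NDVI/yr")
    ```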

  19. Estimating quality weights for EQ-5D health states with the time trade-off method in South Korea.

    PubMed

    Jo, Min-Woo; Yun, Sung-Cheol; Lee, Sang-Il

    2008-12-01

    To estimate quality weights of EQ-5D health states with the time trade-off (TTO) method in the general population of South Korea. A total of 500 respondents valued 42 hypothetical EQ-5D health states using the TTO and visual analog scale. The quality weights for all EQ-5D health states were estimated by a random effects model and compared with those from studies in other countries. Overall estimated quality weights for all EQ-5D health states from this study were highly correlated with those from previous studies, but quality weights of individual states were substantially different from those of their corresponding states in other studies. The Korean value set differed from value sets from other countries. Special caution is needed when a value set from one country is applied to another with a different culture.

  20. Estimating Photosynthetically Available Radiation (PAR) at the Earth's surface from satellite observations

    NASA Technical Reports Server (NTRS)

    Frouin, Robert

    1993-01-01

    Current satellite algorithms to estimate photosynthetically available radiation (PAR) at the earth's surface are reviewed. PAR is deduced either from an insolation estimate or obtained directly from top-of-atmosphere solar radiances. The characteristics of both approaches are contrasted and typical results are presented. The inaccuracies reported, about 10 percent and 6 percent on daily and monthly time scales, respectively, are useful to model oceanic and terrestrial primary productivity. At those time scales variability due to clouds in the ratio of PAR and insolation is reduced, making it possible to deduce PAR directly from insolation climatologies (satellite or other) that are currently available or being produced. Improvements, however, are needed in conditions of broken cloudiness and over ice/snow. If not addressed properly, calibration/validation issues may prevent quantitative use of the PAR estimates in studies of climatic change. The prospects are good for an accurate, long-term climatology of PAR over the globe.

  1. deltaGseg: macrostate estimation via molecular dynamics simulations and multiscale time series analysis.

    PubMed

    Low, Diana H P; Motakis, Efthymios

    2013-10-01

    Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.

  2. Delivery of Modular Lethality via a Parent-Child Concept

    DTIC Science & Technology

    2015-02-01

    [Standard report-documentation boilerplate removed; abstract not recoverable. Surviving fragments (mathematical symbols lost in extraction): "…downrange distance to the target, … is the time of flight, … is the distance of the thruster force from the body center of gravity…"; "…velocity and time of flight can be estimated or measured in flight. These values can be collected in a term, …, and the 2 components of lateral…"]

  3. Computational Sensitivity Analysis for the Aerodynamic Design of Supersonic and Hypersonic Air Vehicles

    DTIC Science & Technology

    2015-05-18

    [Standard report-documentation boilerplate removed; abstract not recoverable. Surviving fragment: "…five times the speed of sound. For reference, the SR-71 Blackbird, the fastest manned airbreathing [aircraft], typically flew at three times the speed of sound…"]

  4. Monitoring Seasonal Evapotranspiration in Vulnerable Agriculture using Time Series VHSR Satellite Data

    NASA Astrophysics Data System (ADS)

    Dalezios, Nicolas; Spyropoulos, Nicos V.; Tarquis, Ana M.

    2015-04-01

    The research work stems from the hypothesis that the seasonal water needs of olive tree farms under drought periods can be estimated by cross-correlating satellite data of high spatial, spectral and temporal (~monthly) resolution, acquired at well-defined intervals of the phenological cycle of the crops, with ground-truth information collected simultaneously with the image acquisitions. The present research demonstrates, for the first time, the coordinated efforts of space engineers, satellite mission control planners, remote sensing scientists and ground teams to record, at specific intervals of the phenological cycle of the trees, the status of the plants both from ground "zero" and from 770 km above the Earth's surface, for subsequent cross-correlation and analysis aimed at estimating seasonal evapotranspiration in a vulnerable agricultural environment. The ETo and ETc derived from the Penman-Monteith equation and reference Kc tables were compared with a new ETd that uses Kc extracted from the time series of satellite data. Several vegetation indices were also used, especially the RedEdge index and a chlorophyll index based on the WorldView-2 RedEdge and second NIR bands, to relate tree status to water and nutrition needs. Keywords: Evapotranspiration, Very High Spatial Resolution - VHSR, time series, remote sensing, vulnerability, agriculture, vegetation indices.

  5. Combined use of flowmeter and time-drawdown data to estimate hydraulic conductivities in layered aquifer systems

    USGS Publications Warehouse

    Hanson, R.T.; Nishikawa, T.

    1996-01-01

    The vertical distribution of hydraulic conductivity in layered aquifer systems commonly is needed for model simulations of ground-water flow and transport. In previous studies, time-drawdown data or flowmeter data were used individually, but not in combination, to estimate hydraulic conductivity. In this study, flowmeter data and time-drawdown data collected from a long-screened production well and nearby monitoring wells are combined to estimate the vertical distribution of hydraulic conductivity in a complex multilayer coastal aquifer system. Flowmeter measurements recorded as a function of depth delineate nonuniform inflow to the wellbore, and this information is used to better discretize the vertical distribution of hydraulic conductivity using analytical and numerical methods. The time-drawdown data complement the flowmeter data by giving insight into the hydraulic response of aquitards when flow rates within the wellbore are below the detection limit of the flowmeter. The combination of these field data allows for the testing of alternative conceptual models of radial flow to the wellbore.

  6. Fast frequency acquisition via adaptive least squares algorithm

    NASA Technical Reports Server (NTRS)

    Kumar, R.

    1986-01-01

    A new least squares algorithm is proposed and investigated for fast frequency and phase acquisition of sinusoids in the presence of noise. This algorithm is a special case of more general, adaptive parameter-estimation techniques. Its advantages are conceptual simplicity, flexibility and applicability to general situations. For example, the frequency to be acquired can be time-varying, and the noise can be non-Gaussian, nonstationary and colored. Because the proposed algorithm can be made recursive in the number of observations, it is not necessary to have a priori knowledge of the received signal-to-noise ratio or to specify the measurement time, as would be required for batch processing techniques such as the fast Fourier transform (FFT). The proposed algorithm improves the frequency estimate on a recursive basis as more and more observations are obtained. When the algorithm is applied in real time, it has the extra advantage that the observations need not be stored. The algorithm also yields a real-time confidence measure of the accuracy of the estimator.
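
    As a hedged illustration of recursive least squares applied to frequency acquisition (not the paper's exact algorithm), one can exploit the fact that a sinusoid satisfies s[n] = 2cos(w)s[n-1] - s[n-2] and recursively estimate the coefficient a = 2cos(w), refining the frequency with every new sample while storing no past observations:

    ```python
    # Hedged sketch (not the paper's algorithm): scalar recursive least squares
    # (RLS) frequency acquisition. A sinusoid obeys s[n] = 2cos(w)s[n-1] - s[n-2],
    # so we recursively estimate a = 2cos(w); with noisy data the estimate is
    # slightly biased, which is acceptable for a sketch.
    import numpy as np

    fs, f_true = 1000.0, 37.0
    n = np.arange(2000)
    s = np.sin(2 * np.pi * f_true / fs * n) \
        + 0.01 * np.random.default_rng(2).normal(size=n.size)

    a, P, lam = 0.0, 1e6, 0.999            # estimate, inverse info, forgetting factor
    for k in range(2, len(s)):
        x, y = s[k - 1], s[k] + s[k - 2]   # regressor and target
        g = P * x / (lam + x * P * x)      # scalar RLS gain
        a += g * (y - a * x)               # update the coefficient estimate
        P = (P - g * x * P) / lam

    f_hat = fs * np.arccos(np.clip(a / 2, -1, 1)) / (2 * np.pi)
    print(f"estimated frequency: {f_hat:.2f} Hz (true {f_true} Hz)")
    ```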

  7. PolyWaTT: A polynomial water travel time estimator based on Derivative Dynamic Time Warping and Perceptually Important Points

    NASA Astrophysics Data System (ADS)

    Claure, Yuri Navarro; Matsubara, Edson Takashi; Padovani, Carlos; Prati, Ronaldo Cristiano

    2018-03-01

    Traditional methods for estimating timing parameters in hydrological science require a rigorous study of the relations of flow resistance, slope, flow regime, watershed size, water velocity, and other local variables. These studies are mostly based on empirical observations, where the timing parameter is estimated using empirically derived formulas. The application of these studies to other locations is not always direct: the locations in which the equations are used should have characteristics comparable to the locations from which the equations were derived. To overcome this barrier, in this work we developed a data-driven approach to estimate timing parameters such as travel time. Our proposal estimates timing parameters using historical data of the location itself, without the need to adapt or use empirical formulas from other locations. The proposal uses only one variable measured at two different locations on the same river (for instance, two river-level measurements, one upstream and the other downstream). The recorded data from each location generate two time series. Our method aligns these two time series using derivative dynamic time warping (DDTW) and perceptually important points (PIP). A polynomial function then generalizes the timing data, inducing a polynomial water travel time estimator called PolyWaTT. To evaluate the potential of our proposal, we applied PolyWaTT to three different watersheds: a floodplain ecosystem located in the part of Brazil known as the Pantanal, the world's largest tropical wetland area; and the Missouri River and the Pearl River in the United States of America. We compared our proposal with empirical formulas and a data-driven state-of-the-art method. The experimental results demonstrate that PolyWaTT showed a lower mean absolute error than all other methods tested in this study, and for longer distances the mean absolute error achieved by PolyWaTT is three times smaller than that of the empirical formulas.
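
    As a hedged sketch of the alignment idea (plain dynamic time warping here, rather than the paper's derivative DTW with perceptually important points), the travel time can be read off the warping path that aligns an upstream and a downstream series:

    ```python
    # Hedged sketch: plain DTW aligning an upstream and a downstream stage
    # series; the mean index offset along the warping path estimates travel
    # time. The paper's DDTW+PIP refinements are not reproduced.
    import numpy as np

    def dtw_path(a, b):
        """Classic O(len(a)*len(b)) DTW; returns the optimal warping path."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                D[i, j] = abs(a[i - 1] - b[j - 1]) + min(
                    D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        path, i, j = [], n, m
        while i > 0 and j > 0:             # backtrack the cheapest alignment
            path.append((i - 1, j - 1))
            i, j = min([(i - 1, j), (i, j - 1), (i - 1, j - 1)],
                       key=lambda t: D[t])
        return path[::-1]

    t = np.arange(300)
    upstream = np.exp(-((t - 100) / 20.0) ** 2)    # synthetic flood wave
    downstream = np.exp(-((t - 140) / 25.0) ** 2)  # same wave, ~40 steps later

    path = dtw_path(upstream, downstream)
    lag = np.mean([j - i for i, j in path])
    print(f"estimated travel time: {lag:.1f} time steps")  # approximately 40
    ```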

  8. Mixed H2/H∞-Based Fusion Estimation for Energy-Limited Multi-Sensors in Wearable Body Networks

    PubMed Central

    Li, Chao; Zhang, Zhenjiang; Chao, Han-Chieh

    2017-01-01

    In wireless sensor networks, sensor nodes collect large amounts of data in each time period. If all of the data were transmitted to a Fusion Center (FC), the power of the sensor nodes would be depleted rapidly. On the other hand, the data also need to be filtered to remove noise. Therefore, an efficient fusion estimation model that can save the energy of the sensor nodes while maintaining high accuracy is needed. This paper proposes a novel mixed H2/H∞-based energy-efficient fusion estimation model (MHEEFE) for energy-limited Wearable Body Networks. In the proposed model, the communication cost is first reduced efficiently while the estimation accuracy is preserved. The parameters of the quantization method are then discussed and determined by an optimization method with some prior knowledge. In addition, calculation methods for important parameters are investigated, which make the final estimates more stable. Finally, an iteration-based weight-calculation algorithm is presented, which improves the fault tolerance of the final estimate. In simulations, the impacts of some pivotal parameters are discussed. Compared with other related models, the MHEEFE shows better performance in accuracy, energy efficiency and fault tolerance. PMID:29280950

  9. Space-Time Smoothing of Complex Survey Data: Small Area Estimation for Child Mortality.

    PubMed

    Mercer, Laina D; Wakefield, Jon; Pantazis, Athena; Lutambi, Angelina M; Masanja, Honorati; Clark, Samuel

    2015-12-01

    Many people living in low- and middle-income countries are not covered by civil registration and vital statistics systems. Consequently, a wide variety of other types of data, including many household sample surveys, are used to estimate health and population indicators. In this paper we combine data from sample surveys and demographic surveillance systems to produce small area estimates of child mortality through time. Small area estimates are necessary to understand geographical heterogeneity in health indicators when full-coverage vital statistics are not available. For this endeavor, spatio-temporal smoothing is beneficial to alleviate problems of data sparsity. The use of conventional hierarchical models requires careful thought, since the survey weights may need to be considered to alleviate bias due to non-random sampling and non-response. The application that motivated this work is estimation of child mortality rates in five-year time intervals in regions of Tanzania. Data come from Demographic and Health Surveys conducted over the period 1991-2010 and two demographic surveillance system sites. We derive a variance estimator of under-five child mortality that accounts for the complex survey weighting. For our application, the hierarchical models we consider include random effects for area, time and survey, and we compare models using a variety of measures including the conditional predictive ordinate (CPO). The method we propose is implemented via the fast and accurate integrated nested Laplace approximation (INLA).

  10. Estimation for general birth-death processes

    PubMed Central

    Crawford, Forrest W.; Minin, Vladimir N.; Suchard, Marc A.

    2013-01-01

    Birth-death processes (BDPs) are continuous-time Markov chains that track the number of “particles” in a system over time. While widely used in population biology, genetics and ecology, statistical inference of the instantaneous particle birth and death rates remains largely limited to restrictive linear BDPs in which per-particle birth and death rates are constant. Researchers often observe the number of particles at discrete times, necessitating data augmentation procedures such as expectation-maximization (EM) to find maximum likelihood estimates. For BDPs on finite state-spaces, there are powerful matrix methods for computing the conditional expectations needed for the E-step of the EM algorithm. For BDPs on infinite state-spaces, closed-form solutions for the E-step are available for some linear models, but most previous work has resorted to time-consuming simulation. Remarkably, we show that the E-step conditional expectations can be expressed as convolutions of computable transition probabilities for any general BDP with arbitrary rates. This important observation, along with a convenient continued fraction representation of the Laplace transforms of the transition probabilities, allows for novel and efficient computation of the conditional expectations for all BDPs, eliminating the need for truncation of the state-space or costly simulation. We use this insight to derive EM algorithms that yield maximum likelihood estimation for general BDPs characterized by various rate models, including generalized linear models. We show that our Laplace convolution technique outperforms competing methods when they are available and demonstrate a technique to accelerate EM algorithm convergence. We validate our approach using synthetic data and then apply our methods to cancer cell growth and estimation of mutation parameters in microsatellite evolution. PMID:25328261

  11. Estimation for general birth-death processes.

    PubMed

    Crawford, Forrest W; Minin, Vladimir N; Suchard, Marc A

    2014-04-01

    Birth-death processes (BDPs) are continuous-time Markov chains that track the number of "particles" in a system over time. While widely used in population biology, genetics and ecology, statistical inference of the instantaneous particle birth and death rates remains largely limited to restrictive linear BDPs in which per-particle birth and death rates are constant. Researchers often observe the number of particles at discrete times, necessitating data augmentation procedures such as expectation-maximization (EM) to find maximum likelihood estimates. For BDPs on finite state-spaces, there are powerful matrix methods for computing the conditional expectations needed for the E-step of the EM algorithm. For BDPs on infinite state-spaces, closed-form solutions for the E-step are available for some linear models, but most previous work has resorted to time-consuming simulation. Remarkably, we show that the E-step conditional expectations can be expressed as convolutions of computable transition probabilities for any general BDP with arbitrary rates. This important observation, along with a convenient continued fraction representation of the Laplace transforms of the transition probabilities, allows for novel and efficient computation of the conditional expectations for all BDPs, eliminating the need for truncation of the state-space or costly simulation. We use this insight to derive EM algorithms that yield maximum likelihood estimation for general BDPs characterized by various rate models, including generalized linear models. We show that our Laplace convolution technique outperforms competing methods when they are available and demonstrate a technique to accelerate EM algorithm convergence. We validate our approach using synthetic data and then apply our methods to cancer cell growth and estimation of mutation parameters in microsatellite evolution.

  12. Combining statistics from two national complex surveys to estimate injury rates per hour exposed and variance by activity in the USA.

    PubMed

    Lin, Tin-Chi; Marucci-Wellman, Helen R; Willetts, Joanna L; Brennan, Melanye J; Verma, Santosh K

    2016-12-01

    A common issue in descriptive injury epidemiology is that in order to calculate injury rates that account for the time spent in an activity, both injury cases and exposure time of specific activities need to be collected. In reality, few national surveys have this capacity. To address this issue, we combined statistics from two different national complex surveys as inputs for the numerator and denominator to estimate injury rate, accounting for the time spent in specific activities and included a procedure to estimate variance using the combined surveys. The 2010 National Health Interview Survey (NHIS) was used to quantify injuries, and the 2010 American Time Use Survey (ATUS) was used to quantify time of exposure to specific activities. The injury rate was estimated by dividing the average number of injuries (from NHIS) by average exposure hours (from ATUS), both measured for specific activities. The variance was calculated using the 'delta method', a general method for variance estimation with complex surveys. Among the five types of injuries examined, 'sport and exercise' had the highest rate (12.64 injuries per 100 000 h), followed by 'working around house/yard' (6.14), driving/riding a motor vehicle (2.98), working (1.45) and sleeping/resting/eating/drinking (0.23). The results show a ranking of injury rate by activity quite different from estimates using population as the denominator. Our approach produces an estimate of injury risk which includes activity exposure time and may more reliably reflect the underlying injury risks, offering an alternative method for injury surveillance and research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
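
    A minimal sketch of the ratio construction and its delta-method variance, with hypothetical inputs rather than the actual NHIS/ATUS estimates (the two surveys are independent, so the cross term vanishes):

    ```python
    # Hedged sketch of the paper's approach: an injury rate formed as the ratio
    # of a numerator from one survey (injuries) and a denominator from another
    # (exposure hours), with a delta-method variance. The inputs below are
    # hypothetical point estimates and standard errors, not NHIS/ATUS values.
    import math

    def rate_per_100k_hours(x, se_x, y, se_y):
        """Rate R = X/Y with delta-method SE; X and Y from independent surveys."""
        r = x / y
        var_r = (se_x ** 2) / y ** 2 + (x ** 2) * (se_y ** 2) / y ** 4
        return 1e5 * r, 1e5 * math.sqrt(var_r)

    # hypothetical: 3.2e6 annual injuries (SE 0.3e6) over 2.5e10 hours (SE 0.1e10)
    rate, se = rate_per_100k_hours(3.2e6, 0.3e6, 2.5e10, 0.1e10)
    print(f"rate = {rate:.2f} per 100,000 h (SE {se:.2f})")
    ```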

  13. Evaluation of Bayesian estimation of a hidden continuous-time Markov chain model with application to threshold violation in water-quality indicators

    USGS Publications Warehouse

    Deviney, Frank A.; Rice, Karen; Brown, Donald E.

    2012-01-01

    Natural resource managers require information concerning the frequency, duration, and long-term probability of occurrence of water-quality indicator (WQI) violations of defined thresholds. The timing of these threshold crossings often is hidden from the observer, who is restricted to relatively infrequent observations. Here, a model for the hidden process is linked with a model for the observations, and the parameters describing duration, return period, and long-term probability of occurrence are estimated using Bayesian methods. A simulation experiment is performed to evaluate the approach under scenarios based on the equivalent of a total monitoring period of 5-30 years and an observation frequency of 1-50 observations per year. Given a constant threshold-crossing rate, accuracy and precision of parameter estimates increased with longer total monitoring period and more frequent observations. Given a fixed monitoring period and observation frequency, accuracy and precision of parameter estimates increased with longer times between threshold crossings. For most cases where the long-term probability of being in violation is greater than 0.10, it was determined that at least 600 observations are needed to achieve precise estimates. An application of the approach is presented using 22 years of quasi-weekly observations of acid-neutralizing capacity from Deep Run, a stream in Shenandoah National Park, Virginia. The time series also was sub-sampled to simulate monthly and semi-monthly sampling protocols. Estimates of the long-term probability of violation were unbiased regardless of sampling frequency; however, the expected duration and return period were overestimated using the sub-sampled time series relative to the full quasi-weekly time series.

  14. Near real-time estimation of burned area using VIIRS 375 m active fire product

    NASA Astrophysics Data System (ADS)

    Oliva, P.; Schroeder, W.

    2016-12-01

    Every year, more than 300 million hectares of land burn globally, causing significant ecological and economic consequences, as well as climatological effects resulting from fire emissions. In recent decades, burned area estimates generated from satellite data have provided systematic global information for ecological analyses of fire impacts, climate and carbon cycle models, and fire regime studies, among many others. However, there is still a need for near real-time burned area estimates in order to assess the impacts of fire and to estimate smoke and emissions. The enhanced characteristics of the Visible Infrared Imaging Radiometer Suite (VIIRS) 375 m channels on board the Suomi National Polar-orbiting Partnership (S-NPP) make it possible to use near real-time active fire detection data for burned area estimation. In this study, consecutive VIIRS 375 m active fire detections were aggregated to produce VIIRS 375 m burned area (BA) estimates over ten ecologically diverse study areas. The accuracy of the BA estimates was assessed by comparison with a supervised burned area classification from Landsat-8. The performance of the VIIRS 375 m BA estimates depended on ecosystem characteristics and fire behavior. Higher accuracy was observed in forested areas characterized by large, long-duration fires, while grasslands, savannas and agricultural areas showed the highest omission and commission errors. Complementing those analyses, we estimated the burned area of the largest fires in Oregon and Washington states during 2015 and of the Fort McMurray fire in Canada in 2016. The results showed good agreement with NIROPS airborne fire perimeters, demonstrating that the VIIRS 375 m BA estimates can be used for near real-time assessment of fire effects.

  15. Why are You Late?: Investigating the Role of Time Management in Time-Based Prospective Memory

    PubMed Central

    Waldum, Emily R; McDaniel, Mark A.

    2016-01-01

    Time-based prospective memory (TBPM) tasks are those that are to be performed at a specific future time. In contrast to typical laboratory TBPM tasks (e.g., "hit the 'z' key every 5 minutes"), many real-world TBPM tasks require more complex time-management processes. For instance, to attend an appointment on time, one must estimate the duration of the drive to the appointment and then use this estimate to create and execute a secondary TBPM intention (e.g., "I need to start driving by 1:30 to make my 2:00 appointment on time"). Under- and overestimates of drive time can lead to inefficient TBPM performance, with the former leading to missed appointments and the latter to long stints in the waiting room. Despite the common occurrence of complex TBPM tasks in everyday life, to date, no studies have investigated how components of time management, including time estimation, affect behavior in such complex TBPM tasks. Therefore, the current study aimed to investigate timing biases in both older and younger adults and, further, to determine how such biases, along with additional time-management components including planning and plan fidelity, influence complex TBPM performance. The results suggest for the first time that younger and older adults do not always use similar timing strategies and, as a result, can produce differential timing biases under the exact same environmental conditions. These timing biases, in turn, play a vital role in how efficiently both younger and older adults perform a later TBPM task that requires them to use their earlier time estimate. PMID:27336325

  16. Estimation of effective connectivity using multi-layer perceptron artificial neural network.

    PubMed

    Talebi, Nasibeh; Nasrabadi, Ali Motie; Mohammad-Rezazadeh, Iman

    2018-02-01

    Studies of interactions between brain regions estimate effective connectivity, usually based on causality inferences made on the basis of temporal precedence. In this study, the causal relationship is modeled by a multi-layer perceptron feed-forward artificial neural network, because of the ANN's ability to generate appropriate input-output mappings and to learn from training examples without the need for detailed knowledge of the underlying system. At any time instant, past samples of the data are placed at the network input, and the subsequent values are predicted at its output. To estimate the strength of interactions, a "causality coefficient" measure is defined based on the network structure, the connecting weights and the parameters of the hidden-layer activation function. Simulation analysis demonstrates that the method, called CREANN (Causal Relationship Estimation by Artificial Neural Network), can estimate time-invariant and time-varying effective connectivity in terms of MVAR coefficients. The method shows robustness with respect to the noise level of the data. Furthermore, the estimates are not significantly influenced by the model order (the time lag considered) or by different initial conditions (initial random weights and parameters of the network). CREANN is also applied to EEG data collected during a memory recognition task. The results indicate that it can reveal changes in information flow between brain regions involved in the episodic memory retrieval process. These convincing results emphasize that CREANN can be used as an appropriate method to estimate causal relationships among brain signals.
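
    As a hedged sketch of the underlying idea (not the paper's causality-coefficient formula), an MLP can be trained to predict a signal from lagged samples, with the error reduction from adding the candidate source's lags serving as evidence of directed influence:

    ```python
    # Hedged sketch of the idea behind CREANN: an MLP predicts y[t] from lagged
    # samples; if adding x's lags reduces prediction error, that is evidence of
    # directed influence x -> y. Data and network sizes are illustrative.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    n, p = 3000, 5
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):                    # y is driven by past x
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

    def lagged(*signals, lags=p):
        """Stack lags 1..lags of each signal as regression features."""
        return np.column_stack([s[lags - k: len(s) - k]
                                for s in signals for k in range(1, lags + 1)])

    target = y[p:]
    full = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(lagged(y, x), target)
    restricted = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                              random_state=0).fit(lagged(y), target)

    err_full = np.mean((full.predict(lagged(y, x)) - target) ** 2)
    err_restricted = np.mean((restricted.predict(lagged(y)) - target) ** 2)
    print(f"MSE with x lags {err_full:.3f} vs without {err_restricted:.3f}")
    ```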

  17. Developing the Navy’s NC Flying Boats: Transforming Aeronautical Engineering for the First Transatlantic Flight

    DTIC Science & Technology

    2011-12-01


  18. Evaluation of Factors on the Patterns of Ship Movement and Predictability of Future Ship Location in the Gulf of Mexico

    DTIC Science & Technology

    2017-03-01


  19. Identification of Genes and Genetic Variants Associated with Poor Treatment Response in Patients with Prostate Cancer

    DTIC Science & Technology

    2013-10-01

    [Standard report-documentation boilerplate removed; abstract not recoverable. Surviving fragment: "…variants which explain much more than a small amount of risk for prostate cancer among a small population of men. Even less progress has been made…"]

  20. Imperishable Networks: Complexity Theory and Communication Networking-Bridging the Gap Between Algorithmic Information Theory and Communication Networking

    DTIC Science & Technology

    2003-04-01

    [Standard report-documentation boilerplate removed; abstract not recoverable. Surviving fragment: "…generally considered to be passive data. Instead the genetic material should be capable of being algorithmic information, that is, program code or…"]

  1. Identifying Enterprise Leverage Points in Defense Acquisition Program Performance

    DTIC Science & Technology

    2009-09-01


  2. Biomarker Discovery in Gulf War Veterans: Development of a War Illness Diagnostic Panel

    DTIC Science & Technology

    2014-10-17


  3. Estimating root collar diameter growth for multi-stem western woodland tree species on remeasured forest inventory and analysis plots

    Treesearch

    Michael T. Thompson; Maggie Toone

    2012-01-01

    Tree diameter growth models are widely used in many forestry applications, often to predict tree size at a future point in time. Also, there are instances where projections of past diameters are needed. An individual tree model has been developed to estimate diameter growth of multi-stem woodland tree species where the diameter is measured at root collar. The model was...

  4. Design and Analysis of Low Frequency Communication System in Persian Gulf

    DTIC Science & Technology

    2008-09-01


  5. Nonlinear Oscillations of Microscale Piezoelectric Resonators and Resonator Arrays

    DTIC Science & Technology

    2006-06-30


  6. Development of a Tetrathioether (S4) Bifunctional Chelate System for Rh-105

    DTIC Science & Technology

    2013-07-01


  7. Development of a Tetrathioether (S4) Bifunctional Chelate System for Rh-105

    DTIC Science & Technology

    2013-06-01


  8. sarA as a Target for the Treatment and Prevention of Staphylococcal Biofilm-Associated Infection

    DTIC Science & Technology

    2015-02-01

    [Standard report-documentation boilerplate removed; abstract not recoverable. Surviving fragment is a citation: "…M.S., Compadre, C.M. 2011. Sesquiterpene lactones from Gynoxys verrucosa and their anti-MRSA activity. Journal of Ethnopharmacology, 137:1055-1059."]

  9. Determination of time zero from a charged particle detector

    DOEpatents

    Green, Jesse Andrew [Los Alamos, NM

    2011-03-15

    A method, system and computer program are used to determine a linear track that best fits the most likely or expected path of a charged particle passing through a charged particle detector having a plurality of drift cells. Hit signals from the charged particle detector are associated with a particular charged particle track. An initial estimate of time zero is made from these hit signals, and linear tracks are then fit to the drift radii for each particular time-zero estimate. The linear track with the best fit is then identified and selected, and errors in fit and tracking parameters are computed. By adopting this method and system, the large and expensive fast detectors otherwise needed to determine time zero in charged particle detectors can be avoided.

  10. Estimating Angle-of-Arrival and Time-of-Flight for Multipath Components Using WiFi Channel State Information.

    PubMed

    Ahmed, Afaz Uddin; Arablouei, Reza; Hoog, Frank de; Kusy, Branislav; Jurdak, Raja; Bergmann, Neil

    2018-05-29

    Channel state information (CSI) collected during WiFi packet transmissions can be used for localization of commodity WiFi devices in indoor environments with multipath propagation. To this end, the angle of arrival (AoA) and time of flight (ToF) of all dominant multipath components need to be estimated. A two-dimensional (2D) version of the multiple signal classification (MUSIC) algorithm has been shown to solve this problem using a 2D grid search, which is computationally expensive and therefore not suited to real-time localization. In this paper, we propose using a modified matrix pencil (MMP) algorithm instead. Specifically, we show that the AoA and ToF estimates can be found independently of each other using the one-dimensional (1D) MMP algorithm, and that the results can be accurately paired to obtain the AoA-ToF pairs of all multipath components. Thus, the 2D estimation problem reduces to running 1D estimation multiple times, substantially reducing the computational complexity. We identify and resolve the problem of degenerate performance when two or more multipath components have the same AoA. In addition, we propose a packet aggregation model that uses the CSI data from multiple packets to improve performance under noisy conditions. Simulation results show that our algorithm achieves a two-orders-of-magnitude reduction in computational time over the 2D MUSIC algorithm while achieving similar accuracy. The high accuracy and low computational complexity of our approach make it suitable for applications that require location estimation to run on resource-constrained embedded devices in real time.
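
    As a hedged illustration of the 1D estimator at the core of this approach, the basic (unmodified) matrix pencil method recovers the frequencies of a sum of complex exponentials, which is the mathematical form of both the AoA and ToF subproblems; the paper's modifications, pairing step, and packet aggregation are not reproduced:

    ```python
    # Hedged sketch: the basic matrix pencil method for a sum of complex
    # exponentials, the core 1D estimator behind AoA/ToF extraction from CSI.
    import numpy as np

    rng = np.random.default_rng(4)
    N, M = 64, 20                              # samples, pencil parameter
    true_freqs = np.array([0.12, 0.31])        # normalized frequencies
    n = np.arange(N)
    y = sum(np.exp(2j * np.pi * f * n) for f in true_freqs) \
        + 0.01 * rng.normal(size=N)

    # Hankel data matrix; Y0/Y1 are its shifted sub-matrices (the "pencil").
    Y = np.array([y[i:i + M + 1] for i in range(N - M)])
    Y0, Y1 = Y[:, :-1], Y[:, 1:]

    # Nonzero eigenvalues of pinv(Y0) @ Y1 approximate z_k = exp(j*2*pi*f_k);
    # noise eigenvalues have small magnitude, so keep the two largest.
    z = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)
    idx = np.argsort(-np.abs(z))[:2]
    est = np.sort(np.mod(np.angle(z[idx]) / (2 * np.pi), 1.0))
    print(est)                                 # approximately [0.12, 0.31]
    ```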

  11. A technique for estimating ground-water levels at sites in Rhode Island from observation-well data

    USGS Publications Warehouse

    Socolow, Roy S.; Frimpter, Michael H.; Turtora, Michael; Bell, Richard W.

    1994-01-01

    Estimates of future high, median, and low ground-water levels are needed for engineering and architectural design decisions and for appropriate selection of land uses. For example, the failure of individual underground sewage-disposal systems due to high ground-water levels can be prevented if accurate water-level estimates are available. Estimates of extreme or average conditions are needed because short-duration preconstruction observations are unlikely to be adequately representative. Water-level records for 40 U.S. Geological Survey observation wells in Rhode Island were used to describe and interpret water-level fluctuations. The maximum annual range of water levels averages about 6 feet in sand and gravel and 11 feet in till. These data were used to develop equations for estimating future high, median, and low water levels on the basis of any one measurement at a site and records of water levels at observation wells used as indexes. The estimating technique relies on several assumptions about temporal and spatial variations: (1) water levels will vary in the future as they have in the past; (2) water levels fluctuate seasonally; (3) ground-water fluctuations depend on site geology; and (4) water levels throughout Rhode Island are subject to similar precipitation and climate. Comparison of 6,697 estimates of high, median, and low water levels (depth to water level exceeded 95, 50, and 5 percent of the time, respectively) with the levels actually exceeded 95, 50, and 5 percent of the time at 14 sites unaffected by pumping or other unexplained influences yielded mean squared errors ranging from 0.34 to 1.53 square feet, 0.30 to 1.22 square feet, and 0.32 to 2.55 square feet, respectively. (USGS)

  12. Spatial downscaling and correction of precipitation and temperature time series to high resolution hydrological response units in the Canadian Rocky Mountains

    NASA Astrophysics Data System (ADS)

    Kienzle, Stefan

    2015-04-01

    Precipitation is the central driving force of most hydrological processes and is also the most variable element of the hydrological cycle. Because the precipitation-to-runoff ratio is non-linear, errors in precipitation estimates are amplified in streamflow simulations. Therefore, accurate estimates of areal precipitation are essential for watershed models and relevant impact studies. A procedure is presented to demonstrate the spatial distribution of daily precipitation and temperature estimates across the Rocky Mountains within the framework of the ACRU agro-hydrological modelling system. ACRU (Schulze, 1995) is a physical-conceptual, semi-distributed hydrological modelling system designed to be responsive to changes in land use and climate. The model has been updated to include specific high-mountain and cold-climate routines and is applied to simulate impacts of land cover and climate change on the hydrological behaviour of numerous Rocky Mountain watersheds in Alberta, Canada. Both air temperature and precipitation time series need to be downscaled to hydrological response units (HRUs), as these are the spatial modelling units of the model. Accurate estimation of daily air temperatures is critical for the separation of rain and snow. The precipitation estimation procedure integrates a spatially distributed daily precipitation database for the period 1950 to 2010 at a scale of 10 by 10 km with a 1971-2000 climate normal database available at 2 by 2 km (PRISM). The resulting daily precipitation time series are further downscaled to the spatial resolution of hydrological response units, defined by 100 m elevation bands, land cover, and solar radiation, which have an average size of about 15 km2. As snow measurements are known to have a potential under-catch of up to 40%, snowfall may need to be further increased using a procedure by Richter (1995). Finally, precipitation input to HRUs with slopes steeper than 10% needs an additional correction, because the true sloped surface is larger than the planimetric area derived from a GIS. Omitting the correction for sloped areas would result in incorrect calculations of interception volumes, soil moisture storages, groundwater recharge rates, actual evapotranspiration volumes, and runoff coefficients. Daily minimum and maximum air temperatures are estimated for each HRU by downscaling the 10 km time series to the HRUs by (a) applying monthly mean lapse rates, estimated either from surrounding climate stations or from the PRISM climate normal dataset in combination with a digital elevation model, (b) adjusting for the aspect of the HRU based on monthly mean incoming solar radiation, and (c) adjusting for canopy cover using monthly mean leaf area indices. Precipitation estimates can be verified using independent snow water equivalent measurements derived from snow pillow or snow course observations, while temperature estimates are verified against independent temperature measurements from climate stations or fire observation towers.
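
    A minimal sketch of the elevation step of the temperature downscaling, assuming a simple monthly lapse rate and illustrative numbers (the aspect and canopy-cover adjustments described above are omitted):

    ```python
    # Hedged sketch of the elevation step of temperature downscaling: a monthly
    # lapse rate moves a 10 km grid value to HRU elevation. All values are
    # illustrative, not taken from the study.
    def downscale_temp(t_grid_c, elev_grid_m, elev_hru_m, lapse_c_per_100m):
        """Air temperature at the HRU, lapsed from the grid-cell elevation."""
        return t_grid_c + lapse_c_per_100m * (elev_grid_m - elev_hru_m) / 100.0

    # illustrative January values: grid cell at 1200 m, HRU band at 1850 m,
    # temperature falling 0.55 C per 100 m of ascent
    t_hru = downscale_temp(t_grid_c=-8.0, elev_grid_m=1200.0,
                           elev_hru_m=1850.0, lapse_c_per_100m=0.55)
    print(f"{t_hru:.1f} C")   # colder at the higher HRU: about -11.6 C
    ```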

  13. Real-time estimation of BDS/GPS high-rate satellite clock offsets using sequential least squares

    NASA Astrophysics Data System (ADS)

    Fu, Wenju; Yang, Yuanxi; Zhang, Qin; Huang, Guanwen

    2018-07-01

    The real-time precise satellite clock product is one of the key prerequisites for real-time Precise Point Positioning (PPP). The accuracy of the 24-hour predicted satellite clock product, with a 15 min sampling interval and an update interval of 6 h, provided by the International GNSS Service (IGS) is only 3 ns, which cannot meet the needs of all real-time PPP applications. Real-time estimation of high-rate satellite clock offsets is an efficient method for improving the accuracy. In this paper, a sequential least squares method for estimating real-time satellite clock offsets at a high sample rate is proposed; it improves computational speed by applying an optimized sparse matrix operation to compute the normal equation and by using special measures to take full advantage of modern computer power. The method is first applied to the BeiDou Navigation Satellite System (BDS) and provides real-time estimation at a 1 s sample rate. The results show that the time taken to process a single epoch is about 0.12 s using 28 stations. The Standard Deviation (STD) and Root Mean Square (RMS) of the real-time estimated BDS satellite clock offsets are 0.17 ns and 0.44 ns, respectively, when compared with the final clock products of the German Research Centre for Geosciences (GFZ). The positioning performance of the real-time estimated satellite clock offsets is also evaluated: the RMSs of real-time BDS kinematic PPP in the east, north, and vertical components are 7.6 cm, 6.4 cm and 19.6 cm, respectively. The method is also applied to the Global Positioning System (GPS) at a 10 s sample rate, and the computational time for most epochs is less than 1.5 s with 75 stations. The STD and RMS of the real-time estimated GPS satellite clocks are 0.11 ns and 0.27 ns, respectively. Accuracies of 5.6 cm, 2.6 cm and 7.9 cm in the east, north, and vertical components are achieved for real-time GPS kinematic PPP.
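
    As a hedged sketch of epoch-wise sequential least squares (with a generic dense design matrix standing in for the paper's sparse GNSS observation model), the normal equations can be accumulated so that each epoch's solution reuses all past data without reprocessing it:

    ```python
    # Hedged sketch of epoch-wise sequential least squares: each epoch brings
    # observation equations A_k x = y_k; accumulating N += A^T A and b += A^T y
    # lets the estimate be refreshed every epoch without storing past epochs.
    # The paper's sparse-matrix optimizations and the actual clock-offset
    # observation model are not reproduced.
    import numpy as np

    rng = np.random.default_rng(5)
    n_params = 4
    x_true = rng.normal(size=n_params)

    N = np.zeros((n_params, n_params))
    b = np.zeros(n_params)
    for epoch in range(50):
        A = rng.normal(size=(10, n_params))          # per-epoch design matrix
        y = A @ x_true + 0.01 * rng.normal(size=10)  # per-epoch observations
        N += A.T @ A                                 # accumulate normal matrix
        b += A.T @ y                                 # accumulate right-hand side
        x_hat = np.linalg.solve(N, b)                # current epoch's solution

    print(np.abs(x_hat - x_true).max())              # small residual error
    ```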

  14. An evaluation of rapid methods for monitoring vegetation characteristics of wetland bird habitat

    USGS Publications Warehouse

    Tavernia, Brian G.; Lyons, James E.; Loges, Brian W.; Wilson, Andrew; Collazo, Jaime A.; Runge, Michael C.

    2016-01-01

    Wetland managers benefit from monitoring data of sufficient precision and accuracy to assess wildlife habitat conditions and to evaluate and learn from past management decisions. For large-scale monitoring programs focused on waterbirds (waterfowl, wading birds, secretive marsh birds, and shorebirds), the precision and accuracy of habitat measurements must be balanced with fiscal and logistic constraints. We evaluated a set of protocols for rapid, visual estimates of key waterbird habitat characteristics made from the wetland perimeter against estimates from (1) plots sampled within wetlands and (2) cover maps made from aerial photographs. Estimated percent cover of annuals and perennials using the perimeter-based protocol fell within 10 percent of plot-based estimates, and percent cover estimates for seven vegetation height classes were within 20 percent of plot-based estimates. Perimeter-based estimates of total emergent vegetation cover did not differ significantly from cover-map estimates. Post-hoc analyses revealed evidence of observer effects in estimates of annual and perennial cover and vegetation height. The median time required to complete the perimeter-based methods was less than 7 percent of the time needed for the intensive plot-based methods. Our results show that rapid, perimeter-based assessments, which increase sample size and efficiency, provide vegetation estimates comparable to those of more intensive methods.

  15. Estimating anesthesia and surgical procedure times from medicare anesthesia claims.

    PubMed

    Silber, Jeffrey H; Rosenbaum, Paul R; Zhang, Xuemei; Even-Shoshan, Orit

    2007-02-01

    Procedure times are important variables that often are included in studies of quality and efficiency. However, due to the need for costly chart review, most studies are limited to single-institution analyses. In this article, the authors describe how well the anesthesia claim from Medicare can estimate chart times. The authors abstracted information on time of induction and entrance to the recovery room ("anesthesia chart time") from the charts of 1,931 patients who underwent general and orthopedic surgical procedures in Pennsylvania. The authors then merged the associated bills from claims data supplied from Medicare (Part B data) that included a variable denoting the time in minutes for the anesthesia service. The authors also investigated the time from incision to closure ("surgical chart time") on a subset of 1,888 patients. Anesthesia claim time from Medicare was highly predictive of anesthesia chart time (Kendall's rank correlation tau = 0.85, P < 0.0001, median absolute error = 5.1 min) but somewhat less predictive of surgical chart time (Kendall's tau = 0.73, P < 0.0001, median absolute error = 13.8 min). When predicting chart time from Medicare bills, variables reflecting procedure type, comorbidities, and hospital type did not significantly improve the prediction, suggesting that errors in predicting the chart time from the anesthesia bill time are not related to these factors; however, the individual hospital did have some influence on these estimates. Anesthesia chart time can be well estimated using Medicare claims, thereby facilitating studies with vastly larger sample sizes and much lower costs of data collection.

  16. Real-time estimation system for seismic-intensity exposed-population

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Nakamura, H.; Kunugi, T.; Suzuki, W.; Fujiwara, H.

    2013-12-01

    For an appropriate first response to an earthquake, risk (damage) information evaluated in real time is as important as hazard (ground motion) information. To meet this need, we are developing a real-time estimation system (J-RISQ) for exposed population and earthquake damage to buildings. We plan to open the web page of estimated exposed population to the public in autumn. When an earthquake occurs, seismic intensities are calculated at each observation station and sent to the Data Management Center (DMC) at different times. For rapid estimation, the system does not wait for data from all stations but begins the first estimation when the number of stations observing a seismic intensity of 2.5 or larger exceeds a threshold. Estimations are updated several times using all the data available at each moment. The spatial distribution of seismic intensity in 250 m meshes is estimated from the site amplification factors of the surface layers and the observed data. Using this intensity distribution, the exposed population is estimated from population data for each mesh. Exposed populations for municipalities and prefectures are estimated by summing the exposures of the included meshes and are rounded appropriately, taking estimation precision into consideration. The estimated intensities for major cities are shown as histograms, which indicate the variation of the estimated values within a city together with the observed maximum intensity. The variation is mainly caused by differences in site amplification factors; intensities estimated for meshes with large amplification factors are sometimes larger than the maximum value observed in the city. The estimated results can be seen on the web site just after an earthquake, and results for past earthquakes can be searched by keywords such as date, magnitude, seismic intensity and source area. A one-page summary report in Portable Document Format is also available. The system has been in experimental operation since 2010 and had performed real-time estimations for more than 670 earthquakes by July 2012. For about 75% of these earthquakes, the first-estimation e-mail was sent less than one minute after data were received from the first triggered station, so the rapidity of the system is satisfactory. Uploading the PDF report to the web site takes approximately 30 additional seconds.
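
    A minimal sketch of the exposure computation on synthetic mesh data (J-RISQ's intensity interpolation and rounding rules are not reproduced):

    ```python
    # Hedged sketch: with an estimated seismic intensity and a population count
    # per 250 m mesh, the exposed population at each intensity level is the sum
    # over meshes at or above that level. All data here are synthetic.
    import numpy as np

    rng = np.random.default_rng(6)
    intensity = rng.uniform(2.0, 6.5, size=10000)    # estimated intensity per mesh
    population = rng.integers(0, 500, size=10000)    # residents per mesh

    for level in (3.5, 4.5, 5.0, 5.5, 6.0):
        exposed = population[intensity >= level].sum()
        print(f"intensity >= {level}: {exposed:,} people")
    ```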

  17. A Simulation Model for Purchasing Duplicate Copies in a Library

    ERIC Educational Resources Information Center

    Arms, W. Y.; Walter, T. P.

    1974-01-01

    A method of estimating the number of duplicate copies of books needed, based on a computer simulation that takes into account the number of copies available, the number of loans, total underlying demand, satisfaction level, and percentage of time on the shelf. (LS)
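
    As a hedged illustration of the kind of simulation described (with illustrative parameters, not the article's model), a Monte Carlo run can relate the number of copies to the satisfaction level:

    ```python
    # Hedged sketch: Poisson borrowing requests against c copies with a fixed
    # loan period; "satisfaction level" is the fraction of requests finding a
    # copy on the shelf. Parameters are illustrative only.
    import heapq
    import numpy as np

    def satisfaction(copies, demand_per_day, loan_days, horizon_days=365, seed=0):
        rng = np.random.default_rng(seed)
        returns, satisfied, total = [], 0, 0   # min-heap of pending return times
        t = 0.0
        while t < horizon_days:
            t += rng.exponential(1.0 / demand_per_day)   # next request arrives
            while returns and returns[0] <= t:           # shelve returned copies
                heapq.heappop(returns)
            total += 1
            if len(returns) < copies:                    # a copy is on the shelf
                satisfied += 1
                heapq.heappush(returns, t + loan_days)
        return satisfied / total

    for c in (1, 2, 3, 4):
        print(c, "copies ->", round(satisfaction(c, demand_per_day=0.2,
                                                 loan_days=14), 2))
    ```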

  18. Historical emissions of carbonaceous aerosols from biomass and fossil fuel burning for the period 1870-2000

    NASA Astrophysics Data System (ADS)

    Ito, Akinori; Penner, Joyce E.

    2005-06-01

    Historical changes in black carbon (BC) and particulate organic matter (POM) emissions from biomass burning (BB) and fossil fuel (FF) burning are estimated from 1870 to 2000. A bottom-up inventory for open vegetation (OV) burning is scaled by a top-down estimate for the year 2000. Monthly and interannual variations are derived over the period 1979-2000 based on the TOMS satellite aerosol index (AI) and this global map. Prior to 1979, emissions are scaled to a CH4 emissions inventory based on land-use change. Biofuel (BF) emissions from a recent inventory for developing countries are scaled forward and backward in time using population statistics and crop production statistics. In developed countries, wood consumption data together with emission factors for cooking and heating practices are used for the biofuel estimates. For fossil fuel use, we use fuel consumption data and specific emission factors for different fuel-use categories to develop an inventory over 1950-2000, and emissions are scaled to a CO2 inventory prior to that time. Technology changes affecting emissions from the diesel transport sector are included. During the last decade of this period, BC and POM emissions from biomass burning (i.e., OV + BF) contribute a significant share of the primary sources of BC and POM and are larger than those from FF: 59% of the NH BC emissions and 90% of the NH POM emissions are from BB in 2000. Better information on fossil fuel consumption technologies prior to 1990 is needed to improve estimates of fossil fuel emissions during the twentieth century. These results suggest that aerosol emissions from biomass burning need to be represented realistically in climate change assessments. The estimated emissions are available on a 1° × 1° grid for global climate modeling studies of climate change.

  19. Unequal-Arm Interferometry and Ranging in Space

    NASA Technical Reports Server (NTRS)

    Tinto, Massimo

    2005-01-01

    Space-borne interferometric gravitational wave detectors, sensitive in the low-frequency (millihertz) band, will fly in the next decade. In these detectors the spacecraft-to-spacecraft light-travel-times will necessarily be unequal, time-varying, and (due to aberration) different on the up- and down-links. By using knowledge of the inter-spacecraft light-travel-times and their time evolution, it is possible to cancel in post-processing the otherwise dominant laser phase noise and obtain a variety of interferometric data combinations sensitive to gravitational radiation. This technique, named Time-Delay Interferometry (TDI), can be implemented with constellations of three or more formation-flying spacecraft that coherently track each other. As an example application we consider the Laser Interferometer Space Antenna (LISA) mission and show that TDI combinations can be synthesized by properly time-shifting and linearly combining the phase measurements performed on board the three spacecraft. Since TDI exactly suppresses the laser noises when the delays coincide with the light-travel-times, we then show that TDI can also be used for estimating the time delays needed for its implementation. This is done by performing a post-processing non-linear minimization procedure, which provides an effective, powerful, and simple way of measuring the inter-spacecraft light-travel-times. This processing technique, named Time-Delay Interferometric Ranging (TDIR), is highly accurate in estimating the time delays and allows TDI to be implemented without the need for a dedicated ranging subsystem.
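
    A hedged toy version of the TDIR idea: a dominant laser phase noise appears in one stream directly and in another delayed by the light-travel-time, so minimizing the residual power of their difference over a trial delay recovers that delay (real TDI combinations and aberration are not modeled):

    ```python
    # Hedged toy TDIR sketch: phase noise p(t) enters stream y1 directly and
    # stream y2 delayed by L_true; differencing with a trial delay cancels the
    # noise only near the true delay, so the residual-power minimum locates it.
    import numpy as np

    rng = np.random.default_rng(7)
    fs, L_true = 10.0, 16.7                 # Hz; seconds (~5e9 m / c)
    t = np.arange(0, 2000, 1 / fs)
    p = np.cumsum(rng.normal(size=t.size))  # random-walk laser phase noise
    y1 = p + 0.01 * rng.normal(size=t.size)
    y2 = np.interp(t - L_true, t, p) + 0.01 * rng.normal(size=t.size)

    def residual_power(L):
        # skip the first samples to avoid interpolation edge effects
        return np.var((y2 - np.interp(t - L, t, y1))[500:])

    trials = np.arange(10.0, 25.0, 0.01)
    L_hat = trials[np.argmin([residual_power(L) for L in trials])]
    print(f"estimated delay: {L_hat:.2f} s (true {L_true} s)")
    ```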

  20. Value of Information Analysis for Time-lapse Seismic Data by Simulation-Regression

    NASA Astrophysics Data System (ADS)

    Dutta, G.; Mukerji, T.; Eidsvik, J.

    2016-12-01

    A novel method to estimate the Value of Information (VOI) of time-lapse seismic data in the context of reservoir development is proposed. VOI is a decision analytic metric quantifying the incremental value that would be created by collecting information prior to making a decision under uncertainty. The VOI has to be computed before collecting the information and can be used to justify its collection. Previous work on estimating the VOI of geophysical data has involved explicit approximation of the posterior distribution of reservoir properties given the data and then evaluating the prospect values for that posterior distribution of reservoir properties. Here, we propose to directly estimate the prospect values given the data by building a statistical relationship between them using regression. Various regression techniques such as Partial Least Squares Regression (PLSR), Multivariate Adaptive Regression Splines (MARS) and k-Nearest Neighbors (k-NN) are used to estimate the VOI, and the results compared. For a univariate Gaussian case, the VOI obtained from simulation-regression has been shown to be close to the analytical solution. Estimating VOI by simulation-regression is much less computationally expensive since the posterior distribution of reservoir properties given each possible dataset need not be modeled and the prospect values need not be evaluated for each such posterior distribution of reservoir properties. This method is flexible, since it does not require rigid model specification of posterior but rather fits conditional expectations non-parametrically from samples of values and data.
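
    A minimal sketch of VOI by simulation-regression under a deliberately simple synthetic prior (one uncertain property, a k-NN regression standing in for the PLSR/MARS/k-NN comparison, and a do-nothing alternative worth 0):

    ```python
    # Hedged toy version of VOI by simulation-regression: draw prior samples of
    # a reservoir property, compute the prospect value and a noisy "data"
    # summary for each, regress value on data, and compare the expected value
    # of deciding with vs without the data. All numbers are synthetic.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(8)
    n = 20000
    porosity = rng.normal(0.20, 0.05, size=n)        # uncertain property
    value = 1e6 * (porosity - 0.18)                  # prospect value, may be < 0
    data = porosity + rng.normal(0, 0.02, size=n)    # noisy geophysical summary

    # prior value (decide now): take the project only if expected value > 0
    pv = max(value.mean(), 0.0)

    # posterior value (decide after data): regression approximates E[value|data]
    cond_mean = KNeighborsRegressor(n_neighbors=200).fit(
        data.reshape(-1, 1), value)
    pov = np.maximum(cond_mean.predict(data.reshape(-1, 1)), 0.0).mean()

    print(f"VOI ~ {pov - pv:,.0f} monetary units")   # nonnegative by construction
    ```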

  1. Real-time soft tissue motion estimation for lung tumors during radiotherapy delivery.

    PubMed

    Rottmann, Joerg; Keall, Paul; Berbeco, Ross

    2013-09-01

    To provide real-time lung tumor motion estimation during radiotherapy treatment delivery without the need for implanted fiducial markers or additional imaging dose to the patient. 2D radiographs from the therapy beam's-eye-view (BEV) perspective are captured at a frame rate of 12.8 Hz with a frame grabber allowing direct RAM access to the image buffer. An in-house developed real-time soft tissue localization algorithm is utilized to calculate soft tissue displacement from these images in real-time. The system is tested with a Varian TX linear accelerator and an AS-1000 amorphous silicon electronic portal imaging device operating at a resolution of 512 × 384 pixels. The accuracy of the motion estimation is verified with a dynamic motion phantom. Clinical accuracy was tested on lung SBRT images acquired at 2 fps. Real-time lung tumor motion estimation from BEV images without fiducial markers is successfully demonstrated. For the phantom study, a mean tracking error <1.0 mm [root mean square (rms) error of 0.3 mm] was observed. The tracking rms accuracy on BEV images from a lung SBRT patient (≈20 mm tumor motion range) is 1.0 mm. The authors demonstrate for the first time real-time markerless lung tumor motion estimation from BEV images alone. The described system can operate at a frame rate of 12.8 Hz and does not require prior knowledge to establish traceable landmarks for tracking on the fly. The authors show that the geometric accuracy is similar to (or better than) previously published markerless algorithms not operating in real-time.
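
    As a hedged sketch in the spirit of markerless template tracking (the authors' in-house algorithm is not reproduced here), normalized cross-correlation of a reference template against each new frame yields the soft-tissue displacement:

    ```python
    # Hedged sketch: a template cut from a reference frame is matched to each
    # new frame by normalized cross-correlation; the best-scoring offset gives
    # the displacement. Synthetic data; not the paper's algorithm.
    import numpy as np

    def ncc_match(frame, template):
        """Return (row, col) of the best normalized cross-correlation score."""
        th, tw = template.shape
        tz = (template - template.mean()) / template.std()
        best, pos = -2.0, (0, 0)
        for r in range(frame.shape[0] - th + 1):
            for c in range(frame.shape[1] - tw + 1):
                patch = frame[r:r + th, c:c + tw]
                pz = (patch - patch.mean()) / (patch.std() + 1e-12)
                score = (tz * pz).mean()
                if score > best:
                    best, pos = score, (r, c)
        return pos

    rng = np.random.default_rng(9)
    frame0 = rng.normal(size=(64, 64))
    template = frame0[20:36, 20:36].copy()
    frame1 = np.roll(frame0, shift=(5, -3), axis=(0, 1))  # moved 5 down, 3 left

    r, c = ncc_match(frame1, template)
    print("displacement:", (r - 20, c - 20))              # approximately (5, -3)
    ```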

  2. A MODIS direct broadcast algorithm for mapping wildfire burned area in the western United States

    Treesearch

    S. P. Urbanski; J. M. Salmon; B. L. Nordgren; W. M. Hao

    2009-01-01

    Improved wildland fire emission inventory methods are needed to support air quality forecasting and guide the development of air shed management strategies. Air quality forecasting requires dynamic fire emission estimates that are generated in a timely manner to support real-time operations. In the regulatory and planning realm, emission inventories are essential for...

  3. Monitoring and Modeling Performance of Communications in Computational Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael A.; Le, Thuy T.

    2003-01-01

    Computational grids may include many machines located at a number of sites. For efficient use of the grid, we need the ability to estimate the time it takes to communicate data between the machines. For dynamic distributed grids it is unrealistic to know the exact parameters of the communication hardware and the current communication traffic, so we must rely on a model of network performance to estimate the message delivery time. Our approach to constructing such a model is based on observing message delivery times across various message sizes and time scales. We record these observations in a database and use them to build a model of the message delivery time. Our experiments show the presence of multiple bands in the logarithm of the message delivery times. These bands represent the multiple paths messages travel between the grid machines and are incorporated in our multiband model.
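
    As a minimal sketch of building such a performance model from observations, the code below fits the standard latency-plus-bandwidth model t = L + s/B to recorded (size, delivery time) pairs by least squares; the multiband model described above effectively fits a relation like this per band observed in the logarithm of the delivery times. All timings are invented.

    ```python
    import numpy as np

    # Observed (message size, delivery time) pairs; values invented.
    sizes = np.array([1e3, 1e4, 1e5, 1e6, 1e7])                 # bytes
    times = np.array([0.9e-3, 1.1e-3, 2.4e-3, 14e-3, 130e-3])   # seconds

    # Fit t = latency + size / bandwidth by linear least squares.
    A = np.column_stack([np.ones_like(sizes), sizes])
    (latency, inv_bw), *_ = np.linalg.lstsq(A, times, rcond=None)
    print(f"latency ≈ {latency * 1e3:.2f} ms, bandwidth ≈ {1 / inv_bw / 1e6:.1f} MB/s")
    ```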

  4. A threshold-free summary index of prediction accuracy for censored time to event data.

    PubMed

    Yuan, Yan; Zhou, Qian M; Li, Bingying; Cai, Hengrui; Chow, Eric J; Armstrong, Gregory T

    2018-05-10

    Prediction performance of a risk scoring system needs to be carefully assessed before its adoption in clinical practice. Clinical preventive care often uses risk scores to screen asymptomatic populations. The primary clinical interest is to predict the risk of having an event by a prespecified future time t0. Accuracy measures such as positive predictive values have been recommended for evaluating predictive performance. However, for commonly used continuous or ordinal risk score systems, these measures require a subjective cutoff threshold value that dichotomizes the risk scores. The need for a cutoff value has created barriers for practitioners and researchers. In this paper, we propose a threshold-free summary index of positive predictive values that accommodates time-dependent event status and competing risks. We develop a nonparametric estimator and provide an inference procedure for comparing this summary measure between 2 risk scores for censored time-to-event data. We conduct a simulation study to examine the finite-sample performance of the proposed estimation and inference procedures. Lastly, we illustrate the use of this measure on a real data example, comparing 2 risk score systems for predicting heart failure in childhood cancer survivors. Copyright © 2018 John Wiley & Sons, Ltd.

  5. Soil hydraulic properties estimate based on numerical analysis of disc infiltrometer three-dimensional infiltration curve

    NASA Astrophysics Data System (ADS)

    Latorre, Borja; Peña-Sancho, Carolina; Angulo-Jaramillo, Rafaël; Moret-Fernández, David

    2015-04-01

    Measurement of soil hydraulic properties is of paramount importance in fields such as agronomy, hydrology and soil science. Based on analysis of the Haverkamp et al. (1994) model, the aim of this paper is to present a technique to estimate the soil hydraulic properties (sorptivity, S, and hydraulic conductivity, K) from full-time cumulative infiltration curves. The method (NSH) was validated by means of 12 synthetic infiltration curves generated with HYDRUS-3D from known soil hydraulic properties. The K values used to simulate the synthetic curves were compared to those estimated with the proposed method. A procedure to identify and remove the effect of the contact sand layer on the cumulative infiltration curve was also developed. A sensitivity analysis was performed using the water level measurement as the uncertainty source. Finally, the procedure was evaluated using different infiltration times and data noise. Since a good correlation between the K used in HYDRUS-3D to model the infiltration curves and that estimated by the NSH method was obtained (R2 = 0.98), it can be concluded that this technique is robust enough to estimate the soil hydraulic conductivity from complete infiltration curves. The numerical procedure to detect and remove the influence of the contact sand layer on the K and S estimates proved robust and efficient. An effect of infiltration-curve noise on the K estimate was observed, with uncertainty increasing with increasing noise. Finally, the results showed that infiltration time is an important factor in estimating K: lower values of K, or smaller uncertainty, required longer infiltration times.
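
    The NSH procedure is built on the Haverkamp et al. (1994) model; as a simplified stand-in, the sketch below fits Philip's two-term equation I(t) = S*sqrt(t) + A*t to a cumulative infiltration curve by linear least squares, where the transient coefficient A is related to K. The curve is synthetic and the units are assumptions.

    ```python
    import numpy as np

    t = np.linspace(10.0, 3600.0, 50)                 # time since start, s
    I = 0.002 * np.sqrt(t) + 1e-5 * t                 # synthetic cumulative infiltration, m
    I += np.random.default_rng(1).normal(0.0, 2e-5, t.size)   # measurement noise

    # I(t) = S*sqrt(t) + A*t is linear in (S, A): solve by least squares.
    X = np.column_stack([np.sqrt(t), t])
    (S, A), *_ = np.linalg.lstsq(X, I, rcond=None)
    print(f"S ≈ {S:.4g} m/s^0.5, A ≈ {A:.4g} m/s (A relates to K)")
    ```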

  6. Commuting to work: RN travel time to employment in rural and urban areas.

    PubMed

    Rosenberg, Marie-Claire; Corcoran, Sean P; Kovner, Christine; Brewer, Carol

    2011-02-01

    To investigate the variation in average daily travel time to work among registered nurses (RNs) living in urban, suburban, and rural areas. We examine how travel time varies across RN characteristics, job setting, and availability of local employment opportunities. Descriptive statistics and linear regression using a 5% sample from the 2000 Census and a longitudinal survey of newly licensed RNs (NLRN). Travel time for NLRN respondents was estimated using geographic information systems (GIS) software. In the NLRN, rural nurses and those living in small towns had significantly longer average commute times. Young married RNs and RNs with children also tended to have longer commute times, as did RNs employed by hospitals. The findings indicate that travel time to work varies significantly across locale types. Further research is needed to understand whether and to what extent lengthy commute times impact RN workforce needs in rural and urban areas.

  7. A 3D approximate maximum likelihood localization solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-09-23

    A robust three-dimensional solver was needed to accurately and efficiently estimate the time sequence of locations of fish tagged with acoustic transmitters and vocalizing marine mammals to describe in sufficient detail the information needed to assess the function of dam-passage design alternatives and support Marine Renewable Energy. An approximate maximum likelihood solver was developed using measurements of time difference of arrival from all hydrophones in receiving arrays on which a transmission was detected. Field experiments demonstrated that the developed solver performed significantly better in tracking efficiency and accuracy than other solvers described in the literature.
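
    A minimal sketch of the core computation, assuming known hydrophone positions and sound speed: solve for the source position whose modeled time differences of arrival (TDOAs) best match the measured ones. The deployed solver adds approximate maximum likelihood weighting and robustness logic not shown here.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    C = 1500.0                                   # sound speed in water, m/s (assumed)
    hydro = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 5.0],
                      [0.0, 50.0, 5.0], [50.0, 50.0, 0.0]])  # receiver positions, m

    def residuals(x, tdoas):
        """Modeled minus measured TDOAs, relative to hydrophone 0."""
        ranges = np.linalg.norm(hydro - x, axis=1)
        return (ranges[1:] - ranges[0]) / C - tdoas

    true = np.array([20.0, 30.0, 40.0])
    meas = residuals(true, 0.0)                  # noise-free TDOAs for the true source
    meas += np.random.default_rng(2).normal(0.0, 1e-5, meas.size)  # timing noise
    sol = least_squares(residuals, x0=np.array([25.0, 25.0, 10.0]), args=(meas,))
    print("estimated source position:", sol.x)
    ```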

  8. Velocity gradients and reservoir volumes lessons in computational sensitivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, P.W.

    1995-12-31

    The sensitivity of reservoir volume estimation from depth-converted geophysical time maps to the velocity gradients employed is investigated through a simple model study. The computed volumes are disconcertingly sensitive to gradients, both horizontal and vertical. The need for an accurate method of time-to-depth conversion is well demonstrated by the model study, in which errors in velocity are magnified 40-fold in the computation of the volume. Thus if +/- 10% accuracy in the volume is desired, we must be able to estimate the velocity at the water contact with 0.25% accuracy. Put another way, if the velocity is 8000 feet per second at the well, then we have only +/- 20 feet per second leeway in estimating the velocity at the water contact. Very moderate horizontal and vertical gradients would typically indicate a velocity change of a few hundred feet per second if they are in the same direction. Clearly the interpreter needs to be very careful. A methodology is demonstrated which takes into account all the information that is available: velocities, tops, depositional and lithologic spatial patterns, and common sense. It is assumed that, through appropriate use of check shot and other time-depth information, the interpreter has correctly tied the reflection picks to the well tops. Such ties are ordinarily too soft for direct time-depth conversion to give adequate depth ties. The proposed method uses a common compaction law as its basis and incorporates time picks, tops, and stratigraphic maps into the depth conversion process. The resulting depth map ties the known well tops in an optimum fashion.

  9. Estimates and implications of the costs of compliance with biosafety regulations in developing countries.

    PubMed

    Falck-Zepeda, Jose; Yorobe, Jose; Husin, Bahagiawati Amir; Manalo, Abraham; Lokollo, Erna; Ramon, Godfrey; Zambrano, Patricia; Sutrisno

    2012-01-01

    Estimating the cost of compliance with biosafety regulations is important as it helps developers focus their investments in product development. We provide estimates of the cost of compliance for a set of technologies in Indonesia, the Philippines and other countries. These costs vary from US$100,000 to US$1.7 million. These are estimates of regulatory costs and do not include product development or deployment costs. Cost estimates need to be compared with the potential gains when the technology is introduced in these countries and the gains in knowledge accumulated during the biosafety assessment process. Although the cost of compliance is important, time delays and uncertainty are even more important and may have an adverse impact on innovations reaching farmers.

  10. County-level estimates of nitrogen and phosphorus from animal manure for the conterminous United States, 2007 and 2012

    USGS Publications Warehouse

    Gronberg, JoAnn M.; Arnold, Terri L.

    2017-03-24

    County-level estimates of nitrogen and phosphorus inputs from animal manure for the conterminous United States were calculated from animal population inventories in the 2007 and 2012 Census of Agriculture, using previously published methods. These estimates of non-point nitrogen and phosphorus inputs from animal manure were compiled in support of the U.S. Geological Survey's National Water-Quality Assessment Project of the National Water Quality Program and are needed to support national-scale investigations of stream and groundwater quality. The estimates published in this report are comparable with earlier estimates, so the two can be compared to show changes in nitrogen and phosphorus inputs from manure over time.

  11. Estimation of gastric emptying time (GET) in clownfish (Amphiprion ocellaris) using X-radiography technique

    NASA Astrophysics Data System (ADS)

    Ling, Khoo Mei; Ghaffar, Mazlan Abd.

    2014-09-01

    This study examines the movement of a food item and the estimation of gastric emptying time using the X-radiography technique in the clownfish (Amphiprion ocellaris) fed in captivity. Fishes were voluntarily fed to satiation after being deprived of food for 72 hours, using pellets labelled with barium sulphate (BaSO4). The movement of the food item was monitored at different times after feeding. A total of 36 hours was needed for the food items to be evacuated completely from the stomach. Results on the modeling of meal satiation are also discussed. The relationship between satiation meal size and body weight was allometric, with a power value of 1.28.

  12. Survival of female northern pintails wintering in southwestern Louisiana

    USGS Publications Warehouse

    Cox, R.R.; Afton, A.D.; Pace, R. M.

    1998-01-01

    The North American breeding population of northern pintails (Anas acuta) has reached previously unprecedented low numbers 4 times since 1983. Because pintails show high fidelity to wintering areas, regional survival estimates and identification of factors influencing survival are needed to guide management of wintering pintails. We used radiotelemetry to estimate survival rates of female pintails wintering in southwestern Louisiana. We tested for variation in survival and hunting mortality rates in relation to age (immature or adult), winter (1990-91, 1991-92, 1992-93), time period (prehunting season, first hunting season, time between split hunting seasons, second hunting season, posthunting season), body condition (body mass when released, adjusted for body size), and region (southwestern Louisiana or elsewhere on the Texas-Louisiana Gulf Coast or Mississippi Alluvial Valley).

  13. Estimation of gastric emptying time (GET) in clownfish (Amphiprion ocellaris) using X-radiography technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ling, Khoo Mei; Ghaffar, Mazlan Abd.

    2014-09-03

    This study examines the movement of a food item and the estimation of gastric emptying time using the X-radiography technique in the clownfish (Amphiprion ocellaris) fed in captivity. Fishes were voluntarily fed to satiation after being deprived of food for 72 hours, using pellets labelled with barium sulphate (BaSO4). The movement of the food item was monitored at different times after feeding. A total of 36 hours was needed for the food items to be evacuated completely from the stomach. Results on the modeling of meal satiation are also discussed. The relationship between satiation meal size and body weight was allometric, with a power value of 1.28.

  14. A Computer for Low Context-Switch Time

    DTIC Science & Technology

    1990-03-01

    Results To find out how an implementation performs, we use a set of programs that make up a simulation system. These programs compile C language programs ...have worse relative context-switch performance: the time needed to switch contexts has not de- creased as much as the time to run programs . Much of...this study is: How seriously is throughput performance im- paired by this approach to computer architecture? Reasonable estimates are possible only

  15. Applicability of Deep-Learning Technology for Relative Object-Based Navigation

    DTIC Science & Technology

    2017-09-01

    burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing...possible selections for navigating an unmanned ground vehicle (UGV) is through real-time visual odometry. To navigate in such an environment, the UGV needs to be able to detect, identify, and relate the static

  16. Equipment for the Transient Capture of Chaotic Microwave Signals

    DTIC Science & Technology

    2017-09-14

    estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the... times are needed and over-sampling by a factor of 8 is required so that the effective number of bits can be increased from the actual bit resolution... time acquisition of transient signals with analog bandwidths up to 70 GHz for one channel, and 30 GHz for two channels. Training Opportunities

  17. Estimating the value of life and injury for pedestrians using a stated preference framework.

    PubMed

    Niroomand, Naghmeh; Jenkins, Glenn P

    2017-09-01

    The incidence of pedestrian death in North Cyprus over the period 2010 to 2014, per 100,000 population, is about 2.5 times that of the EU, with 10.5 times more pedestrian road injuries than deaths. With the prospect of North Cyprus entering the EU, many investments need to be undertaken to improve road safety in order to reach EU benchmarks. We conducted a stated choice experiment to identify the preferences and tradeoffs of pedestrians in North Cyprus for improved walking times, pedestrian costs, and safety. The choice of route was examined using mixed logit models to obtain the marginal utilities associated with each attribute of the routes that consumers chose. These were used to estimate individuals' willingness to pay (WTP) to save walking time and to avoid pedestrian fatalities and injuries. We then used the results to obtain community-wide estimates of the value of a statistical life (VSL) saved, the value of an injury (VI) prevented, and the value per hour of walking time saved. The estimate of the VSL was €699,434 and the estimate of the VI was €20,077. These values are consistent, after adjusting for differences in incomes, with the median results of similar studies done for EU countries. The estimated value of time to pedestrians is €7.20 per person-hour. The ratio of deaths to injuries is much higher for pedestrians than for road accidents overall, which is consistent with the higher estimated WTP to avoid a pedestrian accident than to avoid a car accident. The value of time of €7.20 is quite high relative to the wages earned. The findings provide a set of information on the value of risk reduction (VRR) for fatalities and injuries and the value of pedestrian time that is critical for conducting ex ante appraisals of investments to improve pedestrian safety. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
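
    A minimal sketch of how choice-model coefficients translate into money metrics: each WTP measure is the ratio of an attribute's marginal utility to the marginal utility of cost. The coefficients below are invented, scaled only so that the outputs land near the values reported above.

    ```python
    # Marginal utilities from a route-choice model (invented values).
    b_cost = -0.045    # utility per euro of route cost
    b_time = -0.32     # utility per hour of walking time
    b_risk = -31.5     # utility per fatality per 1,000 pedestrians (hypothetical scaling)

    value_of_time = b_time / b_cost      # euros per hour saved
    vsl = b_risk / b_cost * 1000         # euros per statistical life
    print(f"value of time ≈ €{value_of_time:.2f}/h, VSL ≈ €{vsl:,.0f}")
    ```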

  18. Modeling in Real Time During the Ebola Response.

    PubMed

    Meltzer, Martin I; Santibanez, Scott; Fischer, Leah S; Merlin, Toby L; Adhikari, Bishwa B; Atkins, Charisma Y; Campbell, Caresse; Fung, Isaac Chun-Hai; Gambhir, Manoj; Gift, Thomas; Greening, Bradford; Gu, Weidong; Jacobson, Evin U; Kahn, Emily B; Carias, Cristina; Nerlander, Lina; Rainisch, Gabriel; Shankar, Manjunath; Wong, Karen; Washington, Michael L

    2016-07-08

    To aid decision-making during CDC's response to the 2014-2016 Ebola virus disease (Ebola) epidemic in West Africa, CDC activated a Modeling Task Force to generate estimates on various topics related to the response in West Africa and the risk for importation of cases into the United States. Analysis of eight Ebola response modeling projects conducted during August 2014-July 2015 provided insight into the types of questions addressed by modeling, the impact of the estimates generated, and the difficulties encountered during the modeling. This time frame was selected to cover the three phases of the West African epidemic curve. Questions posed to the Modeling Task Force changed as the epidemic progressed. Initially, the task force was asked to estimate the number of cases that might occur if no interventions were implemented compared with cases that might occur if interventions were implemented; however, at the peak of the epidemic, the focus shifted to estimating resource needs for Ebola treatment units. Then, as the epidemic decelerated, requests for modeling changed to generating estimates of the potential number of sexually transmitted Ebola cases. Modeling to provide information for decision-making during the CDC Ebola response involved limited data, a short turnaround time, and difficulty communicating the modeling process, including assumptions and interpretation of results. Despite these challenges, modeling yielded estimates and projections that public health officials used to make key decisions regarding response strategy and resources required. The impact of modeling during the Ebola response demonstrates the usefulness of modeling in future responses, particularly in the early stages and when data are scarce. Future modeling can be enhanced by planning ahead for data needs and data sharing, and by open communication among modelers, scientists, and others to ensure that modeling and its limitations are more clearly understood. The activities summarized in this report would not have been possible without collaboration with many U.S. and international partners (http://www.cdc.gov/vhf/ebola/outbreaks/2014-west-africa/partners.html).

  19. Results of using the global positioning system to maintain the time and frequency synchronization in the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Clements, P. A.; Kirk, A.; Unglaub, R.

    1987-01-01

    There are two hydrogen maser clocks located at each signal processing center (SPC) in the DSN. Close coordination of the time and frequency of the SPC clocks is needed to navigate spacecraft to the outer planets. A recent example was the Voyager spacecraft's encounter with Uranus in January 1986. The clocks were adjusted with the goal of minimizing time and frequency offsets between the SPCs at encounter. How time and frequency at each SPC is estimated using data acquired from the Global Positioning System Timing Receivers operating on the NBS-BIH (National Bureau of Standards-Bureau International de l'Heure) tracking schedule is described. These data are combined with other available timing receiver data to calculate the time offset estimates. The adjustment of the clocks is described. It was determined that long range hydrogen maser drift is quite predictable and adjustable within limits. This enables one to minimize time and frequency differences between the three SPCs for many months by matching the drift rates of the three standards. Data acquisition and processing techniques using a Kalman filter to make estimates of time and frequency offsets between the clocks at the SPCs and UTC(NBS) (Coordinated Universal Time realized at NBS) are described.
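
    A bare-bones sketch of the kind of filter described, assuming daily GPS-derived measurements of a station clock's time offset and a two-state model (time offset and frequency offset). All noise levels and initial conditions are assumptions.

    ```python
    import numpy as np

    dt = 86400.0                                    # one day between measurements, s
    F = np.array([[1.0, dt], [0.0, 1.0]])           # state: [time offset (s), rate (s/s)]
    H = np.array([[1.0, 0.0]])                      # only the time offset is measured
    Q = np.diag([1e-18, 1e-26])                     # assumed process noise
    R = np.array([[(10e-9) ** 2]])                  # ~10 ns measurement noise (assumed)

    x, P = np.zeros(2), np.eye(2) * 1e-12           # initial state and covariance
    truth = np.array([50e-9, 1e-14])                # true offset and rate (invented)
    rng = np.random.default_rng(3)
    for _ in range(30):                             # 30 days of daily measurements
        truth = F @ truth
        z = truth[0] + rng.normal(0.0, 10e-9)       # GPS-derived offset measurement
        x, P = F @ x, F @ P @ F.T + Q               # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([z]) - H @ x)         # update
        P = (np.eye(2) - K @ H) @ P
    print(f"offset ≈ {x[0] * 1e9:.1f} ns, rate ≈ {x[1]:.2e} s/s")
    ```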

  20. Optimal design of clinical trials with biologics using dose-time-response models.

    PubMed

    Lange, Markus R; Schmidli, Heinz

    2014-12-30

    Biologics, in particular monoclonal antibodies, are important therapies in serious diseases such as cancer, psoriasis, multiple sclerosis, or rheumatoid arthritis. While most conventional drugs are given daily, the effect of monoclonal antibodies often lasts for months, and hence, these biologics require less frequent dosing. A good understanding of the time-changing effect of the biologic for different doses is needed to determine both an adequate dose and an appropriate time-interval between doses. Clinical trials provide data to estimate the dose-time-response relationship with semi-mechanistic nonlinear regression models. We investigate how to best choose the doses and corresponding sample size allocations in such clinical trials, so that the nonlinear dose-time-response model can be precisely estimated. We consider both local and conservative Bayesian D-optimality criteria for the design of clinical trials with biologics. For determining the optimal designs, computer-intensive numerical methods are needed, and we focus here on the particle swarm optimization algorithm. This metaheuristic optimizer has been successfully used in various areas but has only recently been applied in the optimal design context. The equivalence theorem is used to verify the optimality of the designs. The methodology is illustrated based on results from a clinical study in patients with gout, treated by a monoclonal antibody. Copyright © 2014 John Wiley & Sons, Ltd.
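
    A toy version of the design search, not the paper's dose-time-response model: particle swarm optimization maximizing the log-determinant of the Fisher information (local D-optimality) for a simple two-parameter curve eta(t) = a*exp(-b*t) at guessed parameter values. For this model the search should settle near t = 0 and t = 1/b.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    a, b = 1.0, 0.1                                   # guessed model parameters

    def neg_log_det(times):
        """Negative log det of the Fisher information for an equal-weight design."""
        t = np.clip(times, 0.0, 50.0)
        f = np.stack([np.exp(-b * t), -a * t * np.exp(-b * t)])   # gradients wrt (a, b)
        sign, logdet = np.linalg.slogdet(f @ f.T / t.size)
        return np.inf if sign <= 0 else -logdet

    # Standard global-best particle swarm optimization over two sampling times.
    n_part, dim = 30, 2
    x = rng.uniform(0.0, 50.0, (n_part, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([neg_log_det(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(200):
        r1, r2 = rng.random((2, n_part, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        fx = np.array([neg_log_det(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[np.argmin(pbest_f)]
    print("optimal sampling times ≈", np.sort(np.clip(gbest, 0.0, 50.0)))
    ```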

  1. Investigation of the implementation of a probe-vehicle based pavement roughness estimation system.

    DOT National Transportation Integrated Search

    2011-08-01

    As roadway systems age and maintenance budgets shrink, a need emerges for timely and accurate roughness data for pavement maintenance decision-making. The Virginia Department of Transportation (VDOT) maintains the third-largest state network of roadways in Am...

  2. Evaluation of consolidation characteristics of cohesive soils from piezocone penetration tests : technical summary.

    DOT National Transportation Integrated Search

    2004-07-01

    The main objective of this study was to evaluate the current interpretation methods for their capability to reasonably predict the consolidation parameters needed to estimate the magnitude and time rate of consolidation settlement in fine-grained soi...

  3. Estimating the coverage of mental health programmes: a systematic review

    PubMed Central

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-01-01

    Background The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and changes in coverage over time, are needed. Methods Systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. Results We identified 15 735 unique records, of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Conclusions Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys. PMID:24760874
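
    A minimal sketch of the low-cost approach suggested in the conclusions, combining routine programme data with a survey-based prevalence estimate to obtain contact coverage; all numbers are invented.

    ```python
    population = 250_000    # adults in the programme's catchment area (invented)
    prevalence = 0.04       # 12-month prevalence from a national survey (invented)
    treated = 1_800         # unique service users from routine records (invented)

    in_need = population * prevalence
    contact_coverage = treated / in_need
    print(f"contact coverage ≈ {contact_coverage:.1%}")   # ≈ 18.0%
    ```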

  4. Afghan National Police: More than $300 Million in Annual, U.S.-funded Salary Payments Is Based on Partially Verified or Reconciled Data

    DTIC Science & Technology

    2015-01-01

    Data SIGAR JANUARY 2015 Report Documentation Page Form Approved OMB No. 0704-0188 Public reporting burden for the collection of information is...estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data...needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this

  5. MIZMAS: Modeling the Evolution of Ice Thickness and Floe Size Distributions in the Marginal Ice Zone of the Chukchi and Beaufort Seas

    DTIC Science & Technology

    2013-09-30

    is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of... downscaling future projection simulations. APPROACH To address the scientific objectives, we plan to develop, implement, and validate a new

  6. Retrospective Evaluation of the Protocol for US Army Corps of Engineers Aquatic Ecosystem Restoration Projects. Part 2. Database Content and Data Entry Guidelines

    DTIC Science & Technology

    2014-01-01

    entry and review procedures; (2) explain the various database components; (3) outline included datafields and datasets; and (4) document the...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or

  7. Further Improvements Needed in Navy’s Oversight and Management of Contracting for Facilities Construction on Diego Garcia.

    DTIC Science & Technology

    1984-05-23

    Because the cost accounting reports provide the historical cost information for the cost estimating reports, we also tested the reasonableness of... accounting and cost estimating reports must be based on timely and accurate infor- mation. The reports, therefore, require the continual attention of... accounting system reported less than half the value of site direct charges (labor, materials, equipment usage, and other costs ) that should have been

  8. Marsh birds and the North American Breeding Bird Survey: judging the value of a landscape level survey for habitat specialist species with low detection rates

    USGS Publications Warehouse

    Sauer, J.R.

    1999-01-01

    The North American Breeding Bird Survey was started in 1966, and provides information on population change for >400 species of birds. It covers the continental United States, Canada, and Alaska, and is conducted once each year, in June, by volunteer observers. A 39.4 km roadside survey route is driven starting 30 min before sunrise, and a 3 min point count is conducted at each of 50 stops spaced every 0.8 km. Existing analyses of the data are internet-based (http://www.mbr-pwrc.usgs.gov/bbs/bbs.html), and include maps of relative abundance, estimates of population change including trends (%/yr), composite annual indices (pattern in time), and maps of population trend (pattern in space). At least 36 species of marsh birds are encountered on the BBS, and the survey provides estimates with greatly varying levels of efficiency for these species. It is often difficult to understand how well the BBS surveys a species. Often, efficiency is judged by estimating trend and its variance for a species, then by calculating power and needed samples to detect a prespecified trend over some time period (e.g., a 2%/yr trend over 31 yr). Unfortunately, this approach is not always valid, as estimated trends and variances can be of little use if the population is poorly sampled. Lurking concerns with BBS data include (1) incomplete coverage of species range; (2) undersampling of habitats; and (3) low and variable visibility of birds during point counts. It is difficult to evaluate these concerns, because known populations do not exist for comparison with counts, and detection rates are time-consuming and costly to estimate. I evaluated the efficiency of the BBS for selected rails (Rallidae) and snipes (Scolopacidae), presenting estimates of population trend over 1966-1996 (T), power to detect a 2%/yr trend over 31 yr, needed samples to achieve power of 0.75 with alpha = 0.1, number of survey routes with data for the species (N), average abundance on survey routes (RA), and maps of relative abundance. Examples include Yellow Rail (Coturnicops noveboracensis) (T = 12%/yr; P = 0.0085; N = 28 routes; RA = 0.05; Power = 0.37; Needed samples = 85), Black Rail (Laterallus jamaicensis) (no trend data or power information available, N = 8), Clapper Rail (Rallus longirostris) (T = 1.9%/yr; P = 0.55; N = 64; RA = 0.31; Power = 0.35; Needed samples = 590), King Rail (Rallus elegans) (T = -4.2%/yr; P = 0.03; N = 76; Power = 0.41; Needed samples = 159), Sora (Porzana carolina) (T = 0.98%/yr; P = 0.24; N = 720; RA = 0.92; Power = 0.69; Needed samples = 377), and Common Snipe (Gallinago gallinago) (T = -0.24%/yr; P = 0.54; N = 1412; RA = 2.19; Power = 0.98; Needed samples = 205). With regard to quality of BBS data, marsh birds fall into 3 categories: (1) almost never encountered on BBS routes; (2) encountered at extremely low abundances on BBS routes; and (3) probably fairly well sampled by BBS roadside counts. BBS data can provide useful information for many marsh bird species, but users should be aware of the limitations of the BBS sample for monitoring species that have low visibility from point counts and prefer habitats not often encountered along roadsides.

  9. Estimating the need for dental sedation. 2. Using IOSN as a health needs assessment tool.

    PubMed

    Pretty, I A; Goodwin, M; Coulthard, P; Bridgman, C M; Gough, L; Jenner, T; Sharif, M O

    2011-09-09

    This service evaluation assessed the need for sedation in a population of dental attenders (n = 607) in the North West of England. Using the novel IOSN tool, three clinical domains of sedation need were assessed: treatment complexity, medical and behavioural indicators, and patient-reported anxiety using the Modified Dental Anxiety Scale. The findings suggest that 5% of the population are likely to require a course of treatment under sedation at some time. All three clinical domains contributed to the IOSN score and indication of treatment need. Females were 3.8 times more likely than males to be placed within the high-need-for-sedation group. Factors such as age, deprivation and practice location were not associated with the need for sedation. Primary care trusts (PCTs) need health needs assessment data in order to commission effectively and in line with World Class Commissioning guidelines. This study provides both an indicative figure of need and a tool by which individual PCTs can undertake local health needs assessment work. Caution should be taken in treating the figure as the total need within a population, as the study included only those patients who attended dental practices.

  10. Summary of: estimating the need for dental sedation. 2. Using IOSN as a health needs assessment tool.

    PubMed

    Newton, T

    2011-09-09

    This service evaluation assessed the need for sedation in a population of dental attenders (n = 607) in the North West of England. Using the novel IOSN tool, three clinical domains of sedation need were assessed: treatment complexity, medical and behavioural indicators, and patient-reported anxiety using the Modified Dental Anxiety Scale. The findings suggest that 5% of the population are likely to require a course of treatment under sedation at some time. All three clinical domains contributed to the IOSN score and indication of treatment need. Females were 3.8 times more likely than males to be placed within the high-need-for-sedation group. Factors such as age, deprivation and practice location were not associated with the need for sedation. Primary care trusts (PCTs) need health needs assessment data in order to commission effectively and in line with World Class Commissioning guidelines. This study provides both an indicative figure of need and a tool by which individual PCTs can undertake local health needs assessment work. Caution should be taken in treating the figure as the total need within a population, as the study included only those patients who attended dental practices.

  11. Incorporating availability for detection in estimates of bird abundance

    USGS Publications Warehouse

    Diefenbach, D.R.; Marshall, M.R.; Mattice, J.A.; Brauning, D.W.

    2007-01-01

    Several bird-survey methods have been proposed that provide an estimated detection probability so that bird-count statistics can be used to estimate bird abundance. However, some of these estimators adjust counts of birds observed by the probability that a bird is detected and assume that all birds are available to be detected at the time of the survey. We marked male Henslow's Sparrows (Ammodramus henslowii) and Grasshopper Sparrows (A. savannarum) and monitored their behavior during May-July 2002 and 2003 to estimate the proportion of time they were available for detection. We found that the availability of Henslow's Sparrows declined in late June to <10% for 5- or 10-min point counts when a male had to sing and be visible to the observer; but during 20 May-19 June, males were available for detection 39.1% (SD = 27.3) of the time for 5-min point counts and 43.9% (SD = 28.9) of the time for 10-min point counts (n = 54). We detected no temporal changes in availability for Grasshopper Sparrows, but estimated availability to be much lower for 5-min point counts (10.3%, SD = 12.2) than for 10-min point counts (19.2%, SD = 22.3) when males had to be visible and sing during the sampling period (n = 80). For distance sampling, we estimated the availability of Henslow's Sparrows to be 44.2% (SD = 29.0) and the availability of Grasshopper Sparrows to be 20.6% (SD = 23.5). We show how our estimates of availability can be incorporated in the abundance and variance estimators for distance sampling and modify the abundance and variance estimators for the double-observer method. Methods that directly estimate availability from bird counts but also incorporate detection probabilities need further development and will be important for obtaining unbiased estimates of abundance for these species.
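
    A minimal sketch of folding availability into a count-based abundance estimator as proposed above: the count is divided by both the detection probability and the availability, with a delta-method variance. The availability figures echo the Henslow's Sparrow numbers in the abstract; the count, detection probability, and Poisson count-variance assumption are illustrative.

    ```python
    import numpy as np

    C, p_d, p_a = 54, 0.60, 0.391              # count, detection prob. (assumed), availability
    se_pd, se_pa = 0.05, 0.273 / np.sqrt(54)   # SEs: assumed, and SD/sqrt(n) from the abstract

    N = C / (p_d * p_a)                        # availability-adjusted abundance
    # Delta method for a quotient: squared CVs add; count term assumes Poisson (var = C).
    cv2 = 1.0 / C + (se_pd / p_d) ** 2 + (se_pa / p_a) ** 2
    se_N = N * np.sqrt(cv2)
    print(f"N ≈ {N:.0f} ± {se_N:.0f} (1 SE)")
    ```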

  12. Why are you late? Investigating the role of time management in time-based prospective memory.

    PubMed

    Waldum, Emily R; McDaniel, Mark A

    2016-08-01

    Time-based prospective memory (TBPM) tasks are those that are to be performed at a specific future time. Contrary to typical laboratory TBPM tasks (e.g., hit the Z key every 5 min), many real-world TBPM tasks require more complex time-management processes. For instance, to attend an appointment on time, one must estimate the duration of the drive to the appointment and then use this estimate to create and execute a secondary TBPM intention (e.g., "I need to start driving by 1:30 to make my 2:00 appointment on time"). Under- and overestimates of drive time can lead to inefficient TBPM performance, with the former leading to missed appointments and the latter to long stints in the waiting room. Despite the common occurrence of complex TBPM tasks in everyday life, to date, no studies have investigated how components of time management, including time estimation, affect behavior in such complex TBPM tasks. Therefore, the current study aimed to investigate timing biases in both older and younger adults and, further, to determine how such biases along with additional time-management components, including planning and plan fidelity, influence complex TBPM performance. Results suggest for the first time that younger and older adults do not always utilize similar timing strategies and, as a result, can produce differential timing biases under the exact same environmental conditions. These timing biases, in turn, play a vital role in how efficiently both younger and older adults perform a later TBPM task that requires them to utilize their earlier time estimate. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Missile Defense: Opportunities Exist to Reduce Acquisition Risk and Improve Reporting on System Capabilities

    DTIC Science & Technology

    2015-05-01

    effort on an unsound acquisition footing and pursuing a kill vehicle that may not be the best solution to meet the warfighter’s needs within cost...No. 0704-0188 Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing...instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information

  14. Real-time recognition of feedback error-related potentials during a time-estimation task.

    PubMed

    Lopez-Larraz, Eduardo; Iturrate, Iñaki; Montesano, Luis; Minguez, Javier

    2010-01-01

    Feedback error-related potentials are a promising brain process in the field of rehabilitation since they are related to human learning. Because many therapeutic strategies rely on the presentation of feedback stimuli, the potentials generated by these stimuli could be used to improve the patient's progress. In this paper we propose a method that can identify, in real-time, feedback evoked potentials in a time-estimation task. We have tested our system with five participants on two different days separated by three weeks, achieving a mean single-trial detection performance of 71.62% for real-time recognition, and 78.08% in offline classification. Additionally, an analysis of the stability of the signal between the two days is performed, suggesting that the feedback responses are stable enough to be used without needing to retrain the user.

  15. The Effects of the Introduction of Bachelor Degrees on College Enrollment and Dropout Rates

    ERIC Educational Resources Information Center

    Horstschräer, Julia; Sprietsma, Maresa

    2015-01-01

    We estimate the short-term effects of the introduction of the Bachelor degree system in Germany, a change in degree regulations such that students need less time to earn a first degree, on college enrollment and dropout rates. We use variation in the timing of the reform at the university department level to identify the effects of the reform…

  16. Using Landsat Time-Series and LiDAR to Inform Aboveground Forest Biomass Baselines in Northern Minnesota, USA

    Treesearch

    Ram K. Deo; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall; Michael J. Falkowski; Warren B. Cohen

    2017-01-01

    The publicly accessible archive of Landsat imagery and increasing regional-scale LiDAR acquisitions offer an opportunity to periodically estimate aboveground forest biomass (AGB) from 1990 to the present to align with the reporting needs of National Greenhouse Gas Inventories (NGHGIs). This study integrated Landsat time-series data, a state-wide LiDAR dataset, and a recent...

  17. Comparing methods of ploidy estimation in potato.

    USDA-ARS?s Scientific Manuscript database

    Ploidy manipulation and the resulting need for rapid ploidy screening is an important part of a potato research and breeding program. Determining ploidy by counting chromosomes or measuring DNA in individual cells is definitive, but takes time, technical skills and equipment. We tested three predi...

  18. KINETICS OF THM AND HAA PRODUCTION IN A SIMULATED DISTRIBUTION SYSTEM

    EPA Science Inventory

    Limited data exist on how the growth of halogenated disinfection by-products (DBPs) is affected by time spent in a distribution system. Such information is needed to estimate human exposures to these chemicals for both regulatory analyses and epidemiological studies. Current me...

  19. Development of a Tetrathioether (S4) Bifunctional Chelate System for Rh-105

    DTIC Science & Technology

    2012-07-01

    hour per response, including the time for reviewing instructions, searching existing data sources , gathering and maintaining the data needed, and...completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of

  20. Using ESAP Software for Predicting the Spatial Distributions of NDVI and Transpiration of Cotton

    USDA-ARS?s Scientific Manuscript database

    The normalized difference vegetation index (NDVI) has many applications in agricultural management, including monitoring real-time crop coefficients for estimating crop evapotranspiration (ET). However, frequent monitoring of NDVI as needed in such applications is generally not feasible from aerial ...

  1. Space-Time Smoothing of Complex Survey Data: Small Area Estimation for Child Mortality

    PubMed Central

    Mercer, Laina D; Wakefield, Jon; Pantazis, Athena; Lutambi, Angelina M; Masanja, Honorati; Clark, Samuel

    2016-01-01

    Many people living in low and middle-income countries are not covered by civil registration and vital statistics systems. Consequently, a wide variety of other types of data including many household sample surveys are used to estimate health and population indicators. In this paper we combine data from sample surveys and demographic surveillance systems to produce small area estimates of child mortality through time. Small area estimates are necessary to understand geographical heterogeneity in health indicators when full-coverage vital statistics are not available. For this endeavor spatio-temporal smoothing is beneficial to alleviate problems of data sparsity. The use of conventional hierarchical models requires careful thought since the survey weights may need to be considered to alleviate bias due to non-random sampling and non-response. The application that motivated this work is estimation of child mortality rates in five-year time intervals in regions of Tanzania. Data come from Demographic and Health Surveys conducted over the period 1991–2010 and two demographic surveillance system sites. We derive a variance estimator of under five years child mortality that accounts for the complex survey weighting. For our application, the hierarchical models we consider include random effects for area, time and survey and we compare models using a variety of measures including the conditional predictive ordinate (CPO). The method we propose is implemented via the fast and accurate integrated nested Laplace approximation (INLA). PMID:27468328
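
    The paper's model is fit with INLA and accounts for the survey design; as a bare-bones illustration of the shrinkage idea alone, the sketch below smooths noisy area-level logit(U5MR) estimates toward a precision-weighted mean with an empirical-Bayes normal model. All inputs are invented.

    ```python
    import numpy as np

    # Survey-weighted logit(U5MR) estimates and design-based variances per area (invented).
    logit_u5mr = np.array([-2.9, -3.3, -2.6, -3.1, -2.4])
    var_design = np.array([0.04, 0.10, 0.02, 0.15, 0.08])

    mu = np.average(logit_u5mr, weights=1.0 / var_design)        # precision-weighted mean
    tau2 = max(np.var(logit_u5mr) - var_design.mean(), 0.0)      # crude between-area variance
    shrunk = (tau2 * logit_u5mr + var_design * mu) / (tau2 + var_design)

    u5mr = 1.0 / (1.0 + np.exp(-shrunk))   # back-transform to mortality risks
    print(np.round(u5mr, 4))
    ```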

  2. Time-to-impact sensors in robot vision applications based on the near-sensor image processing concept

    NASA Astrophysics Data System (ADS)

    Åström, Anders; Forchheimer, Robert

    2012-03-01

    Based on the Near-Sensor Image Processing (NSIP) concept and recent results concerning optical flow and Time-to-Impact (TTI) computation with this architecture, we show how these results can be used and extended for robot vision applications. The first case involves estimation of the tilt of an approaching planar surface. The second case concerns the use of two NSIP cameras to estimate absolute distance and speed, similar to a stereo-matching system but without the need to do image correlations. Going back to a one-camera system, the third case deals with the problem of estimating the shape of the approaching surface. It is shown that the previously developed TTI method not only gives a very compact solution with respect to hardware complexity, but also surprisingly high performance.

  3. Empirical Evaluation of Advanced Electromagnetic Induction Systems - Factors Affecting Classification Effectiveness in Challenging Geologic Environments

    DTIC Science & Technology

    2017-02-01

    because of a variety of challenging conditions including (but not limited to) dense woods, long travel times to and from the site, and TEMTADS software...the following reasons: • Approximately 1.5 hours of travel time was needed after reaching the site each day to access the collection grids. • Daily...estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining

  4. Does China Need a String of Pearls?

    DTIC Science & Technology

    2012-09-01

    the Gwadar Hot Potato ,” Asia Times , May 28, 2011. 30 Maseeh Rahman, “Chinese plans in Seychelles revive Indian fears of encirclement,” The Guardian...idUSTRE7501U420110601 Lee, Peter. “China Drops the Gwadar Hot Potato .” Asia Times , May 28, 2011. http://www.atimes.com/atimes/China/ME28Ad01.html Lister...of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources, gathering

  5. Missing the Mark? A Two Time Point Cohort Study Estimating Intestinal Parasite Prevalence in Informal Settlements in Lima, Peru.

    PubMed

    Cooper, Michael Townsend; Searing, Rapha A; Thompson, David M; Bard, David; Carabin, Hélène; Gonzales, Carlos; Zavala, Carmen; Woodson, Kyle; Naifeh, Monique

    2017-01-01

    Objectives: The World Health Organization's (WHO) recommendations list Peru as potentially needing prevention of soil-transmitted helminthiasis (STH). Prevalence of STH varies regionally and remains understudied in the newest informal settlements of the capital city, Lima. The purpose of this study was to evaluate the need for Mass Drug Administration (MDA) of antiparasitic drugs in the newest informal settlements of Lima. The aim of this study was to estimate the season-specific prevalence of STH to determine if these prevalence estimates met the WHO threshold for MDA in 3 informal settlements. Methods: A 2 time point cohort study was conducted among a sample of 140 children aged 1 to 10 years living in 3 purposively sampled informal settlements of Lima, Peru. Children were asked to provide 2 stool samples that were analyzed with the spontaneous sedimentation in tube technique. The season-specific prevalence proportions of MDA-targeted STH were estimated using a hidden (latent) Markov modeling approach to adjust for repeated measurements over the 2 seasons and the imperfect validity of the screening tests. Results: The prevalence of MDA-targeted STH was low at 2.2% (95% confidence interval = 0.3% to 6%) and 3.8% (95% confidence interval = 0.7% to 9.3%) among children sampled in the summer and winter months, respectively, when using the most conservative estimate of test sensitivity. These estimates were below the WHO threshold for MDA (20%). Conclusions: Empiric treatment for STH by organizations active in the newest informal settlements is not supported by the data and could contribute to unnecessary medication exposures and poor allocation of resources.
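
    Not the hidden Markov model used in the study: the sketch below shows the simplest version of the same correction, the Rogan-Gladen estimator, which adjusts an apparent prevalence for an assumed test sensitivity and specificity.

    ```python
    apparent = 0.05        # proportion testing positive (invented)
    se, sp = 0.70, 0.99    # assumed sensitivity and specificity of the test

    true_prev = max((apparent + sp - 1.0) / (se + sp - 1.0), 0.0)
    print(f"adjusted prevalence ≈ {true_prev:.1%}")   # ≈ 5.8%
    ```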

  6. Missing the Mark? A Two Time Point Cohort Study Estimating Intestinal Parasite Prevalence in Informal Settlements in Lima, Peru

    PubMed Central

    Cooper, Michael Townsend; Searing, Rapha A.; Thompson, David M.; Bard, David; Carabin, Hélène; Gonzales, Carlos; Zavala, Carmen; Woodson, Kyle; Naifeh, Monique

    2017-01-01

    Objectives: The World Health Organization’s (WHO) recommendations list Peru as potentially needing prevention of soil-transmitted helminthiasis (STH). Prevalence of STH varies regionally and remains understudied in the newest informal settlements of the capital city, Lima. The purpose of this study was to evaluate the need for Mass Drug Administration (MDA) of antiparasitic drugs in the newest informal settlements of Lima. The aim of this study was to estimate the season-specific prevalence of STH to determine if these prevalence estimates met the WHO threshold for MDA in 3 informal settlements. Methods: A 2 time point cohort study was conducted among a sample of 140 children aged 1 to 10 years living in 3 purposively sampled informal settlements of Lima, Peru. Children were asked to provide 2 stool samples that were analyzed with the spontaneous sedimentation in tube technique. The season-specific prevalence proportions of MDA-targeted STH were estimated using a hidden (latent) Markov modeling approach to adjust for repeated measurements over the 2 seasons and the imperfect validity of the screening tests. Results: The prevalence of MDA targeted STH was low at 2.2% (95% confidence interval = 0.3% to 6%) and 3.8% (95% confidence interval = 0.7% to 9.3%) among children sampled in the summer and winter months, respectively, when using the most conservative estimate of test sensitivity. These estimates were below the WHO threshold for MDA (20%). Conclusions: Empiric treatment for STH by organizations active in the newest informal settlements is not supported by the data and could contribute to unnecessary medication exposures and poor allocation of resources. PMID:29152541

  7. Photochemical grid model performance with varying horizontal grid resolution and sub-grid plume treatment for the Martins Creek near-field SO2 study

    NASA Astrophysics Data System (ADS)

    Baker, Kirk R.; Hawkins, Andy; Kelly, James T.

    2014-12-01

    Near source modeling is needed to assess primary and secondary pollutant impacts from single sources and single source complexes. Source-receptor relationships need to be resolved from tens of meters to tens of kilometers. Dispersion models are typically applied for near-source primary pollutant impacts but lack complex photochemistry. Photochemical models provide a realistic chemical environment but are typically applied using grid cell sizes that may be larger than the distance between sources and receptors. It is important to understand the impacts of grid resolution and sub-grid plume treatments on photochemical modeling of near-source primary pollution gradients. Here, the CAMx photochemical grid model is applied using multiple grid resolutions and sub-grid plume treatment for SO2 and compared with a receptor mesonet largely impacted by nearby sources approximately 3-17 km away in a complex terrain environment. Measurements are compared with model estimates of SO2 at 4- and 1-km resolution, both with and without sub-grid plume treatment and inclusion of finer two-way grid nests. Annual average estimated SO2 mixing ratios are highest nearest the sources and decrease as distance from the sources increase. In general, CAMx estimates of SO2 do not compare well with the near-source observations when paired in space and time. Given the proximity of these sources and receptors, accuracy in wind vector estimation is critical for applications that pair pollutant predictions and observations in time and space. In typical permit applications, predictions and observations are not paired in time and space and the entire distributions of each are directly compared. Using this approach, model estimates using 1-km grid resolution best match the distribution of observations and are most comparable to similar studies that used dispersion and Lagrangian modeling systems. Model-estimated SO2 increases as grid cell size decreases from 4 km to 250 m. However, it is notable that the 1-km model estimates using 1-km meteorological model input are higher than the 1-km model simulation that used interpolated 4-km meteorology. The inclusion of sub-grid plume treatment did not improve model skill in predicting SO2 in time and space and generally acts to keep emitted mass aloft.

  8. Using diurnal temperature signals to infer vertical groundwater-surface water exchange

    USGS Publications Warehouse

    Irvine, Dylan J.; Briggs, Martin A.; Lautz, Laura K.; Gordon, Ryan P.; McKenzie, Jeffrey M.; Cartwright, Ian

    2017-01-01

    Heat is a powerful tracer to quantify fluid exchange between surface water and groundwater. Temperature time series can be used to estimate pore water fluid flux, and techniques can be employed to extend these estimates to produce detailed plan-view flux maps. Key advantages of heat tracing include cost-effective sensors and ease of data collection and interpretation, without the need for expensive and time-consuming laboratory analyses or induced tracers. While the collection of temperature data in saturated sediments is relatively straightforward, several factors influence the reliability of flux estimates that are based on time series analysis (diurnal signals) of recorded temperatures. Sensor resolution and deployment are particularly important in obtaining robust flux estimates in upwelling conditions. Also, processing temperature time series data involves a sequence of complex steps, including filtering temperature signals, selection of appropriate thermal parameters, and selection of the optimal analytical solution for modeling. This review provides a synthesis of heat tracing using diurnal temperature oscillations, including details on optimal sensor selection and deployment, data processing, model parameterization, and an overview of computing tools available. Recent advances in diurnal temperature methods also provide the opportunity to determine local saturated thermal diffusivity, which can improve the accuracy of fluid flux modeling and sensor spacing, which is related to streambed scour and deposition. These parameters can also be used to determine the reliability of flux estimates from the use of heat as a tracer.
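
    A minimal sketch of one widely used diurnal-signal technique, the amplitude-ratio method of Hatch et al. (2006): given the ratio of diurnal temperature amplitudes at two depths, solve for the thermal front velocity and convert it to a Darcy flux. All parameter values are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    P = 86400.0      # diurnal period, s
    dz = 0.10        # vertical sensor spacing, m (assumed)
    kappa = 1.0e-6   # effective thermal diffusivity, m^2/s (assumed)
    Ar = 0.45        # measured ratio of deep to shallow diurnal amplitude (assumed)

    def amp_ratio(v):
        """Amplitude ratio predicted for thermal front velocity v (m/s, + down)."""
        alpha = np.sqrt(v ** 4 + (8.0 * np.pi * kappa / P) ** 2)
        return np.exp(dz / (2.0 * kappa) * (v - np.sqrt((alpha + v ** 2) / 2.0)))

    v = brentq(lambda vv: amp_ratio(vv) - Ar, -1e-4, 1e-4)   # solve for front velocity
    q = v * 3.0e6 / 4.18e6   # Darcy flux via bulk/water volumetric heat capacity ratio
    print(f"Darcy flux ≈ {q * 1e6:.2f} µm/s (positive downward)")
    ```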

  9. Real-time soft tissue motion estimation for lung tumors during radiotherapy delivery

    PubMed Central

    Rottmann, Joerg; Keall, Paul; Berbeco, Ross

    2013-01-01

    Purpose: To provide real-time lung tumor motion estimation during radiotherapy treatment delivery without the need for implanted fiducial markers or additional imaging dose to the patient. Methods: 2D radiographs from the therapy beam's-eye-view (BEV) perspective are captured at a frame rate of 12.8 Hz with a frame grabber allowing direct RAM access to the image buffer. An in-house developed real-time soft tissue localization algorithm is utilized to calculate soft tissue displacement from these images in real-time. The system is tested with a Varian TX linear accelerator and an AS-1000 amorphous silicon electronic portal imaging device operating at a resolution of 512 × 384 pixels. The accuracy of the motion estimation is verified with a dynamic motion phantom. Clinical accuracy was tested on lung SBRT images acquired at 2 fps. Results: Real-time lung tumor motion estimation from BEV images without fiducial markers is successfully demonstrated. For the phantom study, a mean tracking error <1.0 mm [root mean square (rms) error of 0.3 mm] was observed. The tracking rms accuracy on BEV images from a lung SBRT patient (≈20 mm tumor motion range) is 1.0 mm. Conclusions: The authors demonstrate for the first time real-time markerless lung tumor motion estimation from BEV images alone. The described system can operate at a frame rate of 12.8 Hz and does not require prior knowledge to establish traceable landmarks for tracking on the fly. The authors show that the geometric accuracy is similar to (or better than) previously published markerless algorithms not operating in real-time. PMID:24007146

  10. A minimalist approach to bias estimation for passive sensor measurements with targets of opportunity

    NASA Astrophysics Data System (ADS)

    Belfadel, Djedjiga; Osborne, Richard W.; Bar-Shalom, Yaakov

    2013-09-01

    In order to carry out data fusion, registration error correction is crucial in multisensor systems. This requires estimation of the sensor measurement biases. It is important to correct for these bias errors so that the multiple sensor measurements and/or tracks can be referenced as accurately as possible to a common tracking coordinate system. This paper provides a solution for bias estimation for the minimum number of passive sensors (two), when only targets of opportunity are available. The sensor measurements are assumed time-coincident (synchronous) and perfectly associated. Since these sensors provide only line of sight (LOS) measurements, the formation of a single composite Cartesian measurement obtained from fusing the LOS measurements from different sensors is needed to avoid the need for nonlinear filtering. We evaluate the Cramer-Rao Lower Bound (CRLB) on the covariance of the bias estimate, i.e., the quantification of the available information about the biases. Statistical tests on the results of simulations show that this method is statistically efficient, even for small sample sizes (as few as two sensors and six points on the trajectory of a single target of opportunity). We also show that the RMS position error is significantly improved with bias estimation compared with the target position estimation using the original biased measurements.
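
    The composite-measurement step can be illustrated with simple least-squares triangulation: the Cartesian position is taken as the point minimizing the squared distance to each measured line of sight. The sensor geometry below is invented for illustration, and the bias terms themselves are omitted.

    ```python
    # Fusing two LOS measurements into one composite Cartesian position
    # by least squares; a sketch of the geometry only, without the biases.
    import numpy as np

    def los_unit(az, el):
        """Unit LOS vector from azimuth/elevation in radians."""
        return np.array([np.cos(el) * np.cos(az),
                         np.cos(el) * np.sin(az),
                         np.sin(el)])

    def composite_position(sensor_positions, los_units):
        # Normal equations for the point closest to all LOS lines:
        # sum_i (I - u_i u_i^T) x = sum_i (I - u_i u_i^T) s_i
        A, b = np.zeros((3, 3)), np.zeros(3)
        for s, u in zip(sensor_positions, los_units):
            P = np.eye(3) - np.outer(u, u)
            A += P
            b += P @ s
        return np.linalg.solve(A, b)

    sensors = [np.array([0.0, 0.0, 0.0]), np.array([50e3, 0.0, 0.0])]
    target = np.array([20e3, 30e3, 5e3])
    units = [(target - s) / np.linalg.norm(target - s) for s in sensors]
    print(composite_position(sensors, units))  # recovers the target position
    ```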

  11. Solar Radiation Pressure Estimation and Analysis of a GEO Class of High Area-to-Mass Ratio Debris Objects

    NASA Technical Reports Server (NTRS)

    Kelecy, Tom; Payne, Tim; Thurston, Robin; Stansbery, Gene

    2007-01-01

    A population of deep space objects is thought to be high area-to-mass ratio (AMR) debris having origins from sources in the geosynchronous orbit (GEO) belt. The typical AMR values have been observed to range anywhere from 1's to 10's of m(sup 2)/kg, and hence, higher than average solar radiation pressure effects result in long-term migration of eccentricity (0.1-0.6) and inclination over time. However, the nature of the debris orientation-dependent dynamics also results in time-varying solar radiation forces about the average, which complicate the short-term orbit determination processing. The orbit determination results are presented for several of these debris objects, and highlight their unique and varied dynamic attributes. Estimation of the solar pressure dynamics over time scales suitable for resolving the shorter term dynamics improves the orbit estimation, and hence, the orbit predictions needed to conduct follow-up observations.

  12. Analytic model to estimate thermonuclear neutron yield in z-pinches using the magnetic Noh problem

    NASA Astrophysics Data System (ADS)

    Allen, Robert C.

    The objective was to build a model which could be used to estimate neutron yield in pulsed z-pinch experiments, to benchmark future z-pinch simulation tools, and to assist scaling for breakeven systems. To accomplish this, a recent solution to the magnetic Noh problem was utilized which incorporates a self-similar solution with cylindrical symmetry and azimuthal magnetic field (Velikovich, 2012). The self-similar solution provides the conditions needed to calculate the time-dependent implosion dynamics, from which batch burn is assumed and used to calculate neutron yield. The solution to the model is presented. The ion densities and time scales fix the initial mass and implosion velocity, providing estimates of the experimental results given specific initial conditions. Agreement is shown with experimental data (Coverdale, 2007). A parameter sweep was done to find the neutron yield, implosion velocity and gain for a range of densities and time scales for DD reactions, and a curve fit was done to predict the scaling as a function of preshock conditions.

  13. A proportional hazards regression model for the subdistribution with right-censored and left-truncated competing risks data

    PubMed Central

    Zhang, Xu; Zhang, Mei-Jie; Fine, Jason

    2012-01-01

    With competing risks failure time data, one often needs to assess the covariate effects on the cumulative incidence probabilities. Fine and Gray proposed a proportional hazards regression model to directly model the subdistribution of a competing risk. They developed the estimating procedure for right-censored competing risks data, based on the inverse probability of censoring weighting. Right-censored and left-truncated competing risks data sometimes occur in biomedical research. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with right-censored and left-truncated data. We adopt a new weighting technique to estimate the parameters in this model. We have derived the large sample properties of the proposed estimators. To illustrate the application of the new method, we analyze the failure time data for children with acute leukemia. In this example, the failure times for children who had bone marrow transplants were left truncated. PMID:21557288
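
    The inverse-probability-of-censoring weights at the heart of the estimating procedure can be sketched in a few lines: the censoring survivor function G is estimated by a Kaplan-Meier fit with the event roles reversed, and subjects failing from the cause of interest receive weight 1/G(T). The simulated data and the lifelines usage below are illustrative; this is only the weighting ingredient, not the full subdistribution-hazard fit.

    ```python
    # Kaplan-Meier censoring weights (IPCW) on simulated right-censored
    # competing-risks data; an ingredient sketch, not a full Fine-Gray fit.
    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(1)
    n = 200
    t_event = rng.exponential(5.0, n)              # latent failure time
    cause = rng.choice([1, 2], n, p=[0.6, 0.4])    # competing causes
    t_cens = rng.exponential(8.0, n)               # censoring time
    time = np.minimum(t_event, t_cens)
    status = np.where(t_event <= t_cens, cause, 0)  # 0 = censored

    # Estimate the censoring survivor function G by reversing event roles.
    kmf = KaplanMeierFitter()
    kmf.fit(time, event_observed=(status == 0))
    G = kmf.survival_function_at_times(time).to_numpy()
    weights = np.where(status == 1, 1.0 / np.clip(G, 1e-8, None), 0.0)
    print(weights[:5])
    ```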

  14. Evaluation of the return rate of volunteer blood donors

    PubMed Central

    Lourençon, Adriana de Fátima; Almeida, Rodrigo Guimarães dos Santos; Ferreira, Oranice; Martinez, Edson Zangiacomi

    2011-01-01

    Background Converting first-time blood donors into regular volunteer donors is a challenge for transfusion services. Objectives This study aims to estimate the return rate of first-time donors of the Ribeirão Preto Blood Center and of other blood centers in its coverage region. Methods The histories of 115,553 volunteer donors between 1996 and 2005 were analyzed. Statistical analysis was based on a parametric long-term survival model that allows an estimation of the proportion of donors who never return for further donations. Results Only 40% of individuals return within one year after the first donation and 53% return within two years. It is estimated that 30% never return to donate. Higher return rates were observed among Black donors. No significant difference was found in non-return rates regarding gender, blood type, Rh blood group and blood collection unit. Conclusions The low percentage of first-time donors who return for further blood donation reinforces the need for marketing actions and strategies aimed at increasing the return rates. PMID:23049294
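
    The "parametric long-term survival model" here is a cure-fraction model: the population survivor function is S(t) = pi + (1 - pi) * S0(t), where pi is the proportion who never return. A minimal sketch with an exponential latency distribution, fitted by maximum likelihood on simulated right-censored return times (not the donor data), follows:

    ```python
    # Mixture cure model S(t) = pi + (1 - pi) * exp(-lam * t), fitted by
    # maximum likelihood on simulated right-censored return times.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    n = 5000
    cured = rng.random(n) < 0.30               # true never-return fraction
    t_return = rng.exponential(1.5, n)         # years, for eventual returners
    t_latent = np.where(cured, np.inf, t_return)
    follow_up = rng.uniform(1.0, 10.0, n)      # administrative censoring
    time = np.minimum(t_latent, follow_up)
    returned = (t_latent <= follow_up).astype(float)

    def neg_loglik(params):
        pi = 1 / (1 + np.exp(-params[0]))      # cure fraction in (0, 1)
        lam = np.exp(params[1])                # exponential rate > 0
        log_f = np.log(1 - pi) + np.log(lam) - lam * time
        log_S = np.log(pi + (1 - pi) * np.exp(-lam * time))
        return -np.sum(returned * log_f + (1 - returned) * log_S)

    fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
    pi_hat = 1 / (1 + np.exp(-fit.x[0]))
    print(f"estimated never-return fraction: {pi_hat:.2f}")  # ~0.30
    ```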

  15. A confidential inquiry estimating the number of patients affected with sickle cell disease and thalassemia major confirms the need for a prevention strategy in the Netherlands.

    PubMed

    Giordano, Piero C; Bouva, Marelle J; Harteveld, Cornelis L

    2004-01-01

    We have conducted a broad confidential inquiry among 401 hospital departments, trying to estimate the number of patients affected with severe forms of hemoglobinopathies living in The Netherlands. With a response rate below 30%, we registered 559 patients in all age categories, of whom 77.0% are affected with sickle cell disease and 17.5% with beta-thalassemia (thal) major. We estimate that the real figure could be around 800 patients, more than six times the number published in 1995 on which the reluctance to offer prevention was based. The actual figures and the incidence estimate of approximately 60 patients a year underline the urgent need for the official implementation of a prevention strategy in The Netherlands. During the last 5 years we have been working towards the implementation of a multi-intervention strategy for primary prevention using the existing structures of public health. The obstacles we have encountered in endorsing such a strategy are discussed as a possible guide for other immigration countries.

  16. New agreement measures based on survival processes

    PubMed Central

    Guo, Ying; Li, Ruosha; Peng, Limin; Manatunga, Amita K.

    2013-01-01

    Summary The need to assess agreement arises in many scenarios in the biomedical sciences when measurements are taken by different methods on the same subjects. When the endpoints are survival outcomes, the study of agreement becomes more challenging given the special characteristics of time-to-event data. In this paper, we propose a new framework for assessing agreement based on survival processes that can be viewed as a natural representation of time-to-event outcomes. Our new agreement measure is formulated as the chance-corrected concordance between survival processes. It provides a new perspective for studying the relationship between correlated survival outcomes and offers an appealing interpretation as the agreement between survival times on the absolute distance scale. We provide a multivariate extension of the proposed agreement measure for multiple methods. Furthermore, the new framework enables a natural extension to evaluate time-dependent agreement structure. We develop nonparametric estimation of the proposed new agreement measures. Our estimators are shown to be strongly consistent and asymptotically normal. We evaluate the performance of the proposed estimators through simulation studies and then illustrate the methods using a prostate cancer data example. PMID:23844617

  17. Traveltime and longitudinal dispersion in Illinois streams

    USGS Publications Warehouse

    Graf, J.B.

    1984-01-01

    Twenty-seven measurements of traveltime and longitudinal dispersion in 10 Illinois streams provide data needed for estimating traveltime of peak concentration of a conservative solute, traveltime of the leading edge of a solute cloud, peak concentration resulting from a given quantity of solute, and passage time of solute past a given point on a stream for both measured and unmeasured streams. Traveltime of peak concentration and of the leading edge of the cloud are related to discharge at the downstream end of the reach, distance of travel, and the fraction of the time that discharge at a given location on the stream is equaled or exceeded. Peak concentration and passage time are best estimated from the relation of each to traveltime. In measured streams, dispersion efficiency is greater than that predicted by Fickian diffusion theory. The rate of decrease in peak concentration with traveltime is about equal to the rate of increase in passage time. Average velocity in a stream reach, given by the velocity of the center of solute mass in that reach, also can be estimated from an equation developed from measured values. (USGS)

  18. Interactions Between Boreal Summer Intraseasonal Oscillations and Synoptic-Scale Disturbances over the Western North Pacific, Part I: Energetics Diagnosis

    DTIC Science & Technology

    2010-08-23

  19. SIRT3 Is a Mitochondrial Tumor Suppressor and Genetic Loss Results in a Murine Model for ER/PR-Positive Mammary Tumors Connecting Metabolism and Carcinogenesis

    DTIC Science & Technology

    2011-09-01

  20. Angular-contact ball-bearing internal load estimation algorithm using runtime adaptive relaxation

    NASA Astrophysics Data System (ADS)

    Medina, H.; Mutu, R.

    2017-07-01

    An algorithm to estimate internal loads for single-row angular contact ball bearings due to externally applied thrust loads and high operating speeds is presented. A new runtime adaptive relaxation procedure and blending function is proposed which ensures algorithm stability whilst also reducing the number of iterations needed to reach convergence, leading to an average reduction in computation time in excess of 80%. The model is validated against a 218 angular contact bearing and shows excellent agreement with published results.
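
    The paper's relaxation and blending functions are specific to the bearing model; the underlying idea can be sketched generically as a fixed-point iteration whose relaxation factor grows while the residual shrinks and is cut back when the iteration starts to diverge. The growth/shrink constants below are arbitrary illustrative choices.

    ```python
    # Fixed-point iteration with runtime-adaptive relaxation: w grows while
    # the residual shrinks and is cut back when it grows.
    import numpy as np

    def solve_adaptive(g, x0, w=0.1, grow=1.2, shrink=0.5,
                       tol=1e-10, max_iter=500):
        x = np.asarray(x0, dtype=float)
        res_prev = np.inf
        for k in range(max_iter):
            x_new = (1 - w) * x + w * g(x)     # relaxed update
            res = np.linalg.norm(x_new - x)
            if res < tol:
                return x_new, k
            w = min(1.0, w * grow) if res < res_prev else max(1e-3, w * shrink)
            x, res_prev = x_new, res
        return x, max_iter

    # Toy fixed point x = cos(3x); the unrelaxed iteration diverges.
    x_star, iters = solve_adaptive(lambda x: np.cos(3 * x), x0=[0.5])
    print(x_star, iters)
    ```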

  1. Breast Cancer and Early Onset Childhood Obesity: Cell Specific Gene Expression in Mammary Epithelia and Adipocytes

    DTIC Science & Technology

    2006-07-01

  2. Estimation of excitation forces for wave energy converters control using pressure measurements

    NASA Astrophysics Data System (ADS)

    Abdelkhalik, O.; Zou, S.; Robinett, R.; Bacelli, G.; Wilson, D.

    2017-08-01

    Most control algorithms of wave energy converters require prediction of wave elevation or excitation force for a short future horizon, to compute the control in an optimal sense. This paper presents an approach that requires the estimation of the excitation force and its derivatives at present time with no need for prediction. An extended Kalman filter is implemented to estimate the excitation force. The measurements in this approach are selected to be the pressures at discrete points on the buoy surface, in addition to the buoy heave position. The pressures on the buoy surface are more directly related to the excitation force on the buoy as opposed to wave elevation in front of the buoy. These pressure measurements are also more accurate and easier to obtain. A singular arc control is implemented to compute the steady-state control using the estimated excitation force. The estimated excitation force is expressed in the Laplace domain and substituted in the control, before the latter is transformed to the time domain. Numerical simulations are presented for a Bretschneider wave case study.
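
    As a structural illustration only, the sketch below augments a toy heave oscillator's state with the unknown excitation force, modeled as a random walk, and recovers it with a linear Kalman filter from noisy heave and pressure-proxy measurements. The dynamics, the pressure-to-force map, and all numbers are simplified assumptions, not the paper's hydrodynamic model (which uses an extended Kalman filter).

    ```python
    # Toy heave oscillator with the unknown excitation force f carried as
    # a random-walk state and estimated by a linear Kalman filter.
    import numpy as np

    dt, m, k, c = 0.01, 1e5, 5e5, 2e4       # step (s), mass, stiffness, damping
    # state x = [heave z, velocity w, excitation force f]
    F = np.array([[1.0, dt, 0.0],
                  [-k / m * dt, 1.0 - c / m * dt, dt / m],
                  [0.0, 0.0, 1.0]])         # f treated as a random walk
    H = np.array([[1.0, 0.0, 0.0],          # heave position measurement
                  [0.0, 0.0, 1e-4]])        # toy pressure ~ force map
    Q = np.diag([0.0, 0.0, (5e3) ** 2]) * dt
    R = np.diag([1e-4, 1e-2])

    rng = np.random.default_rng(0)
    x_true, x_hat, P = np.zeros(3), np.zeros(3), np.eye(3) * 1e6
    for step in range(2000):
        x_true[2] = 2e5 * np.sin(2 * np.pi * 0.1 * step * dt)  # "true" force
        x_true[:2] = F[:2] @ x_true
        z = H @ x_true + rng.normal(0.0, [1e-2, 1e-1])
        x_hat = F @ x_hat                   # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                 # update
        K = P @ H.T @ np.linalg.inv(S)
        x_hat = x_hat + K @ (z - H @ x_hat)
        P = (np.eye(3) - K @ H) @ P
    print(f"final force error: {x_true[2] - x_hat[2]:.3e} N")
    ```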

  3. Compile-time estimation of communication costs in multicomputers

    NASA Technical Reports Server (NTRS)

    Gupta, Manish; Banerjee, Prithviraj

    1991-01-01

    An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.

  4. Seasonal Variability in Global Eddy Diffusion and the Effect on Thermospheric Neutral Density

    NASA Astrophysics Data System (ADS)

    Pilinski, M.; Crowley, G.

    2014-12-01

    We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time between January 2004 and January 2008 were estimated from residuals of neutral density measurements made by the CHallenging Minisatellite Payload (CHAMP) and simulations made using the Thermosphere Ionosphere Mesosphere Electrodynamics - Global Circulation Model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy-diffusivity models. The eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the RMS difference between the TIME-GCM model and density data from a variety of satellites is reduced by an average of 5%. This result indicates that global thermospheric density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates how eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are some limitations of this method, which are discussed, including that the latitude-dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion consistent with diffusion observations made by other techniques.

  5. Seasonal variability in global eddy diffusion and the effect on neutral density

    NASA Astrophysics Data System (ADS)

    Pilinski, M. D.; Crowley, G.

    2015-04-01

    We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time were estimated from residuals of neutral density measurements made by the Challenging Minisatellite Payload (CHAMP) and simulations made using the thermosphere-ionosphere-mesosphere electrodynamics global circulation model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy diffusivity models. Eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the root-mean-square sum for the TIME-GCM model is reduced by an average of 5% when compared to density data from a variety of satellites, indicating that the fidelity of global density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates that eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are limitations to this method, which are discussed, including that the latitude dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion which is also consistent with diffusion observations made by other techniques.

  6. Determination of the mechanical and physical properties of cartilage by coupling poroelastic-based finite element models of indentation with artificial neural networks.

    PubMed

    Arbabi, Vahid; Pouran, Behdad; Campoli, Gianni; Weinans, Harrie; Zadpoor, Amir A

    2016-03-21

    One of the most widely used techniques to determine the mechanical properties of cartilage is based on indentation tests and interpretation of the obtained force-time or displacement-time data. In the current computational approaches, one needs to simulate the indentation test with finite element models and use an optimization algorithm to estimate the mechanical properties of cartilage. The modeling procedure is cumbersome, and the simulations need to be repeated for every new experiment. For the first time, we propose a method for fast and accurate estimation of the mechanical and physical properties of cartilage as a poroelastic material with the aid of artificial neural networks. In our study, we used finite element models to simulate the indentation for poroelastic materials with wide combinations of mechanical and physical properties. The obtained force-time curves are then divided into three parts: the first two parts of the data are used for training and validation of an artificial neural network, while the third part is used for testing the trained network. The trained neural network receives the force-time curves as the input and provides the properties of cartilage as the output. We observed that the trained network could accurately predict the properties of cartilage within the range of properties for which it was trained. The mechanical and physical properties of cartilage could therefore be estimated very fast, since no additional finite element modeling is required once the neural network is trained. The robustness of the trained artificial neural network in determining the properties of cartilage based on noisy force-time data was assessed by introducing noise to the simulated force-time data. We found that the training procedure could be optimized so as to maximize the robustness of the neural network against noisy force-time data. Copyright © 2016 Elsevier Ltd. All rights reserved.
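
    The surrogate idea can be sketched end to end with a toy forward model standing in for the finite element simulations: generate force-time curves over a grid of material parameters, train a network to invert them, then score it on unseen curves. The two-parameter relaxation function and the scikit-learn network below are illustrative stand-ins, not the paper's models.

    ```python
    # Toy forward model standing in for the FE simulations; a neural
    # network learns the inverse map from curve to parameters.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 50)

    def forward(E, tau):                   # stand-in for FE indentation
        return E * (0.3 + 0.7 * np.exp(-t / tau))

    params = np.column_stack([rng.uniform(0.1, 2.0, 3000),    # "stiffness"
                              rng.uniform(0.5, 5.0, 3000)])   # "relax. time"
    curves = np.array([forward(E, tau) for E, tau in params])
    curves += rng.normal(0, 0.01, curves.shape)               # noise

    X_tr, X_te, y_tr, y_te = train_test_split(curves, params, random_state=0)
    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                       random_state=0)
    net.fit(X_tr, y_tr)
    print("R^2 on held-out curves:", net.score(X_te, y_te))
    ```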

  7. Long-term Observation of Soil Creep Activity around a Landslide Scar

    EPA Science Inventory

    Rate of sediment infilling into landslide scars by soil creep is needed to estimate the timing of subsequent landslide activity at a particular site. However, knowledge about the spatial distribution of its activity around the landslide scar is scarce. Additionally, there are few...

  8. State of Technology for Rehabilitation of Water Distribution Systems

    EPA Science Inventory

    The impact that the lack of investment in water infrastructure will have on the performance of aging underground infrastructure over time is well documented and the needed funding estimates range as high as $325 billion over the next 20 years. With the current annual replacement...

  9. Spatio-temporal distribution of soil-transmitted helminth infections in Brazil.

    PubMed

    Chammartin, Frédérique; Guimarães, Luiz H; Scholte, Ronaldo Gc; Bavia, Mara E; Utzinger, Jürg; Vounatsou, Penelope

    2014-09-18

    In Brazil, preventive chemotherapy targeting soil-transmitted helminthiasis is being scaled up. Hence, spatially explicit estimates of infection risks providing information about the current situation are needed to guide interventions. Available high-resolution national model-based estimates either rely on analyses of data restricted to a given period of time, or on historical data collected over a longer period. While efforts have been made to take into account the spatial structure of the data in the modelling approach, little emphasis has been placed on the temporal dimension. We extracted georeferenced survey data on the prevalence of infection with soil-transmitted helminths (i.e. Ascaris lumbricoides, hookworm and Trichuris trichiura) in Brazil from the Global Neglected Tropical Diseases (GNTD) database. Selection of the most important predictors of infection risk was carried out using a Bayesian geostatistical approach and temporal models that address non-linearity and correlation of the explanatory variables. The spatial process was estimated through a predictive process approximation. Spatio-temporal models were built on the selected predictors with integrated nested Laplace approximation using stochastic partial differential equations. Our models revealed that, over the past 20 years, the risk of soil-transmitted helminth infection has decreased in Brazil, mainly because of the reduction of A. lumbricoides and hookworm infections. From 2010 onwards, we estimate that the infection prevalences with A. lumbricoides, hookworm and T. trichiura are 3.6%, 1.7% and 1.4%, respectively. We also provide a map highlighting municipalities in need of preventive chemotherapy, based on a predicted soil-transmitted helminth infection risk in excess of 20%. The need for treatments in the school-aged population at the municipality level was estimated at 1.8 million doses of anthelminthic tablets per year. The analysis of the spatio-temporal aspect of the risk of infection with soil-transmitted helminths contributes to a better understanding of the evolution of risk over time. Risk estimates provide the soil-transmitted helminthiasis control programme in Brazil with useful benchmark information for prioritising and improving spatial and temporal targeting of interventions.

  10. Applying Deep Learning in Medical Images: The Case of Bone Age Estimation.

    PubMed

    Lee, Jang Hyung; Kim, Kwang Gi

    2018-01-01

    A diagnostic need often arises to estimate bone age from X-ray images of the hand of a subject during the growth period. Together with measured physical height, such information may be used as an indicator for the height growth prognosis of the subject. We present a way to apply the deep learning technique to medical image analysis using hand bone age estimation as an example. Age estimation was formulated as a regression problem with hand X-ray images as input and estimated age as output. A set of hand X-ray images was used to form a training set with which a regression model was trained. An image preprocessing procedure is described which reduces image variations across data instances that are unrelated to age-wise variation. The use of Caffe, a deep learning tool, is demonstrated. A rather simple deep learning network was adopted and trained for tutorial purposes. A test set distinct from the training set was formed to assess the validity of the approach. The measured mean absolute difference value was 18.9 months, and the concordance correlation coefficient was 0.78. It is shown that the proposed deep learning-based neural network can be used to estimate a subject's age from hand X-ray images, which eliminates the need for tedious atlas look-ups in clinical environments and should improve the time and cost efficiency of the estimation process.
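
    The paper demonstrates Caffe; as an illustrative modern equivalent, the sketch below defines a minimal convolutional regressor mapping a preprocessed hand radiograph to an age in months. The input size, depth, loss, and random stand-in data are assumptions for a sketch, not the paper's network.

    ```python
    # Minimal CNN regression for age-in-months from a hand radiograph.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    model = keras.Sequential([
        layers.Input((256, 256, 1)),
        layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),                       # estimated age in months
    ])
    model.compile(optimizer="adam", loss="mae")  # MAE in months

    # Hypothetical training call: x is N preprocessed radiographs, y ages.
    x = np.random.rand(8, 256, 256, 1).astype("float32")
    y = np.random.uniform(60, 200, 8).astype("float32")
    model.fit(x, y, epochs=1, verbose=0)
    ```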

  11. Large-scale assessment of soil erosion in Africa: satellites help to jointly account for dynamic rainfall and vegetation cover

    NASA Astrophysics Data System (ADS)

    Vrieling, Anton; Hoedjes, Joost C. B.; van der Velde, Marijn

    2015-04-01

    Efforts to map and monitor soil erosion need to account for the erratic nature of the soil erosion process. Soil erosion by water occurs on sloped terrain when erosive rainfall and consequent surface runoff impact soils that are not well-protected by vegetation or other soil protective measures. Both rainfall erosivity and vegetation cover are highly variable through space and time. Due to data paucity and the relative ease of spatially overlaying geographical data layers into existing models like USLE (Universal Soil Loss Equation), many studies and mapping efforts merely use average annual values for erosivity and vegetation cover as input. We first show that rainfall erosivity can be estimated from satellite precipitation data. We obtained average annual erosivity estimates from 15 yr of 3-hourly TRMM Multi-satellite Precipitation Analysis (TMPA) data (1998-2012) using intensity-erosivity relationships. Our estimates showed a positive correlation (r = 0.84) with long-term annual erosivity values of 37 stations obtained from literature. Using these TMPA erosivity retrievals, we demonstrate the large interannual variability, with maximum annual erosivity often exceeding two to three times the mean value, especially in semi-arid areas. We then calculate erosivity at a 10-daily time-step and combine this with vegetation cover development for selected locations in Africa using NDVI - normalized difference vegetation index - time series from SPOT VEGETATION. Although we do not integrate the data at this point, the joint analysis of both variables stresses the need for joint accounting for erosivity and vegetation cover for large-scale erosion assessment and monitoring.
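
    The intensity-erosivity step can be sketched as follows: per-interval kinetic energy from the Brown and Foster (1987) relation, summed over the storm and multiplied by a peak-intensity term. Because 3-hourly data cannot resolve the true maximum 30-minute intensity, the maximum 3-hourly intensity stands in below as a deliberately coarse proxy; the storm depths are invented.

    ```python
    # Event erosivity from 3-hourly depths: unit kinetic energy from
    # Brown and Foster (1987), times a coarse peak-intensity proxy.
    import numpy as np

    def event_erosivity(p_mm, dt_h=3.0):
        """Erosivity (MJ mm ha^-1 h^-1) of one storm from interval depths."""
        p_mm = np.asarray(p_mm, dtype=float)
        i = p_mm / dt_h                               # intensity, mm/h
        e = 0.29 * (1 - 0.72 * np.exp(-0.05 * i))     # MJ ha^-1 mm^-1
        i30_proxy = i.max()                           # crude I30 stand-in
        return np.sum(e * p_mm) * i30_proxy

    storm = [2.0, 10.0, 25.0, 6.0, 1.0]               # mm per 3-h interval
    print(f"event erosivity ~ {event_erosivity(storm):.1f} MJ mm/(ha h)")
    ```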

  12. Stated time preferences for health: a systematic review and meta analysis of private and social discount rates.

    PubMed

    Mahboub-Ahari, Alireza; Pourreza, Abolghasem; Sari, Ali Akbari; Rahimi Foroushani, Abbas; Heydari, Hassan

    2014-01-01

    The present study aimed to provide better insight into methodological issues related to time preference studies, and to estimate private and social discount rates, using a rigorous systematic review and meta-analysis. We searched the PubMed, EMBASE and ProQuest databases in June 2013. All studies that estimated private and social time preference rates for health outcomes through a stated preference approach were eligible for inclusion. We conducted both fixed and random effects meta-analyses using the mean discount rate and standard deviation of the included studies. The I-squared statistic was used to test heterogeneity across studies. Private and social discount rates were estimated separately using Stata 11 software. Out of 44 screened full texts, 8 population-based empirical studies were included in the qualitative synthesis. Reported time preference rates were from 0.036 to 0.07 for own health and from 0.04 to 0.2 for social health. Private and social discount rates were estimated at 0.056 (95% CI: 0.038, 0.074) and 0.066 (95% CI: 0.064, 0.068), respectively. Given the impact of time preference on healthy behaviors and the timing issues involved, individuals' time preference should be taken into account as a key determinant of policy making. Direct translation of elicited discount rates into official discount rates remains questionable. Decisions about the proper discount rate for the health context may need a cross-party consensus among health economists and policy makers.
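
    The pooling step behind such estimates can be sketched with inverse-variance fixed-effect and DerSimonian-Laird random-effects means; the study-level rates and standard errors below are invented placeholders, not the reviewed studies' values.

    ```python
    # Fixed-effect and DerSimonian-Laird random-effects pooling with I^2.
    import numpy as np

    rates = np.array([0.036, 0.05, 0.06, 0.07])   # hypothetical study means
    se = np.array([0.008, 0.01, 0.006, 0.012])    # hypothetical std. errors

    w = 1 / se**2
    fixed = np.sum(w * rates) / np.sum(w)
    Q = np.sum(w * (rates - fixed) ** 2)          # heterogeneity statistic
    df = len(rates) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)
    random = np.sum(w_re * rates) / np.sum(w_re)
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    print(f"fixed: {fixed:.3f}, random: {random:.3f}, I^2: {i2:.0f}%")
    ```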

  13. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid 2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimation of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. 
This report describes the development of a country- or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently operating PAGER system. PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of the earthquake) to implement on a global basis and will thus take more time to develop and implement within the PAGER system.
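
    The loss computation described above reduces to a short calculation: GDP-based exposure per shaking-intensity bin times a country-specific lognormal loss ratio. The theta/beta parameters, exposure correction factor, and population counts below are invented for illustration; PAGER calibrates these against losses from past earthquakes.

    ```python
    # Exposure-times-loss-ratio sketch of the empirical loss model.
    import numpy as np
    from scipy.stats import norm

    def economic_loss(pop_by_mmi, gdp_per_capita, alpha, theta, beta):
        """pop_by_mmi: {MMI: exposed population}; alpha: exposure correction
        factor (wealth-to-GDP); loss ratio = Phi(ln(MMI / theta) / beta)."""
        total = 0.0
        for mmi, pop in pop_by_mmi.items():
            exposure = alpha * gdp_per_capita * pop
            ratio = norm.cdf(np.log(mmi / theta) / beta)
            total += exposure * ratio
        return total

    pop_by_mmi = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 10_000}
    loss = economic_loss(pop_by_mmi, 5_000, 3.0, 9.5, 0.3)
    print(f"expected loss ~ ${loss / 1e9:.2f} billion")
    ```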

  14. Continuous Shape Estimation of Continuum Robots Using X-ray Images

    PubMed Central

    Lobaton, Edgar J.; Fu, Jinghua; Torres, Luis G.; Alterovitz, Ron

    2015-01-01

    We present a new method for continuously and accurately estimating the shape of a continuum robot during a medical procedure using a small number of X-ray projection images (e.g., radiographs or fluoroscopy images). Continuum robots have curvilinear structure, enabling them to maneuver through constrained spaces by bending around obstacles. Accurately estimating the robot’s shape continuously over time is crucial for the success of procedures that require avoidance of anatomical obstacles and sensitive tissues. Online shape estimation of a continuum robot is complicated by uncertainty in its kinematic model, movement of the robot during the procedure, noise in X-ray images, and the clinical need to minimize the number of X-ray images acquired. Our new method integrates kinematics models of the robot with data extracted from an optimally selected set of X-ray projection images. Our method represents the shape of the continuum robot over time as a deformable surface which can be described as a linear combination of time and space basis functions. We take advantage of probabilistic priors and numeric optimization to select optimal camera configurations, thus minimizing the expected shape estimation error. We evaluate our method using simulated concentric tube robot procedures and demonstrate that obtaining between 3 and 10 images from viewpoints selected by our method enables online shape estimation with errors significantly lower than using the kinematic model alone or using randomly spaced viewpoints. PMID:26279960

  15. Continuous Shape Estimation of Continuum Robots Using X-ray Images.

    PubMed

    Lobaton, Edgar J; Fu, Jinghua; Torres, Luis G; Alterovitz, Ron

    2013-05-06

    We present a new method for continuously and accurately estimating the shape of a continuum robot during a medical procedure using a small number of X-ray projection images (e.g., radiographs or fluoroscopy images). Continuum robots have curvilinear structure, enabling them to maneuver through constrained spaces by bending around obstacles. Accurately estimating the robot's shape continuously over time is crucial for the success of procedures that require avoidance of anatomical obstacles and sensitive tissues. Online shape estimation of a continuum robot is complicated by uncertainty in its kinematic model, movement of the robot during the procedure, noise in X-ray images, and the clinical need to minimize the number of X-ray images acquired. Our new method integrates kinematics models of the robot with data extracted from an optimally selected set of X-ray projection images. Our method represents the shape of the continuum robot over time as a deformable surface which can be described as a linear combination of time and space basis functions. We take advantage of probabilistic priors and numeric optimization to select optimal camera configurations, thus minimizing the expected shape estimation error. We evaluate our method using simulated concentric tube robot procedures and demonstrate that obtaining between 3 and 10 images from viewpoints selected by our method enables online shape estimation with errors significantly lower than using the kinematic model alone or using randomly spaced viewpoints.

  16. Model-based filtering for artifact and noise suppression with state estimation for electrodermal activity measurements in real time.

    PubMed

    Tronstad, Christian; Staal, Odd M; Saelid, Steinar; Martinsen, Orjan G

    2015-08-01

    Measurement of electrodermal activity (EDA) has recently made a transition from the laboratory into daily life with the emergence of wearable devices. Movement and nongelled electrodes make these devices more susceptible to noise and artifacts. In addition, real-time interpretation of the measurement is needed for user feedback. The Kalman filter approach may conveniently deal with both these issues. This paper presents a biophysical model for EDA implemented in an extended Kalman filter. Employing the filter on data from PhysioNet along with simulated noise and artifacts demonstrates noise and artifact suppression while implicitly providing estimates of model states and parameters such as the sudomotor nerve activation.

  17. Implementation of Kalman filter algorithm on models reduced using singular perturbation approximation method and its application to measurement of water level

    NASA Astrophysics Data System (ADS)

    Rachmawati, Vimala; Khusnul Arif, Didik; Adzkiya, Dieky

    2018-03-01

    The systems contained in the universe often have a large order. Thus, the mathematical model has many state variables that affect the computation time. In addition, generally not all variables are known, so estimation is needed to determine the magnitude of system states that cannot be measured directly. In this paper, we discuss the model reduction and estimation of state variables in a river system to measure the water level. The model reduction of a system is an approximation method that yields a lower-order system without significant errors but with dynamic behaviour similar to the original system. The Singular Perturbation Approximation method is one of the model reduction methods, where all state variables of the equilibrium system are partitioned into fast and slow modes. Then, the Kalman filter algorithm is used to estimate state variables of stochastic dynamic systems, where estimates are computed by predicting state variables based on system dynamics and measurement data. Kalman filters are used to estimate state variables in the original system and the reduced system. Then, we compare the estimation results of the states and the computational time between the original and reduced systems.

  18. A general dead-time correction method based on live-time stamping. Application to the measurement of short-lived radionuclides.

    PubMed

    Chauvenet, B; Bobin, C; Bouchard, J

    2017-12-01

    Dead-time correction formulae are established in the general case of superimposed non-homogeneous Poisson processes. Based on the same principles as conventional live-timed counting, this method exploits the additional information made available using digital signal processing systems, and especially the possibility to store the time stamps of live-time intervals. No approximation needs to be made to obtain those formulae. Estimates of the variances of corrected rates are also presented. This method is applied to the activity measurement of short-lived radionuclides. Copyright © 2017 Elsevier Ltd. All rights reserved.
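
    The principle can be sketched by simulating a counter with a non-extending dead time, storing the live intervals, and dividing counts by accumulated live time; the event rate, dead time, and Poisson-based uncertainty below are illustrative, not the paper's general formulae.

    ```python
    # Live-time stamping: store live intervals, divide counts by live time.
    import numpy as np

    rng = np.random.default_rng(2)
    true_rate, dead_time, t_total = 5e4, 5e-6, 10.0   # Hz, s, s

    arrivals = np.cumsum(rng.exponential(1 / true_rate,
                                         int(2 * true_rate * t_total)))
    arrivals = arrivals[arrivals < t_total]

    live_stamps, next_live, counted = [], 0.0, 0
    for t in arrivals:
        if t >= next_live:                 # event falls in a live period
            live_stamps.append((next_live, t))
            next_live = t + dead_time      # non-extending dead time
            counted += 1
    if next_live < t_total:                # trailing live interval
        live_stamps.append((next_live, t_total))

    live_time = sum(b - a for a, b in live_stamps)
    rate = counted / live_time
    print(f"corrected rate: {rate:.0f} Hz (true {true_rate:.0f}), "
          f"rel. std ~ {1 / np.sqrt(counted):.1%}")
    ```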

  19. Time-frequency analysis of backscattered signals from diffuse radar targets

    NASA Astrophysics Data System (ADS)

    Kenny, O. P.; Boashash, B.

    1993-06-01

    The need for analysis of time-varying signals has led to the formulation of a class of joint time-frequency distributions (TFDs). One of these TFDs, the Wigner-Ville distribution (WVD), has useful properties which can be applied to radar imaging. The authors discuss time-frequency representation of the backscattered signal from a diffuse radar target. It is then shown that for point scatterers which are statistically dependent or for which the reflectivity coefficient has a nonzero mean value, reconstruction using time of flight positron emission tomography on time-frequency images is effective for estimating the scattering function of the target.
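
    For reference, a minimal discrete Wigner-Ville distribution of a single analytic signal can be computed as below; practical radar use adds smoothing (pseudo-WVD) to tame cross-terms, which this sketch omits. The chirp test signal is illustrative.

    ```python
    # Discrete Wigner-Ville distribution of one analytic signal.
    import numpy as np
    from scipy.signal import hilbert

    def wvd(x):
        z = hilbert(x)                     # analytic signal
        n = len(z)
        W = np.zeros((n, n))
        for t in range(n):
            taumax = min(t, n - 1 - t)
            tau = np.arange(-taumax, taumax + 1)
            kernel = np.zeros(n, dtype=complex)
            kernel[tau % n] = z[t + tau] * np.conj(z[t - tau])
            W[:, t] = np.fft.fft(kernel).real
        return W                           # rows: frequency, cols: time

    fs = 256
    t = np.arange(fs) / fs
    chirp = np.cos(2 * np.pi * (20 * t + 40 * t**2))  # linear FM test signal
    W = wvd(chirp)
    print(W.shape, W[:, 64].argmax())      # ridge tracks instantaneous freq.
    ```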

  20. Comparing epidemiologically estimated treatment need with treatment provided in two dental schemes in Ireland

    PubMed Central

    2012-01-01

    Background Valid estimation of dental treatment needed at population level is important for service planning. In many instances, planning is informed by survey data, which provide epidemiologically estimated need from the dental fieldworkers’ perspective. The aim of this paper is to determine the validity of this type of information for planning. Normative (epidemiologically estimated) need for selected treatments, as measured on a randomly selected representative sample, is compared with treatment actually provided in the population from which the sample was drawn. Methods This paper compares dental treatment need-estimates, from a national survey, with treatment provided within two choice-of-dentist schemes: Scheme 1, a co-payment scheme for employed adults, and Scheme 2, a ‘free’ service for less-well-off adults. Epidemiologically estimated need for extractions, restorations, advanced restorations and denture treatments was recorded for a nationally representative sample in 2000/02. Treatments provided to employed and less-well-off adults were retrieved from the claims databases for both schemes. We used the chi-square test to compare proportions, and Student’s t-test to compare means, between the survey and claims databases. Results Among employed adults, the proportion of 35-44-year-olds whose teeth had restorations was greater than estimated as needed in the survey (55.7% vs. 36.7%; p < 0.0001). The mean number of teeth extracted was less than estimated as needed among 35-44 and 65+ year-olds. Among less-well-off adults, the proportion of 16-24-year-olds who had teeth extracted was greater than estimated as needed in the survey (27.4% vs. 7.9%; p < 0.0001). The mean number of restorations provided was greater than estimated as needed in the survey for 16-24-year-olds (3.0 vs. 0.9; p < 0.0001) and 35-44-year-olds (2.7 vs. 1.4; p < 0.01). Conclusions Significant differences were found between epidemiologically estimated need and treatment provided for selected treatments, which may be accounted for by measurement differences. The gap between epidemiologically estimated need and treatment provided seems to be greatest for less-well-off adults. PMID:22898307

  1. Scaling in Free-Swimming Fish and Implications for Measuring Size-at-Time in the Wild

    PubMed Central

    Broell, Franziska; Taggart, Christopher T.

    2015-01-01

    This study was motivated by the need to measure size-at-age, and thus growth rate, in fish in the wild. We postulated that this could be achieved using accelerometer tags based first on early isometric scaling models that hypothesize that similar animals should move at the same speed with a stroke frequency that scales with length^-1, and second on observations that the speed of primarily air-breathing free-swimming animals, presumably swimming ‘efficiently’, is independent of size, confirming that stroke frequency scales as length^-1. However, such scaling relations between size and swimming parameters for fish remain mostly theoretical. Based on free-swimming saithe and sturgeon tagged with accelerometers, we introduce a species-specific scaling relationship between dominant tail beat frequency (TBF) and fork length. Dominant TBF was proportional to length^-1 (r^2 = 0.73, n = 40), and estimated swimming speed within species was independent of length. Similar scaling relations accrued in relation to body mass^-0.29. We demonstrate that the dominant TBF can be used to estimate size-at-time and that accelerometer tags with onboard processing may be able to provide size-at-time estimates among free-swimming fish and thus the estimation of growth rate (change in size-at-time) in the wild. PMID:26673777
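
    The size-at-time idea can be sketched directly: recover the dominant tail-beat frequency from an acceleration record by FFT and invert a species-specific scaling TBF = a/L for fork length. The scaling constant, sampling rate, and synthetic signal below are invented; the paper fits the relationship per species.

    ```python
    # Dominant TBF from an acceleration record, inverted for body length.
    import numpy as np

    fs = 50.0                                    # accelerometer rate, Hz
    t = np.arange(0, 60, 1 / fs)
    true_L, a = 0.60, 1.8                        # m; hypothetical constant
    tbf_true = a / true_L                        # Hz, TBF ~ length^-1
    accel = (np.sin(2 * np.pi * tbf_true * t)
             + 0.5 * np.random.default_rng(3).normal(size=t.size))

    spec = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    band = (freqs > 0.5) & (freqs < 10)          # plausible swimming band
    tbf_hat = freqs[band][spec[band].argmax()]
    print(f"dominant TBF {tbf_hat:.2f} Hz -> length {a / tbf_hat:.2f} m")
    ```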

  2. A time-lapse photography method for monitoring salmon (Oncorhynchus spp.) passage and abundance in streams

    PubMed Central

    Leacock, William B.; Eby, Lisa A.; Stanford, Jack A.

    2016-01-01

    Accurately estimating population sizes is often a critical component of fisheries research and management. Although there is a growing appreciation of the importance of small-scale salmon population dynamics to the stability of salmon stock-complexes, our understanding of these populations is constrained by a lack of efficient and cost-effective monitoring tools for streams. Weirs are expensive, labor intensive, and can disrupt natural fish movements. While conventional video systems avoid some of these shortcomings, they are expensive and require excessive amounts of labor to review footage for data collection. Here, we present a novel method for quantifying salmon in small streams (<15 m wide, <1 m deep) that uses both time-lapse photography and video in a model-based double sampling scheme. This method produces an escapement estimate nearly as accurate as a video-only approach, but with substantially less labor, money, and effort. It requires servicing only every 14 days, detects salmon 24 h/day, is inexpensive, and produces escapement estimates with confidence intervals. In addition to escapement estimation, we present a method for estimating in-stream salmon abundance across time, data needed by researchers interested in predator-prey interactions or nutrient subsidies. We combined daily salmon passage estimates with stream-specific estimates of daily mortality developed using previously published data. To demonstrate proof of concept for these methods, we present results from two streams in southwest Kodiak Island, Alaska, in which high densities of sockeye salmon spawn. PMID:27326378

  3. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
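
    The selection step can be sketched as a greedy D-optimal search: add, one point at a time, the candidate that most increases det(J^T J), where J holds the sensitivities of the decay model to its parameters. The bi-exponential decay and parameter values below are generic stand-ins for the FLIM-FRET model, not the paper's exact formulation.

    ```python
    # Greedy D-optimal subset selection over candidate time points for a
    # bi-exponential decay I(t) = A*exp(-t/tau1) + (1-A)*exp(-t/tau2).
    import numpy as np

    t_full = np.linspace(0.05, 10, 90)     # ns; the "complete" 90 points
    A, tau1, tau2 = 0.4, 0.5, 2.5          # quenched fraction, lifetimes

    def sensitivities(t):
        e1, e2 = np.exp(-t / tau1), np.exp(-t / tau2)
        return np.column_stack([e1 - e2,                      # d/dA
                                A * t / tau1**2 * e1,         # d/dtau1
                                (1 - A) * t / tau2**2 * e2])  # d/dtau2

    J_full = sensitivities(t_full)
    chosen = []
    for _ in range(10):                    # reduced design of 10 points
        best, best_det = None, -np.inf
        for i in range(len(t_full)):
            if i in chosen:
                continue
            J = J_full[chosen + [i]]
            d = np.linalg.det(J.T @ J + 1e-12 * np.eye(3))  # regularized
            if d > best_det:
                best, best_det = i, d
        chosen.append(best)
    print("D-optimal time points (ns):", np.sort(t_full[chosen]).round(2))
    ```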

  4. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.

  5. Course and predictors of supportive care needs among Mexican breast cancer patients: A longitudinal study.

    PubMed

    Pérez-Fortis, Adriana; Fleer, Joke; Schroevers, Maya J; López, Patricia Alanís; Sosa, Juan José Sánchez; Eulenburg, Christine; Ranchor, Adelita V

    2018-05-26

    This study examined the course and predictors of supportive care needs among Mexican breast cancer patients for different cancer treatment trajectories. Data from 172 (66.4% response rate) patients were considered in this observational longitudinal study. Participants were measured after diagnosis, neoadjuvant treatment, surgery, adjuvant treatment and the first post-treatment follow-up visit. Psychological, Health System and Information, Physical and Daily Living, Patient Care and Support, Sexual, and Additional care needs were measured with the Supportive Care Needs Survey (SCNS-SF34). Linear mixed models with maximum-likelihood estimation were computed. The course of supportive care needs was similar across the different cancer treatment trajectories. Supportive care needs declined significantly from diagnosis to the first post-treatment follow-up visit. Health System and Information care needs were the highest needs over time. Depressive symptoms and time since diagnosis were the most consistent predictors of changes in course of supportive care needs of these patients. Health system and information care needs of Mexican breast cancer patients need to be addressed with priority because these needs are the least met. Furthermore, patients with high depressive symptoms at the start of the disease trajectory have greater needs for supportive care throughout the disease trajectory. This article is protected by copyright. All rights reserved.

  6. Time distortion when users at-risk for social media addiction engage in non-social media tasks.

    PubMed

    Turel, Ofir; Brevers, Damien; Bechara, Antoine

    2018-02-01

    There is a growing concern over the addictiveness of Social Media use. Additional representative indicators of impaired control are needed in order to distinguish presumed social media addiction from normal use. (1) To examine the existence of time distortion during non-social media use tasks that involve social media cues among those who may be considered at-risk for social media addiction. (2) To examine the usefulness of this distortion for at-risk vs. low/no-risk classification. We used a task that prevented Facebook use and invoked Facebook reflections (survey on self-control strategies) and subsequently measured estimated vs. actual task completion time. We captured the level of addiction using the Bergen Facebook Addiction Scale in the survey, and we used a common cutoff criterion to classify people as at-risk vs. low/no-risk of Facebook addiction. The at-risk group presented significant upward time estimate bias and the low/no-risk group presented significant downward time estimate bias. The bias was positively correlated with Facebook addiction scores. It was efficacious, especially when combined with self-reported estimates of extent of Facebook use, in classifying people to the two categories. Our study points to a novel, easy to obtain, and useful marker of at-risk for social media addiction, which may be considered for inclusion in diagnosis tools and procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Scientist/AMPS equipment interface study

    NASA Technical Reports Server (NTRS)

    Anderson, H. R.

    1977-01-01

    The principal objective was to determine for each experiment how the operating procedures and modes of equipment onboard shuttle can be managed in real-time or near-real-time to enhance the quality of results. As part of this determination, the data and display devices that a man will need for real-time management are defined. The secondary objectives, as listed in the RFQ and technical proposal, were to: (1) determine what quantities are to be measured; (2) determine permissible background levels; (3) decide in what portions of space measurements are to be made; (4) estimate bit rates; (5) establish time-lines for operating the experiments on a mission or set of missions; and (6) determine the minimum set of hardware needed for real-time display. Experiment descriptions and requirements were written. The requirements of the various experiments are combined and a minimal set of joint requirements is defined.

  8. A switched systems approach to image-based estimation

    NASA Astrophysics Data System (ADS)

    Parikh, Anup

    With the advent of technological improvements in imaging systems and computational resources, as well as the development of image-based reconstruction techniques, it is necessary to understand algorithm performance when subject to real-world conditions. Specifically, this dissertation focuses on the stability and performance of a class of image-based observers in the presence of intermittent measurements, caused by, e.g., occlusions, limited FOV, feature tracking losses, communication losses, or finite frame rates. Observers or filters that are exponentially stable under persistent observability may have unbounded error growth during intermittent sensing, even while providing seemingly accurate state estimates. In Chapter 3, dwell time conditions are developed to guarantee state estimation error convergence to an ultimate bound for a class of observers while undergoing measurement loss. Bounds are developed on the unstable growth of the estimation errors during the periods when the object being tracked is not visible. A Lyapunov-based analysis for the switched system is performed to develop an inequality in terms of the duration of time the observer can view the moving object and the duration of time the object is out of the field of view. In Chapter 4, a motion model is used to predict the evolution of the states of the system while the object is not visible. This reduces the growth rate of the bounding function to an exponential and enables the use of traditional switched systems Lyapunov analysis techniques. The stability analysis results in an average dwell time condition to guarantee state error convergence with a known decay rate. In comparison with the results in Chapter 3, the estimation errors converge to zero rather than a ball, with relaxed switching conditions, at the cost of requiring additional information about the motion of the feature. In some applications, a motion model of the object may not be available. Numerous adaptive techniques have been developed to compensate for unknown parameters or functions in system dynamics; however, persistent excitation (PE) conditions are typically required to ensure parameter convergence, i.e., learning. Since the motion model is needed in the predictor, model learning is desired; however, PE is difficult to ensure a priori and infeasible to check online for nonlinear systems. Concurrent learning (CL) techniques have been developed to use recorded data and a relaxed excitation condition to ensure convergence. In CL, excitation is only required for a finite period of time, and the recorded data can be checked to determine if it is sufficiently rich. However, traditional CL requires knowledge of state derivatives, which are typically not measured and require extensive filter design and tuning to develop satisfactory estimates. In Chapter 5 of this dissertation, a novel formulation of CL is developed in terms of an integral (ICL), removing the need to estimate state derivatives while preserving parameter convergence properties. Using ICL, an estimator is developed in Chapter 6 for simultaneously estimating the pose of an object as well as learning a model of its motion for use in a predictor when the object is not visible. A switched systems analysis is provided to demonstrate the stability of the estimation and prediction with learning scheme. Dwell time conditions as well as excitation conditions are developed to ensure estimation errors converge to an arbitrarily small bound.
Experimental results are provided to illustrate the performance of each of the developed estimation schemes. The dissertation concludes with a discussion of the contributions and limitations of the developed techniques, as well as avenues for future extensions.
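
    The dwell-time reasoning in Chapter 3 can be made concrete with a small numerical sketch: assume a Lyapunov-like error bound that decays exponentially at one rate while the object is visible and grows at another rate while it is occluded; the bound contracts over each visible/occluded cycle when the visible dwell time is long enough. All rates and dwell times below are hypothetical placeholders, not values from the dissertation.

```python
import numpy as np

lam_on, lam_off = 2.0, 0.5   # assumed error-bound decay/growth rates (hypothetical)
T_on, T_off = 1.0, 1.5       # dwell times: object visible / occluded (hypothetical)

# Sufficient condition for the bound to contract over one visible/occluded cycle.
print("bound contracts each cycle:", lam_on * T_on > lam_off * T_off)

# Evolve the bound over repeated cycles to see convergence toward a small value.
V = 1.0
for _ in range(20):
    V *= np.exp(-lam_on * T_on)   # exponential decay while measurements arrive
    V *= np.exp(+lam_off * T_off) # exponential growth while the object is hidden
print("bound after 20 cycles: %.3e" % V)
```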

  9. 40 CFR 63.151 - Initial notification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... statement of the reasons additional time is needed and the date when the owner or operator first learned of... control technology or pollution prevention measure that will be used for each emission point included in... paragraph (i)(2)(ii) of this section. (iii) The estimated percent reduction if a control technology...

  10. Job-Sharing: Another Way to Work

    ERIC Educational Resources Information Center

    Rich, Les

    1978-01-01

    A permanent part-time work force estimated at sixteen to seventeen million is one of the fastest-growing segments of the work population. The article discusses and presents some examples of job sharing--two persons handling one job--as a means of increasing employment, avoiding layoffs, and meeting individual needs. (MF)

  11. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    USDA-ARS?s Scientific Manuscript database

    Aims: To simplify the determination of the nuclear condition of pathogenic Rhizoctonia, which currently must be performed either with two fluorescent dyes, which is more costly and time-consuming, or with a single fluorescent dye, which is less accurate. Methods and Results: A red primary ...

  12. Educational Play: Mathematics. Games and Activities To Stimulate Your Child in Mathematics.

    ERIC Educational Resources Information Center

    Valentine, Deborah

    This book, written for parents, presents short mathematics activities for use with young children. Most chapters contain an overview, educational objectives, needed materials, an estimate of initial time investment, introduction and preparation instructions, and activities. Activities are grouped as follows: kitchen math, calculators, measurement,…

  13. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  14. Modal parameter estimation and monitoring for on-line flight flutter analysis

    NASA Astrophysics Data System (ADS)

    Verboven, P.; Cauberghe, B.; Guillaume, P.; Vanlanduit, S.; Parloo, E.

    2004-05-01

    The clearance of the flight envelope of a new airplane by means of flight flutter testing is time consuming and expensive. The most common approach is to track the modal damping ratios during a number of flight conditions, and hence the accuracy of the damping estimates plays a crucial role. However, aircraft manufacturers desire to decrease the flight flutter testing time for practical, safety, and economic reasons by evolving from discrete flight test points to a more continuous flight test pattern. Therefore, this paper presents an approach that provides modal parameter estimation and monitoring for an aircraft with slowly time-varying structural behaviour, as observed during a faster and more continuous exploration of the flight envelope. The proposed identification approach estimates the modal parameters directly from input/output Fourier data. This avoids the need for an averaging-based pre-processing of the data, which becomes inapplicable when only short data records are measured. Instead of using a Hanning window to reduce the effects of leakage, these transient effects are modelled simultaneously with the dynamical behaviour of the airplane. The method is validated for the monitoring of the system poles during flight flutter testing.

  15. Low-signal, coronagraphic wavefront estimation with Kalman filtering in the high contrast imaging testbed

    NASA Astrophysics Data System (ADS)

    Riggs, A. J. Eldorado; Cady, Eric J.; Prada, Camilo M.; Kern, Brian D.; Zhou, Hanying; Kasdin, N. Jeremy; Groff, Tyler D.

    2016-07-01

    For direct imaging and spectral characterization of cold exoplanets in reflected light, the proposed Wide-Field Infrared Survey Telescope (WFIRST) Coronagraph Instrument (CGI) will carry two types of coronagraphs. The High Contrast Imaging Testbed (HCIT) at the Jet Propulsion Laboratory has been testing both coronagraph types and demonstrated their abilities to achieve high contrast. Focal plane wavefront correction is used to estimate and mitigate aberrations. As the most time-consuming part of correction during a space mission, the acquisition of probed images for electric field estimation needs to be as short as possible. We present results from the HCIT of narrowband, low-signal wavefront estimation tests using a shaped pupil Lyot coronagraph (SPLC) designed for the WFIRST CGI. In the low-flux regime, the Kalman filter and iterated extended Kalman filter provide faster correction, better achievable contrast, and more accurate estimates than batch process estimation.
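
    To make the estimation pattern concrete, the sketch below shows one generic discrete Kalman predict/update step of the kind the abstract describes; the actual HCIT state vector, probe model, and noise statistics are not reproduced, and all matrices are placeholders.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle for state x with covariance P and measurement z."""
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (z - H @ x)                       # update with probed measurement
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy usage: a 2-element "field" state observed directly, with placeholder noise.
x, P = np.zeros(2), np.eye(2)
F, Q, H, R = np.eye(2), 1e-4 * np.eye(2), np.eye(2), 1e-2 * np.eye(2)
for z in [np.array([0.30, -0.10]), np.array([0.28, -0.12])]:
    x, P = kalman_step(x, P, z, F, Q, H, R)
print(x)
```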

  16. An anti-disturbing real time pose estimation method and system

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Zhang, Xiao-hu

    2011-08-01

    Pose estimation relating two-dimensional (2D) images to a three-dimensional (3D) rigid object needs some known features to track. In practice, many algorithms perform this task with high accuracy, but all of them suffer when features are lost. This paper investigates pose estimation when some, or even all, of the known features are invisible. First, known features are tracked to calculate the pose in the current and the next image. Second, unknown but good features to track are automatically detected in the current and the next image. Third, those unknown features that lie on the rigid object and can be matched between the two images are retained. Because of the motion characteristics of the rigid object, the 3D information of those unknown features can be solved from the object's pose at the two moments and the features' 2D locations in the two images, except in two cases: the first is that the camera and object have no relative motion and the camera parameters, such as focal length and principal point, do not change between the two moments; the second is that there is no shared scene or no matched feature between the two images. Finally, because the formerly unknown features are now known, pose estimation can continue in the following images by repeating the process above, despite the loss of the originally known features. The robustness of pose estimation with different feature detection algorithms, such as Kanade-Lucas-Tomasi (KLT) features, the Scale Invariant Feature Transform (SIFT), and Speeded Up Robust Features (SURF), is compared, and the impact of different relative motions between the camera and the rigid object is discussed. Graphics Processing Unit (GPU) parallel computing is used to extract and match hundreds of features for real-time pose estimation, which is difficult on a Central Processing Unit (CPU). Compared with other pose estimation methods, the new method can estimate the pose between camera and object when some or even all known features are lost, and has a quick response time thanks to GPU parallel computing. The method can be widely used in vision-guided techniques to strengthen their intelligence and generalization, and can also play an important role in autonomous navigation and positioning and in robotics in unknown environments. Simulation and experimental results demonstrate that the proposed method suppresses noise effectively, extracts features robustly, and achieves real-time performance. Theoretical analysis and experiments show the method is reasonable and efficient.
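
    A minimal OpenCV sketch of the track-then-solve loop the abstract outlines (KLT optical flow followed by a PnP pose solve) is given below; the frames, intrinsics, and 2D-3D pairings are synthetic placeholders, so the recovered pose illustrates only the API flow, not the paper's algorithm.

```python
import numpy as np
import cv2

# Synthetic frame pair: a dot pattern and a shifted copy stand in for video.
rng = np.random.default_rng(0)
prev = np.zeros((240, 320), np.uint8)
for y, x in rng.integers(20, 220, size=(40, 2)):
    cv2.circle(prev, (int(x), int(y)), 3, 255, -1)
curr = np.roll(prev, shift=(2, 3), axis=(0, 1))       # apparent object motion

# Step 1: track features into the next image with KLT optical flow.
prev_pts = cv2.goodFeaturesToTrack(prev, maxCorners=100, qualityLevel=0.01, minDistance=7)
next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, prev_pts, None)
img_pts = next_pts[status.ravel() == 1].reshape(-1, 2)

# Step 2: with 2D-3D correspondences for known features, recover pose via PnP.
K = np.array([[800.0, 0, 160], [0, 800.0, 120], [0, 0, 1]])  # assumed intrinsics
obj_pts = rng.random((8, 3)).astype(np.float32)              # placeholder 3D model points
ok, rvec, tvec = cv2.solvePnP(obj_pts, np.ascontiguousarray(img_pts[:8]), K, None)
print("pose solve succeeded:", ok)
```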

  17. Noncoherent DTTLs for Symbol Synchronization

    NASA Technical Reports Server (NTRS)

    Simon, Marvin; Tkacenko, Andre

    2007-01-01

    Noncoherent data-transition tracking loops (DTTLs) have been proposed for use as symbol synchronizers in digital communication receivers. [Communication-receiver subsystems that can perform their assigned functions in the absence of synchronization with the phases of their carrier signals (carrier synchronization) are denoted by the term noncoherent, while receiver subsystems that cannot function without carrier synchronization are said to be coherent.] The proposal applies, more specifically, to receivers of binary phase-shift-keying (BPSK) signals generated by directly phase-modulating binary non-return-to-zero (NRZ) data streams onto carrier signals having known frequencies but unknown phases. The proposed noncoherent DTTLs would be modified versions of traditional DTTLs, which are coherent. The symbol-synchronization problem is essentially the problem of recovering symbol timing from a received signal. In the traditional, coherent approach to symbol synchronization, it is necessary to establish carrier synchronization in order to recover symbol timing. A traditional DTTL effects an iterative process in which it first generates an estimate of the carrier phase in the absence of symbol-synchronization information, then uses the carrier-phase estimate to obtain an estimate of the symbol-synchronization information, then feeds the symbol-synchronization estimate back to the carrier-phase-estimation subprocess. In a noncoherent symbol-synchronization process, there is no need for carrier synchronization and, hence, no need for iteration between carrier-synchronization and symbol-synchronization subprocesses. The proposed noncoherent symbol-synchronization process is justified theoretically by a mathematical derivation that starts from a maximum a posteriori (MAP) method of estimation of symbol timing utilized in traditional, coherent DTTLs. In that MAP method, one chooses the value of a variable of interest (in this case, the offset in the estimated symbol timing) that causes a likelihood function of symbol estimates over some number of symbol periods to assume a maximum value. In terms that are necessarily oversimplified to fit within the space available for this article, it can be said that the mathematical derivation involves a modified interpretation of the likelihood function that lends itself to noncoherent DTTLs. The proposal encompasses both linear and nonlinear noncoherent DTTLs. The performances of both have been computationally simulated; for comparison, the performances of linear and nonlinear coherent DTTLs have also been computationally simulated. The results of these simulations show that, among other things, the expected mean-square timing errors of coherent and noncoherent DTTLs are relatively insensitive to window width. The results also show that at high signal-to-noise ratios (SNRs), the performances of the noncoherent DTTLs approach those of their coherent counterparts, while at low SNRs, the noncoherent DTTLs incur penalties of the order of 1.5 to 2 dB.

  18. On the unified estimation of turbulence eddy dissipation rate using Doppler cloud radars and lidars: Radar and Lidar Turbulence Estimation

    DOE PAGES

    Borque, Paloma; Luke, Edward; Kollias, Pavlos

    2016-05-27

    Coincident profiling observations from Doppler lidars and radars are used to estimate the turbulence energy dissipation rate (ε) using three different data sources: (i) Doppler radar velocity (DRV), (ii) Doppler lidar velocity (DLV), and (iii) Doppler radar spectrum width (DRW) measurements. Likewise, the agreement between the derived ε estimates is examined at the cloud base height of stratiform warm clouds. Collocated ε estimates based on power spectra analysis of DRV and DLV measurements show good agreement (correlation coefficients of 0.86 and 0.78 for the two cases analyzed here) during both drizzling and nondrizzling conditions. This suggests that unified (below and above cloud base) time-height estimates of ε in cloud-topped boundary layer conditions can be produced. This also suggests that the eddy dissipation rate can be estimated throughout the cloud layer without the constraint that clouds need to be nonprecipitating. Eddy dissipation rate estimates based on DRW measurements compare well with the estimates based on Doppler velocity, but their performance deteriorates as precipitation-size particles are introduced into the radar volume and broaden the DRW values. Based on this finding, a methodology to estimate the Doppler spectral broadening due to the spread of the drop size distribution is presented. Furthermore, the uncertainties in ε introduced by signal-to-noise conditions, the estimation of the horizontal wind, the selection of the averaging time window, and the presence of precipitation are discussed in detail.
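
    One standard way to obtain ε from a Doppler velocity time series, consistent in spirit with the power-spectra analysis mentioned above, is to fit the inertial-subrange -5/3 law to the velocity spectrum under Taylor's frozen-turbulence hypothesis; a hedged sketch follows, with the Kolmogorov-type constant, frequency band, and advection speed as assumed placeholders, not the paper's processing.

```python
import numpy as np
from scipy.signal import welch

def dissipation_rate(v, fs, U, f_lo=0.05, f_hi=0.5, C=0.65):
    """Estimate epsilon [m^2 s^-3] from velocity v sampled at fs Hz."""
    f, S = welch(v, fs=fs, nperseg=min(len(v), 1024))
    band = (f >= f_lo) & (f <= f_hi)              # assumed inertial subrange
    k = 2 * np.pi * f[band] / U                   # wavenumber via Taylor hypothesis
    Sk = S[band] * U / (2 * np.pi)                # spectrum in wavenumber space
    # Inertial subrange: E(k) = C * eps^(2/3) * k^(-5/3)
    eps23 = np.median(Sk * k ** (5.0 / 3.0)) / C  # robust fit of the -5/3 level
    return eps23 ** 1.5

rng = np.random.default_rng(0)
v = rng.standard_normal(4096)                     # stand-in for Doppler velocities
print(dissipation_rate(v, fs=2.0, U=8.0))
```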

  20. Estimating local costs associated with Clostridium difficile infection using machine learning and electronic medical records

    PubMed Central

    Pak, Theodore R.; Chacko, Kieran; O’Donnell, Timothy; Huprikar, Shirish; van Bakel, Harm; Kasarskis, Andrew; Scott, Erick R.

    2018-01-01

    Background Reported per-patient costs of Clostridium difficile infection (CDI) vary by two orders of magnitude among different hospitals, implying that infection control officers need precise, local analyses to guide rational decision-making between interventions. Objective We sought to comprehensively estimate changes in length of stay (LOS) attributable to CDI at one urban tertiary-care facility using only data automatically extractable from the electronic medical record (EMR). Methods We performed a retrospective cohort study of 171,938 visits spanning a 7-year period. 23,968 variables were extracted from EMR data recorded within 24 hours of admission to train elastic net regularized logistic regression models for propensity score matching. To address time-dependent bias (reverse causation), we separately stratified comparisons by time of infection and fit multistate models. Results The estimated difference in median LOS for propensity-matched cohorts varied from 3.1 days (95% CI, 2.2–3.9) to 10.1 days (95% CI, 7.3–12.2) depending on the case definition; however, dependency of the estimate on time-to-infection was observed. Stratification by time to first positive toxin assay, excluding probable community-acquired infections, showed a minimum excess LOS of 3.1 days (95% CI, 1.7–4.4). Under the same case definition, the multistate model averaged an excess LOS of 3.3 days (95% CI, 2.6–4.0). Conclusions Two independent time-to-infection adjusted methods converged on similar excess LOS estimates. Changes in LOS can be extrapolated to marginal dollar costs by multiplying by the average cost of an inpatient-day. Infection control officers can leverage automatically extractable EMR data to estimate costs of CDI at their own institution. PMID:29103378
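
    A simplified sketch of the propensity-score step follows: an elastic-net-regularized logistic regression scores each visit, and cases are greedily matched to nearest-score controls. The features and labels are synthetic stand-ins for EMR data, and the study's exact model tuning is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 50))           # admission-time features (synthetic)
y = rng.binomial(1, 0.05, size=2000)          # 1 = CDI case (synthetic)

model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=1.0, max_iter=5000)
model.fit(X, y)
ps = model.predict_proba(X)[:, 1]             # propensity scores

# Greedy 1:1 nearest-neighbor matching on the propensity score.
cases, controls = np.where(y == 1)[0], np.where(y == 0)[0]
used, pairs = set(), []
for i in cases:
    j = min((c for c in controls if c not in used), key=lambda c: abs(ps[c] - ps[i]))
    used.add(j)
    pairs.append((i, j))
print(f"matched {len(pairs)} case-control pairs")
```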

  1. Method and apparatus for autonomous, in-receiver prediction of GNSS ephemerides

    NASA Technical Reports Server (NTRS)

    Bar-Sever, Yoaz E. (Inventor); Bertiger, William I. (Inventor)

    2012-01-01

    Methods and apparatus for autonomous in-receiver prediction of orbit and clock states of Global Navigation Satellite Systems (GNSS) are described. Only the GNSS broadcast message is used, without the need for periodic, externally communicated information. Earth orientation information is extracted from the GNSS broadcast ephemeris. With the accurate estimation of the Earth orientation parameters it is possible to propagate the best-fit GNSS orbits forward in time in an inertial reference frame. Using the estimated Earth orientation parameters, the predicted orbits are then transformed into Earth-Centered-Earth-Fixed (ECEF) coordinates to be used to assist the GNSS receiver in the acquisition of the signals. GNSS satellite clock states are also extracted from the broadcast ephemeris and a parameterized model of clock behavior is fit to that data. The estimated modeled clocks are then propagated forward in time to enable, together with the predicted orbits, quicker GNSS signal acquisition.
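
    The clock-modeling step can be illustrated with a toy polynomial fit: estimate bias, drift, and drift-rate terms from past clock offsets and propagate them forward. The data below are synthetic, and the invention's actual clock parameterization and orbit propagation are not reproduced.

```python
import numpy as np

t = np.arange(0.0, 2 * 3600.0, 30.0)                   # past 2 h of epochs [s]
clk = (1e-4 + 1e-9 * t + 2e-15 * t**2
       + 1e-10 * np.random.default_rng(1).standard_normal(t.size))  # offsets [s]

coeffs = np.polyfit(t, clk, deg=2)                     # drift-rate, drift, bias terms
t_future = t[-1] + np.arange(0.0, 3600.0, 30.0)        # predict 1 h ahead
clk_pred = np.polyval(coeffs, t_future)
print("predicted clock offset at +1 h: %.3e s" % clk_pred[-1])
```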

  2. Blind identification of nonlinear models with non-Gaussian inputs

    NASA Astrophysics Data System (ADS)

    Prakriya, Shankar; Pasupathy, Subbarayan; Hatzinakos, Dimitrios

    1995-12-01

    Some methods are proposed for the blind identification of finite-order discrete-time nonlinear models with non-Gaussian circular inputs. The nonlinear models consist of two finite-memory linear time-invariant (LTI) filters separated by a zero-memory nonlinearity (ZMNL) of the polynomial type (the LTI-ZMNL-LTI models). The linear subsystems are allowed to be of non-minimum phase (NMP). The methods base their estimates of the impulse responses on slices of the (N+1)th-order polyspectra of the output sequence. It is shown that the identification of LTI-ZMNL systems requires only a 1-D moment or polyspectral slice. The coefficients of the ZMNL are not estimated, and need not be known. The order of the nonlinearity can, in theory, be estimated from the received signal. These methods possess several noise and interference suppression characteristics, and have applications in modeling nonlinearly amplified QAM/QPSK signals in digital satellite and microwave communications.

  3. Using the MCPLXS Generator for Technology Transfer

    NASA Technical Reports Server (NTRS)

    Moore, Arlene A.; Dean, Edwin B.

    1987-01-01

    The objective of this paper is to acquaint you with some of the approaches we are taking at Langley to incorporate escalations (or de-escalations) of technology when modeling futuristic systems. Since we have a short turnaround between the time we receive enough descriptive information to start estimating the project and when the estimate is needed (the "we-want-it-yesterday syndrome"), creativity is often necessary. There is not much time available for tool development. It is expedient to use existing tools in an adaptive manner to model the situation at hand. Specifically, this paper describes the use of the RCA PRICE MCPLXS Generator to incorporate technology transfer and technology escalation in estimates for advanced space systems such as Shuttle II and NASA advanced technology vehicles. It is assumed that the reader is familiar with the RCA PRICE family of models as well as the RCA PRICE utility programs such as SCPLX, PARAM, PARASYN, and the MCPLXS Generator.

  4. [Detection of palliative care needs in an acute care hospital unit. Pilot study].

    PubMed

    Rodríguez-Calero, Miguel Ángel; Julià-Mora, Joana María; Prieto-Alomar, Araceli

    2016-01-01

    Previous to wider prevalence studies, we designed the present pilot study to assess concordance and the time invested in patient evaluations using a palliative care needs assessment tool. We also sought to estimate the prevalence of palliative care needs in an acute care hospital unit. A cross-sectional study was carried out: 4 researchers (2 doctors and 2 nurses) independently assessed all inpatients in an acute care hospital unit at Manacor Hospital, Mallorca (Spain), using the validated tool NECPAL CCOMS-ICO©, measuring the time invested in every case. Another researcher reviewed the clinical records to analyze the sample profile. Every researcher assessed 29 patients, 15 men and 14 women, mean age 74.03 ± 10.25 years. The 4-observer concordance was moderate (Kappa 0.5043), turning out to be higher between nurses. Mean time per patient evaluation was 1.9 to 7.72 minutes, depending on the researcher. The prevalence of palliative care needs was 23.28%. The moderate concordance points us towards multidisciplinary shared assessments as a method for future research. The average time invested in evaluations was less than 8 minutes; no previous publications were identified regarding this variable. More than 20% of inpatients of the acute care unit were in need of palliative care. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  5. PSA: A program to streamline orbit determination for launch support operations

    NASA Technical Reports Server (NTRS)

    Legerton, V. N.; Mottinger, N. A.

    1988-01-01

    An interactive, menu-driven computer program was written to streamline the orbit determination process during the critical launch support phase of a mission. Residing on a virtual memory minicomputer, this program retains in-core the quantities needed to obtain a least squares estimate of the spacecraft trajectory, with interactive displays to assist in rapid radio metric data evaluation. Menu-driven displays allow real-time filter and data strategy development. Graphical and tabular displays can be sent to a laser printer for analysis without exiting the program. Products generated by this program feed back to the main orbit determination program in order to further refine the estimate of the trajectory. The final estimate provides a spacecraft ephemeris which is transmitted to the mission control center and used for antenna pointing and frequency predict generation by the Deep Space Network. The development and implementation process of this program differs from that used for most other navigation software by allowing the users to check important operating features during development and have changes made as needed.

  6. Assessment of constraints on space shuttle launch rates

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The range of the number of annual STS flights with 4- and 5-orbiter fleets was estimated, and an overview of the capabilities needed to support annual rates of 24 and up was provided, with a survey of known constraints and emphasis on External Tank (ET) production requirements. Facility capability estimates are provided for ground turnaround, cargo handling, flight training and flight operations. Emphasizing the complexity of the STS systems and the R&D nature of present flight experience, it is concluded that the most prominent constraints in the early growth of the STS as an operational system may manifest themselves not as shortages of investment items such as the ET or SRB, but as an inability to provide timely repairs or replacement of flight system components needed to sustain launch rates.

  7. Water Residence Time estimation by 1D deconvolution in the form of an l2-regularized inverse problem with smoothness, positivity and causality constraints

    NASA Astrophysics Data System (ADS)

    Meresescu, Alina G.; Kowalski, Matthieu; Schmidt, Frédéric; Landais, François

    2018-06-01

    The Water Residence Time distribution is the equivalent of the impulse response of a linear system allowing the propagation of water through a medium, e.g. the propagation of rain water from the top of the mountain towards the aquifers. We consider the output aquifer levels as the convolution between the input rain levels and the Water Residence Time, starting with an initial aquifer base level. The estimation of Water Residence Time is important for a better understanding of hydro-bio-geochemical processes and mixing properties of wetlands used as filters in ecological applications, as well as protecting fresh water sources for wells from pollutants. Common methods of estimating the Water Residence Time focus on cross-correlation, parameter fitting and non-parametric deconvolution methods. Here we propose a 1D full-deconvolution, regularized, non-parametric inverse problem algorithm that enforces smoothness and uses constraints of causality and positivity to estimate the Water Residence Time curve. Compared to Bayesian non-parametric deconvolution approaches, it has a fast runtime per test case; compared to the popular and fast cross-correlation method, it produces a more precise Water Residence Time curve even in the case of noisy measurements. The algorithm needs only one regularization parameter to balance between smoothness of the Water Residence Time and accuracy of the reconstruction. We propose an approach on how to automatically find a suitable value of the regularization parameter from the input data only. Tests on real data illustrate the potential of this method to analyze hydrological datasets.
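
    A compact sketch of this kind of inversion: with the convolution written as y ≈ A h, where A is the lower-triangular (hence causal) matrix built from the rain input, smoothness enters as a second-difference penalty and positivity as a bound constraint. The series and regularization weight below are illustrative, not the paper's data or tuning.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
n, m = 200, 50
r = rng.gamma(2.0, 1.0, size=n)                    # synthetic rain input
h_true = np.exp(-np.arange(m) / 8.0); h_true /= h_true.sum()
A = toeplitz(r, np.zeros(m))                       # causal convolution matrix
y = A @ h_true + 0.05 * rng.standard_normal(n)     # noisy aquifer response

# Smoothness via a second-difference penalty; positivity via bounds.
D = np.diff(np.eye(m), n=2, axis=0)
lam = 1.0
A_aug = np.vstack([A, np.sqrt(lam) * D])
y_aug = np.concatenate([y, np.zeros(D.shape[0])])
h_est = lsq_linear(A_aug, y_aug, bounds=(0.0, np.inf)).x
print("recovered WRT peak at lag", int(h_est.argmax()))
```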

  8. Manpower studies for the United States. Part II. Demand for eye care. A public opinion poll based upon a Gallup poll survey.

    PubMed

    Reinecke, R D; Steinberg, T

    1981-04-01

    This is the second in the series of Ophthalmology Manpower Studies. Part I presented estimates of disease prevalence and incidence, the average amount of time required to care for such conditions, and based on that information, the total hours of ophthalmological services required to care for all the projected need in the population. Using different estimates of the average number of hours worked per year per ophthalmologist (based on a 35, 40 and 48 hours/week in patient care), estimates of the total number of ophthalmologists required were calculated. This method is basically similar to the method later adopted by the Graduate Medical Education National Advisory Committee (GMENAC) to arrive at estimates of hours of ophthalmological services required for 1990. However, instead of using all the need present in the population, the GMENAC panel chose to use an "adjusted-needs based" model as a compromise between total need and actual utilization, the former being an overestimation and the latter being an underestimation since it is in part a function of the barriers to medical care. Since some of these barriers to medical care include informational factors, as well as availability and accessibility, this study was undertaken to assess the utilization of these services and the adequacy of present ophthalmological manpower in the opinion of the consumer. Also, since the consumer's choice or behavior depends on being informed about the differences between optometrists and ophthalmologists, such knowledge was assessed and the responses further evaluated after explanatory statements were made to the responders.

  9. Time vs. Money: A Quantitative Evaluation of Monitoring Frequency vs. Monitoring Duration.

    PubMed

    McHugh, Thomas E; Kulkarni, Poonam R; Newell, Charles J

    2016-09-01

    The National Research Council has estimated that over 126,000 contaminated groundwater sites are unlikely to achieve low µg/L clean-up goals in the foreseeable future. At these sites, cost-effective, long-term monitoring schemes are needed in order to understand the long-term changes in contaminant concentrations. Current monitoring optimization schemes rely on site-specific evaluations to optimize groundwater monitoring frequency. However, when using linear regression to estimate the long-term zero-order or first-order contaminant attenuation rate, the effect of monitoring frequency and monitoring duration on the accuracy and confidence of the estimated attenuation rate is not site-specific. For a fixed number of monitoring events, doubling the time between monitoring events (e.g., changing from quarterly monitoring to semi-annual monitoring) will double the accuracy of the estimated attenuation rate. For a fixed monitoring frequency (e.g., semi-annual monitoring), increasing the number of monitoring events by 60% will double the accuracy of the estimated attenuation rate. Combining these two factors, doubling the time between monitoring events (e.g., quarterly monitoring to semi-annual monitoring) while decreasing the total number of monitoring events by 38% will result in no change in the accuracy of the estimated attenuation rate. However, the time required to collect this dataset will increase by 25%. Understanding that the trade-off between monitoring frequency and monitoring duration is not site-specific should simplify the process of optimizing groundwater monitoring frequency at contaminated groundwater sites. © 2016 The Authors. Groundwater published by Wiley Periodicals, Inc. on behalf of National Ground Water Association.
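
    These scalings follow from the ordinary-least-squares slope standard error, sigma / sqrt(sum((t_i - t_mean)^2)), which depends only on the sampling schedule and not on the site; the short check below reproduces the three claims numerically with an arbitrary noise level.

```python
import numpy as np

def slope_se(n_events, dt, sigma=1.0):
    """Standard error of an OLS slope for n equally spaced samples."""
    t = dt * np.arange(n_events)
    return sigma / np.sqrt(np.sum((t - t.mean()) ** 2))

base = slope_se(20, dt=0.25)                  # e.g., 20 quarterly events
print(slope_se(20, dt=0.5) / base)            # doubled spacing -> 0.5 (2x accuracy)
print(slope_se(32, dt=0.25) / base)           # +60% events     -> ~0.5 (2x accuracy)
print(slope_se(12, dt=0.5) / base)            # 2x spacing, -38% events -> ~1.0
```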

  10. Improving the precision of our ecosystem calipers: a modified morphometric technique for estimating marine mammal mass and body composition.

    PubMed

    Shero, Michelle R; Pearson, Linnea E; Costa, Daniel P; Burns, Jennifer M

    2014-01-01

    Mass and body composition are indices of overall animal health and energetic balance and are often used as indicators of resource availability in the environment. This study used morphometric models and isotopic dilution techniques, two commonly used methods in the marine mammal field, to assess body composition of Weddell seals (Leptonychotes weddellii, N = 111). Findings indicated that traditional morphometric models that use a series of circular, truncated cones to calculate marine mammal blubber volume and mass overestimated the animal's measured body mass by 26.9±1.5% SE. However, we developed a new morphometric model that uses elliptical truncated cones, and estimates mass with only -2.8±1.7% error (N = 10). Because this elliptical truncated cone model can estimate body mass without the need for additional correction factors, it has the potential to be a broadly applicable method in marine mammal species. While using elliptical truncated cones yielded significantly smaller blubber mass estimates than circular cones (10.2±0.8% difference; or 3.5±0.3% total body mass), both truncated cone models significantly underestimated total body lipid content as compared to isotopic dilution results, suggesting that animals have substantial internal lipid stores (N = 76). Multiple linear regressions were used to determine the minimum number of morphometric measurements needed to reliably estimate animal mass and body composition so that future animal handling times could be reduced. Reduced models estimated body mass and lipid mass with reasonable accuracy using fewer than five morphometric measurements (root-mean-square-error: 4.91% for body mass, 10.90% for lipid mass, and 10.43% for % lipid). This indicates that when test datasets are available to create calibration coefficients, regression models also offer a way to improve body mass and condition estimates in situations where animal handling times must be short and efficient.
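
    The geometric core of the new model can be sketched as the volume of a truncated cone with elliptical cross-sections, assuming semi-axes that taper linearly between girth measurement sites; this is one plausible formulation, and the published model's exact equations should be consulted before use.

```python
import numpy as np

def elliptical_frustum_volume(a1, b1, a2, b2, h):
    """Volume between two elliptical sections with semi-axes (a1,b1) and (a2,b2),
    assuming linear taper of each semi-axis over height h."""
    return np.pi * h * (2 * (a1 * b1 + a2 * b2) + a1 * b2 + a2 * b1) / 6.0

def circular_frustum_volume(r1, r2, h):
    """Classic circular truncated cone, for comparison."""
    return np.pi * h * (r1**2 + r1 * r2 + r2**2) / 3.0

# For a flattened (elliptical) body segment, the circular model tends to give a
# larger volume; semi-axes and heights below are illustrative, not seal data.
print(elliptical_frustum_volume(0.40, 0.25, 0.35, 0.22, h=0.5))
print(circular_frustum_volume(0.325, 0.285, h=0.5))   # mean-radius circular version
```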

  11. Potential for bias and low precision in molecular divergence time estimation of the Canopy of Life: an example from aquatic bird families

    PubMed Central

    van Tuinen, Marcel; Torres, Christopher R.

    2015-01-01

    Uncertainty in divergence time estimation is frequently studied from many angles but rarely from the perspective of phylogenetic node age. If appropriate molecular models and fossil priors are used, a multi-locus, partitioned analysis is expected to equally minimize error in accuracy and precision across all nodes of a given phylogeny. In contrast, if available models fail to completely account for rate heterogeneity, substitution saturation and incompleteness of the fossil record, uncertainty in divergence time estimation may increase with node age. While many studies have stressed this concern with regard to deep nodes in the Tree of Life, the inference that molecular divergence time estimation of shallow nodes is less sensitive to erroneous model choice has not been tested explicitly in a Bayesian framework. Because of available divergence time estimation methods that permit fossil priors across any phylogenetic node and the present increase in efficient, cheap collection of species-level genomic data, insight is needed into the performance of divergence time estimation of shallow (<10 MY) nodes. Here, we performed multiple sensitivity analyses in a multi-locus data set of aquatic birds with six fossil constraints. Comparison across divergence time analyses that varied taxon and locus sampling, number and position of fossil constraint and shape of prior distribution showed various insights. Deviation from node ages obtained from a reference analysis was generally highest for the shallowest nodes but determined more by temporal placement than number of fossil constraints. Calibration with only the shallowest nodes significantly underestimated the aquatic bird fossil record, indicating the presence of saturation. Although joint calibration with all six priors yielded ages most consistent with the fossil record, ages of shallow nodes were overestimated. This bias was found in both mtDNA and nDNA regions. Thus, divergence time estimation of shallow nodes may suffer from bias and low precision, even when appropriate fossil priors and best available substitution models are chosen. Much care must be taken to address the possible ramifications of substitution saturation across the entire Tree of Life. PMID:26106406

  12. Nutrition surveillance using a small open cohort: experience from Burkina Faso.

    PubMed

    Altmann, Mathias; Fermanian, Christophe; Jiao, Boshen; Altare, Chiara; Loada, Martin; Myatt, Mark

    2016-01-01

    Nutritional surveillance remains generally weak and early warning systems are needed in areas with a high burden of acute under-nutrition. In order to enhance insight into nutritional surveillance, a community-based sentinel sites approach, known as the Listening Posts (LP) Project, was piloted in Burkina Faso by Action Contre la Faim (ACF). This paper presents ACF's experience with the LP approach and investigates potential selection and observational biases. Six primary sampling units (PSUs) were selected in each livelihood zone using the centric systematic area sampling methodology. In each PSU, 22 children aged between 6 and 24 months were selected by proximity sampling. The prevalence of GAM for each month from January 2011 to December 2013 was estimated using a Bayesian normal-normal conjugate analysis followed by PROBIT estimation. To validate the LP approach in detecting changes over time, the time trends of MUAC from LP and from five cross-sectional surveys were modelled using polynomial regression and compared by using a Wald test. The differences between prevalence estimates from the two data sources were used to assess selection and observational biases. The 95% credible interval around GAM prevalence estimates using the LP approach ranged between +6.5%/-6.0% on a prevalence of 36.1% and +3.5%/-2.9% on a prevalence of 10.8%. LP and cross-sectional survey time trend models were well correlated (p = 0.6337). Although LP showed a slight but significant trend for GAM to decrease over time at a rate of -0.26%/visit, the prevalence estimates from the two data sources showed good agreement over a 3-year period. The LP methodology has proved to be valid in following trends of GAM prevalence for a period of 3 years without selection bias. However, a slight observational bias was observed, requiring a periodical reselection of the sentinel sites. This kind of surveillance project is suited to use in areas with a high burden of acute under-nutrition where early warning systems are strongly needed. Advocacy is necessary to develop a sustainable nutrition surveillance system and to support the use of surveillance data in guiding nutritional programs.

  13. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    NASA Astrophysics Data System (ADS)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear; however, the regression coefficients corresponding with the explanatory variables may be allowed to be time dependent in order to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of the model is rather poor, and possible explanations are discussed.
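
    A present-day sketch of the trend-plus-regression idea, fit by a Kalman filter via statsmodels' structural time series class, is shown below; the temperature series and regressors are synthetic placeholders, and this is not the authors' exact model specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 140                                            # e.g., annual values
soi = rng.standard_normal(n)                       # stand-ins for SOI, VDI, GHG
vdi = rng.standard_normal(n)
ghg = np.linspace(0.0, 0.6, n)
trend = np.cumsum(0.005 + 0.02 * rng.standard_normal(n))
y = trend - 0.10 * soi - 0.03 * vdi + 1.0 * ghg + 0.1 * rng.standard_normal(n)

X = np.column_stack([soi, vdi, ghg])
model = sm.tsa.UnobservedComponents(y, level="local linear trend", exog=X)
res = model.fit(disp=False)
print(res.params)                                  # variances + regression betas
trend_est = res.smoothed_state[0]                  # Kalman-smoothed stochastic trend
print(trend_est[-5:])
```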

  14. The mental health workforce gap in low- and middle-income countries: a needs-based approach

    PubMed Central

    Scheffler, Richard M; Shen, Gordon; Yoon, Jangho; Chisholm, Dan; Morris, Jodi; Fulton, Brent D; Dal Poz, Mario R; Saxena, Shekhar

    2011-01-01

    Abstract Objective To estimate the shortage of mental health professionals in low- and middle-income countries (LMICs). Methods We used data from the World Health Organization’s Assessment Instrument for Mental Health Systems (WHO-AIMS) from 58 LMICs, country-specific information on the burden of various mental disorders and a hypothetical core service delivery package to estimate how many psychiatrists, nurses and psychosocial care providers would be needed to provide mental health care to the total population of the countries studied. We focused on the following eight problems, to which WHO has attached priority: depression, schizophrenia, psychoses other than schizophrenia, suicide, epilepsy, dementia, disorders related to the use of alcohol and illicit drugs, and paediatric mental disorders. Findings All low-income countries and 59% of the middle-income countries in our sample were found to have far fewer professionals than they need to deliver a core set of mental health interventions. The 58 LMICs sampled would need to increase their total mental health workforce by 239 000 full-time equivalent professionals to address the current shortage. Conclusion Country-specific policies are needed to overcome the large shortage of mental health-care staff and services throughout LMICs. PMID:21379414

  15. Radar Imaging Using The Wigner-Ville Distribution

    NASA Astrophysics Data System (ADS)

    Boashash, Boualem; Kenny, Owen P.; Whitehouse, Harper J.

    1989-12-01

    The need for analysis of time-varying signals has led to the formulation of a class of joint time-frequency distributions (TFDs). One of these TFDs, the Wigner-Ville distribution (WVD), has useful properties which can be applied to radar imaging. This paper first discusses the radar equation in terms of the time-frequency representation of the signal received from a radar system. It then presents a method of tomographic reconstruction for time-frequency images to estimate the scattering function of the aircraft. An optical architecture is then discussed for the real-time implementation of the analysis method based on the WVD.
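
    A compact discrete implementation of the WVD makes the distribution concrete; the sketch below uses the analytic signal to reduce cross-terms and omits radar-specific scaling and the tomographic reconstruction step described in the paper.

```python
import numpy as np
from scipy.signal import hilbert

def wvd(x):
    """Discrete WVD: FFT over lag m of the kernel x[n+m] * conj(x[n-m])."""
    z = hilbert(np.asarray(x, dtype=float))     # analytic signal
    n_samp = len(z)
    half = n_samp // 2
    W = np.zeros((n_samp, n_samp))
    for n in range(n_samp):
        max_m = min(n, n_samp - 1 - n, half - 1)
        m = np.arange(-max_m, max_m + 1)
        kernel = np.zeros(n_samp, dtype=complex)
        kernel[m % n_samp] = z[n + m] * np.conj(z[n - m])
        W[n] = np.fft.fft(kernel).real
    return W

# Linear chirp: its WVD concentrates along the instantaneous frequency.
t = np.arange(256) / 256.0
x = np.cos(2 * np.pi * (20 * t + 40 * t**2))
print(wvd(x).shape)
```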

  16. Temporal rainfall estimation using input data reduction and model inversion

    NASA Astrophysics Data System (ADS)

    Wright, A. J.; Vrugt, J. A.; Walker, J. P.; Pauwels, V. R. N.

    2016-12-01

    Floods are devastating natural hazards. To provide accurate, precise and timely flood forecasts there is a need to understand the uncertainties associated with temporal rainfall and model parameters. The estimation of temporal rainfall and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows the uncertainty of rainfall input to be considered when estimating model parameters, and provides the ability to estimate rainfall for poorly gauged catchments. Current methods to estimate temporal rainfall distributions from streamflow are unable to adequately explain and invert complex non-linear hydrologic systems. This study uses the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia. The reduction of rainfall to DWT coefficients allows the input rainfall time series to be estimated simultaneously along with the model parameters. The estimation process is conducted using multi-chain Markov chain Monte Carlo simulation with the DREAMZS algorithm. The use of a likelihood function that considers both rainfall and streamflow error allows model parameter and temporal rainfall distributions to be estimated. Estimating the wavelet approximation coefficients of lower-order decomposition structures yielded the most realistic temporal rainfall distributions, and these rainfall estimates all simulated streamflow superior to the results of a traditional calibration approach. It is shown that the choice of wavelet has a considerable impact on the robustness of the inversion. The results demonstrate that streamflow data contain sufficient information to estimate temporal rainfall and model parameter distributions. The extent and variance of the rainfall time series that are able to simulate streamflow superior to that simulated by a traditional calibration approach is a demonstration of equifinality. The use of a likelihood function that considers both rainfall and streamflow error, combined with the use of the DWT as a data reduction technique, allows the joint inference of hydrologic model parameters along with rainfall.
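
    The data-reduction step can be sketched with PyWavelets: represent the rainfall series by the approximation coefficients of a low-order DWT and invert. The synthetic series below stands in for a hyetograph; in the study the retained coefficients, rather than the raw series, are the quantities sampled by MCMC.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
rain = np.maximum(0.0, rng.gamma(0.3, 4.0, size=512) - 1.0)   # spiky synthetic rain

coeffs = pywt.wavedec(rain, "db4", level=4)        # [cA4, cD4, cD3, cD2, cD1]
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
rain_reduced = pywt.waverec(approx_only, "db4")    # rainfall from cA4 alone

print("parameters reduced from", rain.size, "to", coeffs[0].size)
print("reconstruction RMSE:", np.sqrt(np.mean((rain - rain_reduced[:rain.size])**2)))
```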

  17. A sensor-based energy balance method for the distributed estimation of evaporation over the North American Great Lakes

    NASA Astrophysics Data System (ADS)

    Fries, K. J.; Kerkez, B.; Gronewold, A.; Lenters, J. D.

    2014-12-01

    We introduce a novel energy balance method to estimate evaporation across large lakes using real-time data from moored buoys and mobile, satellite-tracked drifters. Our work is motivated by the need to improve our understanding of the water balance of the Laurentian Great Lakes basin, a complex hydrologic system that comprises 90% of the United States' and 20% of the world's fresh surface water. Recently, the lakes experienced record-setting water level drops despite above-average precipitation, and given that lake surface area comprises nearly one third of the entire basin, evaporation is suspected to be the primary driver behind the decrease in water levels. There has historically been a need to measure evaporation over the Great Lakes, and recent hydrological phenomena (including not only record low levels, but also extreme changes in ice cover and surface water temperatures) underscore the urgency of addressing that need. Our method tracks the energy fluxes of the lake system - namely net radiation, heat storage and advection, and the Bowen ratio. By measuring each of these energy budget terms and combining the results with mass-transfer based estimates, we can calculate real-time evaporation rates on sub-hourly timescales. To mitigate the cost-prohibitive nature of large-scale, distributed energy flux measurements, we present a novel approach in which we leverage existing investments in seasonal buoys (which, while providing intensive, high quality data, are costly and sparsely distributed across the surface of the Great Lakes) and then integrate data from less costly satellite-tracked drifters. The result is an unprecedented, hierarchical sensor and modeling architecture that can be used to derive estimates of evaporation in real time through cloud-based computing. We discuss recent deployments of sensor-equipped buoys and drifters, which are beginning to provide us with some of the first in situ measurements of overlake evaporation from Earth's largest lake system, opening up the potential for improved and integrated monitoring and modeling of the Great Lakes water budget.
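
    The energy-balance bookkeeping reduces to a short formula: with net radiation Rn, heat storage change G, advection A, and Bowen ratio B, the latent heat flux is LE = (Rn - G - A) / (1 + B), and evaporation follows on dividing by the latent heat of vaporization. The sketch below uses illustrative values, not Great Lakes observations.

```python
def evaporation_rate(Rn, G, A, bowen, Lv=2.45e6, rho_w=1000.0):
    """Return evaporation in mm/day from energy fluxes in W/m^2."""
    LE = (Rn - G - A) / (1.0 + bowen)        # latent heat flux [W/m^2]
    e_ms = LE / (rho_w * Lv)                 # evaporation [m/s]
    return e_ms * 1000.0 * 86400.0           # convert to mm/day

# Example: strong fall cooling releases stored heat (hypothetical numbers).
print(evaporation_rate(Rn=50.0, G=-150.0, A=5.0, bowen=0.4))
```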

  18. Improving The Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility. This paper presents the plans for the newly established role. Described is how the Independent Program Assessment Office, working with all NASA Centers, NASA Headquarters, other Government agencies, and industry, is focused on creating cost estimation and analysis as a professional discipline that will be recognized equally with the technical disciplines needed to design new space and aeronautics activities. Investments in selected, new analysis tools, creating advanced training opportunities for analysts, and developing career paths for future analysts engaged in the discipline are all elements of the plan. Plans also include increasing the human resources available to conduct independent cost analysis of Agency programs during their formulation, to improve near-term capability to conduct economic cost-benefit assessments, to support NASA management's decision process, and to provide cost analysis results emphasizing "full-cost" and "full-life cycle" considerations. The Agency cost analysis improvement plan has been approved for implementation starting this calendar year. Adequate financial and human resources are being made available to accomplish the goals of this important effort, and all indications are that NASA's cost estimation and analysis core competencies will be substantially improved within the foreseeable future.

  19. Internal displacement and the Syrian crisis: an analysis of trends from 2011-2014.

    PubMed

    Doocy, Shannon; Lyles, Emily; Delbiso, Tefera D; Robinson, Courtland W

    2015-01-01

    Since the start of the Syrian crisis in 2011, civil unrest and armed conflict in the country have resulted in a rapidly increasing number of people displaced both within and outside of Syria. Those displaced face immense challenges in meeting their basic needs. This study sought to characterize internal displacement in Syria, including trends in both time and place, and to provide insights on the association between displacement and selected measures of household well-being and humanitarian needs. This study presents findings from two complementary methods: a desk review of displaced population estimates and movements and a needs assessment of 3930 Syrian households affected by the crisis. The first method, a desk review of displaced population estimates and movements, provides a retrospective analysis of national trends in displacement from March 2011 through June 2014. The second method, analysis of findings from a 2014 needs assessment by displacement status, provides insight into the displaced population and the association between displacement and humanitarian needs. Findings indicate that while displacement often corresponds to conflict levels, such trends were not uniformly observed in governorate-level analysis. Governorate-level IDP estimates do not provide information on a scale detailed enough to adequately plan humanitarian assistance. Furthermore, such estimates are often influenced by obstructed access to certain areas, unsubstantiated reports, and substantial discrepancies in reporting. Secondary displacement is not consistently reported across sources, nor are additional details about displacement, including whether displaced individuals originated within the current governorate or outside of it. More than half (56.4%) of households reported being displaced more than once, with a majority displaced for more than one year (73.3%). Some differences between displaced and non-displaced populations were observed in residence crowding, food consumption, health access, and education. Differences in reported living conditions and key health, nutrition, and education indicators between displaced and non-displaced populations indicate a need to better understand migration trends in order to inform planning and provision of life-saving humanitarian assistance.

  20. Estimation of Staphylococcus aureus growth parameters from turbidity data: characterization of strain variation and comparison of methods.

    PubMed

    Lindqvist, R

    2006-07-01

    Turbidity methods offer possibilities for generating data required for addressing microorganism variability in risk modeling given that the results of these methods correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters based on turbidity data and use of a Bioscreen instrument and to characterize variability in growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17 degrees C estimated by time to detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time to detection methods were selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or lognormal distribution, but definitive conclusions require a larger data set. The distribution of the physiological state parameter ranged from 0.01 to 0.92 and was not significantly different from a normal distribution. Strain variability was important, and the coefficient of variation of growth parameters was up to six times larger among strains than within strains. It is suggested to apply a time to detection (ANOVA) approach using turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider implications of strain variability for predictive modeling and risk assessment.
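
    The serial-dilution time-to-detection idea admits a two-line derivation: under exponential growth, N_det = N0 * exp(mu * TTD), so detection time is linear in ln(N0) with slope -1/mu; regressing detection times from a dilution series therefore yields the maximum specific growth rate. The sketch below uses synthetic data and an assumed detection threshold, not the study's measurements.

```python
import numpy as np

mu_true = 0.35                                   # 1/h, assumed growth rate
dilution_steps = np.arange(6)                    # 10-fold serial dilutions
ln_N0 = np.log(1e6) - dilution_steps * np.log(10.0)
ttd = (np.log(1e8) - ln_N0) / mu_true            # hours to reach detection level
ttd += 0.05 * np.random.default_rng(1).standard_normal(ttd.size)

slope, _ = np.polyfit(ln_N0, ttd, 1)             # slope = -1/mu
print("estimated mu_max: %.3f 1/h" % (-1.0 / slope))
```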

  1. Improving slowness estimate stability and visualization using limited sensor pair correlation on seismic arrays

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Näsholm, S. P.; Ruigrok, E.; Kværna, T.

    2018-04-01

    Seismic arrays enhance signal detection and parameter estimation by exploiting the time-delays between arriving signals on sensors at nearby locations. Parameter estimates can suffer due to both signal incoherence, with diminished waveform similarity between sensors, and aberration, with time-delays between coherent waveforms poorly represented by the wave-front model. Sensor-to-sensor correlation approaches to parameter estimation have an advantage over direct beamforming approaches in that individual sensor-pairs can be omitted without necessarily omitting entirely the data from each of the sensors involved. Specifically, we can omit correlations between sensors for which signal coherence in an optimal frequency band is anticipated to be poor or for which anomalous time-delays are anticipated. In practice, this usually means omitting correlations between more distant sensors. We present examples from International Monitoring System seismic arrays with poor parameter estimates resulting when classical f-k analysis is performed over the full array aperture. We demonstrate improved estimates and slowness grid displays using correlation beamforming restricted to correlations between sufficiently closely spaced sensors. This limited sensor-pair correlation (LSPC) approach has lower slowness resolution than would ideally be obtained by considering all sensor-pairs. However, this ideal estimate may be unattainable due to incoherence and/or aberration and the LSPC estimate can often exploit all channels, with the associated noise-suppression, while mitigating the complications arising from correlations between very distant sensors. The greatest need for the method is for short-period signals on large aperture arrays although we also demonstrate significant improvement for secondary regional phases on a small aperture array. LSPC can also provide a robust and flexible approach to parameter estimation on three-component seismic arrays.
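
    A simplified numerical sketch of the LSPC idea follows: beam power over a slowness grid is accumulated from pairwise cross-products, summing only pairs whose separation falls below a cutoff. The array geometry, wavelet, and cutoff are synthetic assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, nsamp = 40.0, 512
xy = rng.uniform(-10.0, 10.0, size=(9, 2))         # sensor coordinates [km]
s_true = np.array([0.06, -0.08])                   # true slowness [s/km]

t = np.arange(nsamp) / fs
wavelet = np.exp(-((t - 6.0) ** 2) / 0.05) * np.sin(2 * np.pi * 4.0 * (t - 6.0))
data = np.array([np.interp(t - xy[i] @ s_true, t, wavelet) for i in range(len(xy))])
data += 0.1 * rng.standard_normal(data.shape)      # additive sensor noise

def lspc_power(data, xy, sx, sy, fs, max_sep=np.inf):
    power = 0.0
    for i in range(len(xy)):
        for j in range(i + 1, len(xy)):
            if np.linalg.norm(xy[i] - xy[j]) > max_sep:
                continue                            # omit distant, incoherent pairs
            lag = int(round(((xy[j] - xy[i]) @ np.array([sx, sy])) * fs))
            power += np.sum(np.roll(data[j], -lag) * data[i])
    return power

grid = np.linspace(-0.15, 0.15, 31)
P = np.array([[lspc_power(data, xy, sx, sy, fs, max_sep=12.0) for sx in grid]
              for sy in grid])
iy, ix = np.unravel_index(P.argmax(), P.shape)
print("estimated slowness:", grid[ix], grid[iy])
```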

  2. Can differentiated care models solve the crisis in HIV treatment financing? Analysis of prospects for 38 countries in sub-Saharan Africa.

    PubMed

    Barker, Catherine; Dutta, Arin; Klein, Kate

    2017-07-21

    Rapid scale-up of antiretroviral therapy (ART) in the context of financial and health system constraints has resulted in calls to maximize efficiency in ART service delivery. Adopting differentiated care models (DCMs) for ART could potentially be more cost-efficient and improve outcomes. However, no study comprehensively projects the cost savings across countries. We model the potential reduction in facility-level costs and number of health workers needed when implementing two types of DCMs while attempting to reach 90-90-90 targets in 38 sub-Saharan African countries from 2016 to 2020. We estimated the costs of three service delivery models: (1) undifferentiated care, (2) differentiated care by patient age and stability, and (3) differentiated care by patient age, stability, key vs. general population status, and urban vs. rural location. Frequency of facility visits, type and frequency of laboratory testing, and coverage of community ART support vary by patient subgroup. For each model, we estimated the total costs of antiretroviral drugs, laboratory commodities, and facility-level personnel and overhead. Certain groups under four-criteria differentiation require more intensive inputs. Community-based ART costs were included in the DCMs. We take into account underlying uncertainty in the projected numbers on ART and unit costs. Total five-year facility-based ART costs for undifferentiated care are estimated to be US$23.33 billion (95% confidence interval [CI]: $23.3-$23.5 billion). An estimated 17.5% (95% CI: 17.4%-17.7%) and 16.8% (95% CI: 16.7%-17.0%) could be saved from 2016 to 2020 from implementing the age and stability DCM and four-criteria DCM, respectively, with annual cost savings increasing over time. DCMs decrease the full-time equivalent (FTE) health workforce requirements for ART. An estimated 46.4% (95% CI: 46.1%-46.7%) fewer FTE health workers are needed in 2020 for the age and stability DCM compared with undifferentiated care. Adopting DCMs can result in significant efficiency gains in terms of reduced costs and health workforce needs, even with the costs of scaling up community-based ART support under DCMs. Efficiency gains remained flat with increased differentiation. More evidence is needed on how to translate analyzed efficiency gains into implemented cost reductions at the facility level.

  3. Using Diurnal Temperature Signals to Infer Vertical Groundwater-Surface Water Exchange.

    PubMed

    Irvine, Dylan J; Briggs, Martin A; Lautz, Laura K; Gordon, Ryan P; McKenzie, Jeffrey M; Cartwright, Ian

    2017-01-01

    Heat is a powerful tracer to quantify fluid exchange between surface water and groundwater. Temperature time series can be used to estimate pore water fluid flux, and techniques can be employed to extend these estimates to produce detailed plan-view flux maps. Key advantages of heat tracing include cost-effective sensors and ease of data collection and interpretation, without the need for expensive and time-consuming laboratory analyses or induced tracers. While the collection of temperature data in saturated sediments is relatively straightforward, several factors influence the reliability of flux estimates that are based on time series analysis (diurnal signals) of recorded temperatures. Sensor resolution and deployment are particularly important in obtaining robust flux estimates in upwelling conditions. Also, processing temperature time series data involves a sequence of complex steps, including filtering temperature signals, selection of appropriate thermal parameters, and selection of the optimal analytical solution for modeling. This review provides a synthesis of heat tracing using diurnal temperature oscillations, including details on optimal sensor selection and deployment, data processing, model parameterization, and an overview of computing tools available. Recent advances in diurnal temperature methods also provide the opportunity to determine local saturated thermal diffusivity, which can improve the accuracy of fluid flux modeling and sensor spacing, which is related to streambed scour and deposition. These parameters can also be used to determine the reliability of flux estimates from the use of heat as a tracer. © 2016, National Ground Water Association.
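    As a minimal sketch of the amplitude-ratio branch of these methods (following the widely used Hatch-type solution), the snippet below solves the implicit equation relating the diurnal amplitude ratio between two sensor depths to the thermal front velocity; all parameter values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    def thermal_front_velocity(Ar, dz, kappa_e, P=86400.0):
        """Solve the implicit amplitude-ratio equation (Hatch-type) for the
        thermal front velocity v in m/s (positive downward).

        Ar: amplitude ratio A_deep/A_shallow of the diurnal signal,
        dz: sensor spacing (m), kappa_e: effective thermal diffusivity (m^2/s),
        P: signal period (s); one day for diurnal oscillations.
        """
        def f(v):
            alpha = np.sqrt(v**4 + (8.0 * np.pi * kappa_e / P) ** 2)
            return v - (2.0 * kappa_e / dz) * np.log(Ar) \
                     - np.sqrt((alpha + v**2) / 2.0)
        return fsolve(f, x0=1e-6)[0]

    # Illustrative numbers: Ar = 0.35 across 0.10 m of streambed sediment.
    v = thermal_front_velocity(0.35, 0.10, 7.5e-7)
    # Thermal front velocity -> Darcy flux via heat capacity ratio (assumed):
    q = v * 2.9e6 / 4.18e6   # volumetric heat capacities in J/m^3/K
    ```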

  4. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    PubMed

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload expressed in time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematic functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally. © 2016 John Wiley & Sons Australia, Ltd.
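    The tool's underlying computation (average minutes per core activity multiplied by how often each activity occurs in a given trial, then summed) can be sketched as follows; the activity names and timings here are hypothetical, not the published NTRCT-AT values.

    ```python
    # Hypothetical average minutes per core activity, illustrative only.
    AVG_MINUTES = {
        "informed consent": 45,
        "drug administration": 20,
        "adverse event reporting": 30,
        "data entry per visit": 25,
    }

    def trial_nursing_hours(counts):
        """counts: how many times each core activity occurs over the trial."""
        return sum(AVG_MINUTES[a] * n for a, n in counts.items()) / 60.0

    hours = trial_nursing_hours({
        "informed consent": 12,        # 12 patients enrolled
        "drug administration": 96,     # 8 dosing visits per patient
        "adverse event reporting": 10,
        "data entry per visit": 96,
    })
    print(f"estimated nursing workload: {hours:.0f} h")
    ```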

  5. Characterizing and minimizing the effects of noise in tide gauge time series: relative and geocentric sea level rise around Australia

    NASA Astrophysics Data System (ADS)

    Burgette, Reed J.; Watson, Christopher S.; Church, John A.; White, Neil J.; Tregoning, Paul; Coleman, Richard

    2013-08-01

    We quantify the rate of sea level rise around the Australian continent from an analysis of tide gauge and Global Positioning System (GPS) data sets. To estimate the underlying linear rates of sea level change in the presence of significant interannual and decadal variability (treated here as noise), we adopt and extend a novel network adjustment approach. We simultaneously estimate time-correlated noise as well as linear model parameters and realistic uncertainties from sea level time series at individual gauges, as well as from time-series differences computed between pairs of gauges. The noise content at individual gauges is consistent with a combination of white and time-correlated noise. We find that the noise in time series from the western coast of Australia is best described by a first-order Gauss-Markov model, whereas east coast stations generally exhibit lower levels of time-correlated noise that is better described by a power-law process. These findings suggest several decades of monthly tide gauge data are needed to reduce rate uncertainties to <0.5 mm yr⁻¹ for undifferenced single-site time series with typical noise characteristics. Our subsequent adjustment strategy exploits the more precise differential rates estimated from differenced time series from pairs of tide gauges to estimate rates among the network of 43 tide gauges that passed a stability analysis. We estimate relative sea level rates over three temporal windows (1900-2011, 1966-2011 and 1993-2011), accounting for covariance between time series. The resultant adjustment reduces the rate uncertainty across individual gauges, and partially mitigates the need for century-scale time series at all sites in the network. Our adjustment reveals a spatially coherent pattern of sea level rise around the coastline, with the highest rates in northern Australia. Over the time periods beginning in 1900, 1966 and 1993, we find weighted average rates of sea level rise of 1.4 ± 0.6, 1.7 ± 0.6 and 4.6 ± 0.8 mm yr⁻¹, respectively. While the temporal pattern of the rate estimates is consistent with acceleration in sea level rise, it may not be significant, as the uncertainties for the shorter analysis periods may not capture the full range of temporal variation. Analysis of the available continuous GPS records that have been collected within 80 km of Australian tide gauges suggests that rates of vertical crustal motion are generally low, with the majority of sites showing motion statistically indistinguishable from zero. A notable exception is the significant component of vertical land motion that contributes to the rapid rate of relative sea level change (>4 mm yr⁻¹) at the Hillarys site in the Perth area. This corresponds to crustal subsidence that we estimate in our GPS analysis at a rate of -3.1 ± 0.7 mm yr⁻¹, and appears linked to groundwater withdrawal. Uncertainties on the rates of vertical displacement at GPS sites collected over a decade are similar to what we measure in several decades of tide gauge data. Our results motivate continued observations of relative sea level using tide gauges, maintained with high-accuracy terrestrial and continuous co-located satellite-based surveying.
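    A minimal sketch of the differencing idea, assuming monthly series from two nearby gauges: fit a linear rate and inflate the formal uncertainty for lag-1 autocorrelation. The paper's full treatment estimates Gauss-Markov and power-law noise models, which this simple AR(1) correction only approximates.

    ```python
    import numpy as np

    def rate_with_ar1_uncertainty(t_years, y_mm):
        """OLS sea level trend with uncertainty inflated for lag-1
        autocorrelation via the standard AR(1) variance inflation factor."""
        A = np.vstack([t_years, np.ones_like(t_years)]).T
        coef, *_ = np.linalg.lstsq(A, y_mm, rcond=None)
        resid = y_mm - A @ coef
        r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
        sigma2 = resid @ resid / (len(y_mm) - 2)
        var_rate = sigma2 * np.linalg.inv(A.T @ A)[0, 0]
        return coef[0], np.sqrt(var_rate * (1 + r1) / (1 - r1))

    # Differencing two nearby gauges cancels shared interannual "noise", so
    # the differential rate is better determined than either absolute rate:
    # rate_diff, sd = rate_with_ar1_uncertainty(t, gauge_a - gauge_b)
    ```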

  6. Setting the scene for SWOT: global maps of river reach hydrodynamic variables

    NASA Astrophysics Data System (ADS)

    Schumann, Guy J.-P.; Durand, Michael; Pavelsky, Tamlin; Lion, Christine; Allen, George

    2017-04-01

    Credible and reliable characterization of discharge from the Surface Water and Ocean Topography (SWOT) mission using the Manning-based algorithms needs a prior estimate constraining reach-scale channel roughness, base flow and river bathymetry. In some places, any one of those variables may exist locally or even regionally as a measurement, often only at a station, or sometimes as a basin-wide model estimate. However, to date none of those exist at the scale required for SWOT and thus need to be mapped at a continental scale. The prior estimates will be employed for producing initial discharge estimates, which will be used as starting guesses for the various Manning-based algorithms, to be refined using the SWOT measurements themselves. A multitude of reach-scale variables were derived, including Landsat-based width, SRTM slope and accumulation area. As a possible starting point for building the prior database of low flow, river bathymetry and channel roughness estimates, we employed a variety of sources, including data from all GRDC records, simulations from long-term runs of the global water balance model (WBM), and reach-based calculations from hydraulic geometry relationships as well as Manning's equation. Here, we present the first global maps of this prior database with some initial validation, caveats and prospective uses.
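    The role of the prior variables is easiest to see in Manning's equation itself, from which an initial discharge guess follows once roughness, bathymetry (depth), width and slope are assigned to a reach; the values below are illustrative only.

    ```python
    import numpy as np

    def manning_discharge(n, width, depth, slope):
        """Initial discharge guess (m^3/s) from Manning's equation for a
        rectangular channel: Q = (1/n) * A * R^(2/3) * sqrt(S)."""
        area = width * depth
        radius = area / (width + 2.0 * depth)   # hydraulic radius
        return (1.0 / n) * area * radius ** (2.0 / 3.0) * np.sqrt(slope)

    # Landsat-based width, SRTM slope, assumed roughness and effective depth.
    q0 = manning_discharge(n=0.035, width=220.0, depth=3.1, slope=8e-5)
    ```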

  7. Estimated Financing Amount Needed for Essential Medicines in China, 2014.

    PubMed

    Xu, Wei; Xu, Zheng-Yuan; Cai, Gong-Jie; Kuo, Chiao-Yun; Li, Jing; Huang, Yi-Syuan

    2016-03-20

    At the present time, the government is considering establishing an independent financing system for essential medicines (EMs). However, it is still in the exploration phase. The objectives of this study were to estimate the financing amount needed for EMs in China in 2014 and to provide data evidence for establishing a financing mechanism for EMs. Two approaches were adopted in this study. First, we used a retrospective approach to estimate the cost of EMs in China in 2014. We identified all 520 drugs listed in the latest national EMs list (2012) and calculated the total sales amount of these drugs in 2014. The other approach involved first selecting the 109 most common diseases in China, then identifying the EMs used to treat them, and finally estimating the total cost of these drugs. The two methods yielded estimated financing amounts for EMs in China in 2014 of 17,776.44 million USD and 19,094.09 million USD, respectively. Comparing these two results, we concluded that the annual budget needed to provide for the EMs in China would be about 20 billion USD. Our study also indicated that irrational drug use continued to plague the health system, with intravenous fluids and antibiotics being typical examples, as observed in other studies.

  8. Variability of rainfall over small areas

    NASA Technical Reports Server (NTRS)

    Runnels, R. C.

    1983-01-01

    A preliminary investigation was made to determine estimates of the number of rain gauges needed in order to measure the variability of rainfall in time and space over small areas (approximately 40 sq miles). The literature on rainfall variability was examined and the types of empirical relationships used to relate rainfall variations to meteorological and catchment-area characteristics were considered. Relations between the coefficient of variation and areal-mean rainfall and area have been used by several investigators. These parameters seemed reasonable ones to use in any future study of rainfall variations. From a knowledge of an appropriate coefficient of variation (determined by the above-mentioned relations), the number of rain gauges needed for the precise determination of areal-mean rainfall may be calculated by statistical estimation theory. The number of gauges needed to measure the coefficient of variation over a 40 sq mile area, with varying degrees of error, was found to range from 264 (10% error, mean precipitation = 0.1 in) to about 2 (100% error, mean precipitation = 0.1 in).
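    The gauge-count calculation implied here is the standard sample-size formula from estimation theory, in which the number of gauges scales with the square of the coefficient of variation divided by the tolerated relative error; the CV used below is an assumed value chosen to reproduce the order of the quoted figures.

    ```python
    import math

    def gauges_needed(cv, rel_error, z=1.96):
        """Gauges required so the areal-mean estimate meets a relative error
        at ~95% confidence: n = (z * CV / E)**2."""
        return math.ceil((z * cv / rel_error) ** 2)

    # An assumed CV of 0.83 reproduces the order of the figures quoted above:
    print(gauges_needed(0.83, 0.10))   # ~265 gauges for 10% error
    print(gauges_needed(0.83, 1.00))   # a few gauges for 100% error
    ```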

  9. A test of basic psychological needs theory in young soccer players: time-lagged design at the individual and team levels.

    PubMed

    González, L; Tomás, I; Castillo, I; Duda, J L; Balaguer, I

    2017-11-01

    Within the framework of basic psychological needs theory (Deci & Ryan, 2000), multilevel structural equation modeling (MSEM) with a time-lagged design was used to test a mediation model examining the relationship between perceptions of coaches' interpersonal styles (autonomy supportive and controlling), athletes' basic psychological needs (satisfaction and thwarting), and indicators of well-being (subjective vitality) and ill-being (burnout), estimating between and within effects separately. The participants were 597 Spanish male soccer players aged between 11 and 14 years (M = 12.57, SD = 0.54) from 40 teams who completed a questionnaire package at two time points in a competitive season. Results revealed that at the individual level, athletes' perceptions of autonomy support positively predicted athletes' need satisfaction (autonomy, competence, and relatedness), whereas athletes' perceptions of controlling style positively predicted athletes' need thwarting (autonomy, competence, and relatedness). In turn, all three need satisfaction dimensions predicted athletes' subjective vitality and burnout (positively and negatively, respectively), whereas competence thwarting negatively predicted subjective vitality, and competence and relatedness thwarting positively predicted burnout. At the team level, team perceptions of autonomy supportive style positively predicted team autonomy and relatedness satisfaction. Mediation effects appeared only at the individual level. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. High Resolution, Consistent Online Estimation of Potential Flood Damage in The Netherlands

    NASA Astrophysics Data System (ADS)

    Hoes, O.; Hut, R.; van Leeuwen, E.

    2014-12-01

    In the current age where water authorities no longer blindly design and maintain all infrastructure just to meet a certain standardized return period, accurate estimation of potential flood damage is important in decision making with regard to flood prevention measures. We identify three issues with current methods of estimating flood damages. Firstly, common practice is to assume that for a given land use type, damage is mainly dependent on inundation depth, and sometimes flow velocity. We recognize that, depending on the type of land use, inundation depth, velocity, flood duration, season, detour time and recovery time all significantly influence the amount of damage. Secondly, setting stage-damage curves is usually left to an end user and can thus vary between different water authorities within a single country. What is needed at a national level is a common way of calculating flood damages, so different prevention measures can be fairly compared. Finally, most flood models use relatively large grid cells, usually on the order of 25 m² or coarser. Especially in urban areas this leads to obvious errors: different land uses (shops, housing, parks) are all classified as "urban" and treated equally. To tackle these issues we developed a web-based model which can be accessed via www.waterschadeschatter.nl (water schade schatter is Dutch for water damage estimator). It includes all necessary data sources to calculate the damage of any potential flood in the Netherlands. It uses different damage functions for different land use types, which the user can, but need not, change. It runs on 0.25 m² grid cells. Both the datasets required and the amount of calculation needed are more than a desktop computer can handle. In order to start a calculation a user needs to upload the relevant flood information to the website. The calculation is divided over several multicore servers, after which the user receives an email with a link to the results of the calculation. Our presentation will include a live demonstration of our online model.
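    A minimal sketch of the per-land-use damage-function idea, using hypothetical depth-damage curves (the actual model also factors in velocity, duration, season, detour and recovery time):

    ```python
    import numpy as np

    # Hypothetical stage-damage curves: damage fraction vs depth (m).
    DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
    CURVES = {
        "housing": np.array([0.0, 0.15, 0.35, 0.6, 1.0]),
        "shops":   np.array([0.0, 0.25, 0.5, 0.8, 1.0]),
        "park":    np.array([0.0, 0.02, 0.05, 0.1, 0.2]),
    }
    MAX_DAMAGE = {"housing": 1500.0, "shops": 2500.0, "park": 10.0}  # EUR/cell

    def flood_damage(depth_grid, landuse_grid):
        """Total damage given per-cell flood depths (m) and land-use labels."""
        total = 0.0
        for lu, curve in CURVES.items():
            mask = landuse_grid == lu
            frac = np.interp(depth_grid[mask], DEPTHS, curve)
            total += MAX_DAMAGE[lu] * frac.sum()
        return total
    ```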

  11. Remote sensing for grassland management in the arid Southwest

    USGS Publications Warehouse

    Marsett, R.C.; Qi, J.; Heilman, P.; Biedenbender, S.H.; Watson, M.C.; Amer, S.; Weltz, M.; Goodrich, D.; Marsett, R.

    2006-01-01

    We surveyed a group of rangeland managers in the Southwest about vegetation monitoring needs on grassland. Based on their responses, the objective of the RANGES (Rangeland Analysis Utilizing Geospatial Information Science) project was defined to be the accurate conversion of remotely sensed data (satellite imagery) to quantitative estimates of total (green and senescent) standing cover and biomass on grasslands and semidesert grasslands. Although remote sensing has been used to estimate green vegetation cover, in arid grasslands herbaceous vegetation is senescent much of the year and is not detected by current remote sensing techniques. We developed a ground truth protocol compatible with both range management requirements and Landsat's 30 m resolution imagery. The resulting ground-truth data were then used to develop image processing algorithms that quantified total herbaceous vegetation cover, height, and biomass. Cover was calculated based on a newly developed Soil Adjusted Total Vegetation Index (SATVI), and height and biomass were estimated based on reflectance in the near infrared (NIR) band. Comparison of the remotely sensed estimates with independent ground measurements produced r2 values of 0.80, 0.85, and 0.77 and Nash Sutcliffe values of 0.78, 0.70, and 0.77 for the cover, plant height, and biomass, respectively. The approach for estimating plant height and biomass did not work for sites where forbs comprised more than 30% of total vegetative cover. The ground reconnaissance protocol and image processing techniques together offer land managers accurate and timely methods for monitoring extensive grasslands. The time-consuming requirement to collect concurrent data in the field for each image implies a need to share the high fixed costs of processing an image across multiple users to reduce the costs for individual rangeland managers.
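    For reference, the SATVI is commonly written as a soil-adjusted ratio of SWIR1 and red reflectance minus half the SWIR2 reflectance; the sketch below follows that published form, with illustrative reflectance values.

    ```python
    def satvi(red, swir1, swir2, L=0.5):
        """Soil Adjusted Total Vegetation Index from surface reflectances
        in [0, 1]; L is the soil adjustment factor."""
        return (swir1 - red) / (swir1 + red + L) * (1.0 + L) - swir2 / 2.0

    # Senescent grass raises SWIR1 relative to red, so SATVI responds to
    # total (green + senescent) herbaceous cover.
    print(satvi(red=0.12, swir1=0.28, swir2=0.18))
    ```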

  12. Simulating Glacial Outburst Lake Releases for Suicide Basin, Mendenhall Glacier, Juneau, Alaska

    NASA Astrophysics Data System (ADS)

    Jacobs, A. B.; Moran, T.; Hood, E. W.

    2017-12-01

    Glacial lake outbursts from Suicide Basin are a recent phenomenon, first characterized in 2011. The 2014 event resulted in record river stage and moderate flooding on the Mendenhall River in Juneau. Recognizing that these events can adversely impact residential areas of Juneau's Mendenhall Valley, the Alaska-Pacific River Forecast Center developed a real-time modeling technique capable of forecasting the timing and magnitude of the flood-wave crest due to releases from Suicide Basin. The 2014 event was estimated at about 37,000 acre-feet, with water levels cresting within 36 hours from the time the flood wave hit Mendenhall Lake. Given the magnitude of possible impacts to the public, accurate hydrological forecasting is essential for public safety and emergency managers. However, the data needed to effectively forecast magnitudes of specific jökulhlaup events are limited. Estimating this event as related to river stage depended upon three variables: 1) the timing of the lag between Suicide Basin water level declines and the related rise of Mendenhall Lake, 2) continuous monitoring of Mendenhall Lake water levels, and 3) estimating the total water volume stored in Suicide Basin. Real-time modeling of the event utilized a Time of Concentration hydrograph with independent power equations representing the rising and falling limbs of the hydrograph. The initial accuracy of the model (as forecast about 24 hours prior to crest) resulted in an estimated crest within 0.5 feet of the observed crest, with a timing error of about six hours late.
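    A toy version of such a hydrograph, with independent power equations for the rising and falling limbs joined at the crest, might look like the following; all exponents and discharge values are illustrative assumptions, not the Forecast Center's calibration.

    ```python
    import numpy as np

    def outburst_hydrograph(t_hours, q_base, q_peak, t_peak, a=2.0, b=3.0):
        """Flood hydrograph with independent power-law rising/falling limbs
        joined at the crest; a and b are illustrative shape exponents."""
        t = np.asarray(t_hours, dtype=float)
        rise = q_base + (q_peak - q_base) * (t / t_peak) ** a
        fall = q_base + (q_peak - q_base) * (t_peak / t) ** b
        return np.where(t <= t_peak, rise, fall)

    t = np.linspace(0.5, 72.0, 200)  # hours after the release reaches the lake
    q = outburst_hydrograph(t, q_base=50.0, q_peak=500.0, t_peak=36.0)
    ```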

  13. Technical and economic feasibility of integrated video service by satellite

    NASA Technical Reports Server (NTRS)

    Price, Kent M.; Garlow, R. K.; Henderson, T. R.; Kwan, Robert K.; White, L. W.

    1992-01-01

    The trends and roles of satellite based video services in the year 2010 time frame are examined based on an overall network and service model for that period. Emphasis is placed on point to point and multipoint service, but broadcast could also be accommodated. An estimate of the video traffic is made and the service and general network requirements are identified. User charges are then estimated based on several usage scenarios. In order to accommodate these traffic needs, a 28 spot beam satellite architecture with on-board processing and signal mixing is suggested.

  14. Rainfall estimation in the context of post-event flash flood analysis

    NASA Astrophysics Data System (ADS)

    Bouilloud, L.; Delrieu, G.; Boudevillain, B.

    2009-04-01

    Due to their spatial coverage and space-time resolution, operational weather radar networks offer unprecedented opportunities for the observation of flash flood generating storms. However, radar rainfall estimation quality depends highly on the relative locations of the event and the radar(s). A mountainous environment obviously adds to the complexity of radar quantitative precipitation estimation (QPE). A pragmatic methodology is proposed to make the best use of the existing rainfall observations (radar and raingauge data) for given flash-flood cases: 1) A precise documentation of the radar characteristics (location, parameters, operating protocol, data archives and processing) needs first to be established. The radar(s) detection domain(s) can then be characterized using the "hydrologic visibility" concepts (Pellarin et al. J Hydrometeor 3(5) 539-555 2002). 2) Rather dense raingauge observations (operational, amateur) are usually available at the event time scale, while few raingauge time series exist at the hydrologic time steps. Such raingauge datasets need to be critically analysed; a geostatistical approach is proposed for this task. 3) A number of checks can be implemented prior to re-processing the radar data: a) Special care needs to be paid to (residual) ground clutter, which has a dramatic impact on radar QPE. Dry-weather maps and rainfall accumulation maps may help in this task. b) Various sources of power loss, such as screening, wet radome and attenuation in rain, need to be identified and quantified. It will be shown that mountain returns can be used to quantify attenuation effects at C-band. c) Radar volume data is required to characterize the vertical profile of reflectivity (VPR), possibly conditioned on rain type (convective, widespread). When such data is not available, knowledge of the 0°C isotherm and the scanning protocol may help in detecting bright-band contaminations that critically affect radar QPE. d) With conventional radar technology, the radar calibration accuracy and the relevance of the Z-R relationship can only be assessed with external data (raingauges here). Ways of characterizing the equifinality structure and optimal parameters will be presented. Such a procedure will be illustrated and assessed with the radar and raingauge datasets collected during the Aude 1999, Gard 2002 and Slovenia 2007 rain events of interest in the HYDRATE project.
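    Point (d) concerns the familiar Z = aR^b conversion; a minimal sketch of inverting it, with Marshall-Palmer coefficients as placeholder values to be assessed against raingauges, is:

    ```python
    def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
        """Invert a Z = a * R**b relationship (Marshall-Palmer coefficients
        shown; appropriate a, b must be assessed against raingauges)."""
        z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> mm^6 m^-3
        return (z_linear / a) ** (1.0 / b)   # rain rate in mm/h

    print(rain_rate_from_reflectivity(40.0))  # ~11.5 mm/h for 40 dBZ
    ```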

  15. Rainfall estimation in the context of post-event flash flood analysis

    NASA Astrophysics Data System (ADS)

    Delrieu, Guy; Boudevillain, Brice; Bouilloud, Ludovic

    2010-05-01

    Due to their spatial coverage and space-time resolution, operational weather radar networks offer unprecedented opportunities for the observation of flash flood generating storms. However, radar rainfall estimation quality depends highly on the relative locations of the event and the radar(s). A mountainous environment obviously adds to the complexity of radar quantitative precipitation estimation (QPE). A pragmatic methodology was developed within the EC-funded HYDRATE project to make the best use of the existing rainfall observations (radar and raingauge data) for given flash-flood cases: 1) A precise documentation of the radar characteristics (location, parameters, operating protocol, data archives and processing) needs first to be established. The radar(s) detection domain(s) can then be characterized using the "hydrologic visibility" concepts (Pellarin et al. J Hydrometeor 3(5) 539-555 2002). 2) Rather dense raingauge observations (operational, amateur) are usually available at the event time scale, while few raingauge time series exist at the hydrologic time steps. Such raingauge datasets need to be critically analysed; a geostatistical approach is proposed for this task. 3) A number of checks can be implemented prior to re-processing the radar data: a) Special care needs to be paid to (residual) ground clutter, which has a dramatic impact on radar QPE. Dry-weather maps and rainfall accumulation maps may help in this task. b) Various sources of power loss, such as screening, wet radome and attenuation in rain, need to be identified and quantified. It will be shown that mountain returns can be used to quantify attenuation effects at C-band. c) Radar volume data is required to characterize the vertical profile of reflectivity (VPR), possibly conditioned on rain type (convective, widespread). When such data is not available, knowledge of the 0°C isotherm and the scanning protocol may help in detecting bright-band contaminations that critically affect radar QPE. d) With conventional radar technology, the radar calibration accuracy and the relevance of the Z-R relationship can only be assessed with external data (raingauges here). Ways of characterizing the equifinality structure and optimal parameters will be presented. Such a procedure will be illustrated and assessed with the radar and raingauge datasets collected for various rain events of interest in the HYDRATE project.

  16. Wind power error estimation in resource assessments.

    PubMed

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable energy project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, based on 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
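    A minimal sketch of the error-propagation idea, assuming a hypothetical tabulated power curve and a +/-10% wind speed measurement error pushed through it:

    ```python
    import numpy as np

    # Hypothetical turbine power curve (m/s -> kW); real curves come from
    # manufacturer data, here known pointwise and interpolated.
    V_PTS = np.array([3, 5, 7, 9, 11, 13, 15, 25], dtype=float)
    P_PTS = np.array([0, 150, 500, 1100, 1800, 2200, 2300, 2300], dtype=float)

    def power(v):
        return np.interp(v, V_PTS, P_PTS, left=0.0, right=0.0)

    def power_with_speed_error(v_series, rel_err=0.10):
        """Propagate a +/- wind speed measurement error through the curve."""
        p = power(v_series).mean()
        lo = power(v_series * (1 - rel_err)).mean()
        hi = power(v_series * (1 + rel_err)).mean()
        return p, lo, hi

    rng = np.random.default_rng(0)
    v = rng.weibull(2.0, 10_000) * 8.0   # synthetic site wind speeds
    print(power_with_speed_error(v))
    ```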

  17. Wind Power Error Estimation in Resource Assessments

    PubMed Central

    Rodríguez, Osvaldo; del Río, Jesús A.; Jaramillo, Oscar A.; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable energy project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, based on 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies. PMID:26000444

  18. Optimal sampling for radiotelemetry studies of spotted owl habitat and home range.

    Treesearch

    Andrew B. Carey; Scott P. Horton; Janice A. Reid

    1989-01-01

    Radiotelemetry studies of spotted owl (Strix occidentalis) ranges and habitat-use must be designed efficiently to estimate parameters needed for a sample of individuals sufficient to describe the population. Independent data are required by analytical methods and provide the greatest return of information per effort. We examined time series of...

  19. Estimation of Unreimbursed Patient Education Costs at a Large Group Practice

    ERIC Educational Resources Information Center

    Williams, Arthur R.; McDougall, John C.; Bruggeman, Sandra K.; Erwin, Patricia J.; Kroshus, Margo E.; Naessens, James M.

    2004-01-01

    Introduction: A search of the literature on the cost of patient education found that provider education time per patient per day was rarely reported and usually not derivable from published reports. Costs of continuing education needed by health professionals to support patient education also were not given. Without this information, it is…

  20. Comparing an annual and daily time-step model for predicting field-scale P loss

    USDA-ARS?s Scientific Manuscript database

    Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...

  1. 77 FR 37638 - Noncommercial Educational Station Fundraising for Third-Party Non-Profit Organizations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ... educational (NCE) broadcast stations to conduct on-air fundraising activities that interrupt regular... eliminate the need for NCE stations to seek a waiver of the Commission's rules to interrupt regular... Responses: 2,200 respondents/30,800 responses. Estimated Time per Response: 0.25 to 1.5 hours. Frequency of...

  2. The Kane Experimental Forest carbon inventory: Carbon reporting with FVS

    Treesearch

    Coeli Hoover

    2008-01-01

    As the number of state and regional climate change agreements grows, so does the need to assess the carbon implications of planned forest management actions. At the operational level, producing detailed stock estimates for the primary carbon pools becomes time-consuming and cumbersome. Carbon reporting functionality has been fully integrated within the Forest...

  3. Exploring Barriers to the Categorization of Electronic Content in a Global Professional Services Firm

    ERIC Educational Resources Information Center

    Totterdale, Robert L.

    2009-01-01

    Businesses have always maintained records pertinent to the enterprise. With over 90% of new business records now estimated to be available in electronic form, organizations struggle to manage these vast amounts of electronic content while at the same time meeting collaboration, knowledge management, regulatory, and compliance needs. This case…

  4. International Disability Educational Alliance (IDEAnet)

    DTIC Science & Technology

    2009-03-01

    this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data...sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden...educational services. R2: Conduct literature review and evaluation of cost-effective delivery options

  5. Why we need GMO crops in agriculture

    USDA-ARS?s Scientific Manuscript database

    The fact that in a very short period of 35 years the global population will reach an estimated 9 billion people presents a massive challenge to agriculture: how do we feed all of these people with nutritious food in a sustainable way? At the present time the yields of most of our major crops are sta...

  6. A Human Factors Engineering Assessment of the Buffalo Mine Protection Clearance Vehicle Roof Hatch

    DTIC Science & Technology

    2007-10-01

    this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data...sources, gathering and maintaining the data needed, and completing and reviewing the collection information. Send comments regarding this burden... 2. Method; 2.1 Anthropometric Data

  7. 76 FR 14072 - Agency Information Collection Activities: Proposed Collection; Comments Requested

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... the estimated public burden or associated response time, suggestions, or need a copy of the proposed... violence research indicated that youth are rarely involved in research designed to better understand this... of the study, concept mapping will be used to create a visual representation of the ways youth and...

  8. Neural Bases of Sequence Processing in Action and Language

    ERIC Educational Resources Information Center

    Carota, Francesca; Sirigu, Angela

    2008-01-01

    Real-time estimation of what we will do next is a crucial prerequisite of purposive behavior. During the planning of goal-oriented actions, for instance, the temporal and causal organization of upcoming subsequent moves needs to be predicted based on our knowledge of events. A forward computation of sequential structure is also essential for…

  9. Using Satellite Imagery with ET Weather Station Networks to Map Crop Water Use for Irrigation Scheduling: TOPS-SIMS.

    USDA-ARS?s Scientific Manuscript database

    Evapotranspiration estimates for scheduling irrigation must be field specific and real time. Weather station networks provide daily reference ET values, but users need to select crop coefficients for their particular crop and field. A prototype system has been developed that combines satellite image...

  10. The Sensitivity of Measures of Unwanted and Unintended Pregnancy Using Retrospective and Prospective Reporting: Evidence from Malawi

    PubMed Central

    Sennott, Christie

    2015-01-01

    A thorough understanding of the health implications of unwanted and unintended pregnancies is constrained by our ability to accurately identify them. Commonly used techniques for measuring such pregnancies are subject to two main sources of error: the ex post revision of preferences after a pregnancy and the difficulty of identifying preferences at the time of conception. This study examines the implications of retrospective and prospective measurement approaches, which are vulnerable to different sources of error, on estimates of unwanted and unintended pregnancies. We use eight waves of closely-spaced panel data from young women in southern Malawi to generate estimates of unwanted and unintended pregnancies based on fertility preferences measured at various points in time. We then compare estimates using traditional retrospective and prospective approaches to estimates obtained when fertility preferences are measured prospectively within months of conception. The 1,062 young Malawian women in the sample frequently changed their fertility preferences. The retrospective measures slightly underestimated unwanted and unintended pregnancies compared to the time-varying prospective approach; in contrast the fixed prospective measures overestimated them. Nonetheless, most estimates were similar in aggregate, suggesting that frequent changes in fertility preferences need not lead to dramatically different estimates of unwanted and unintended pregnancy. Greater disagreement among measures emerged when classifying individual pregnancies. Carefully designed retrospective measures are not necessarily more problematic for measuring unintended and unwanted fertility than are more expensive fixed prospective ones. PMID:25636647

  11. Estimating dietary costs of low-income women in California: a comparison of 2 approaches.

    PubMed

    Aaron, Grant J; Keim, Nancy L; Drewnowski, Adam; Townsend, Marilyn S

    2013-04-01

    Currently, no simplified approach to estimating food costs exists for a large, nationally representative sample. The objective was to compare 2 approaches for estimating individual daily diet costs in a population of low-income women in California. Cost estimates based on time-intensive method 1 (three 24-h recalls and associated food prices on receipts) were compared with estimates made by using less intensive method 2 [a food-frequency questionnaire (FFQ) and store prices]. Low-income participants (n = 121) of USDA nutrition programs were recruited. Mean daily diet costs, both unadjusted and adjusted for energy, were compared by using Pearson correlation coefficients and the Bland-Altman 95% limits of agreement between methods. Energy and nutrient intakes derived by the 2 methods were comparable; where differences occurred, the FFQ (method 2) provided higher nutrient values than did the 24-h recall (method 1). The crude daily diet cost was $6.32 by the 24-h recall method and $5.93 by the FFQ method (P = 0.221). The energy-adjusted diet cost was $6.65 by the 24-h recall method and $5.98 by the FFQ method (P < 0.001). Although the agreement between methods was weaker than expected, both approaches may be useful. Additional research is needed to further refine the large national survey approach (method 2) for estimating daily dietary costs, a method requiring minimal time from participants and moderate time from researchers.
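    The Bland-Altman comparison used here reduces to computing the mean difference between methods and its 95% limits of agreement; a minimal sketch with made-up cost values:

    ```python
    import numpy as np

    def bland_altman_limits(cost_recall, cost_ffq):
        """Bias and 95% limits of agreement between two diet-cost methods."""
        diff = np.asarray(cost_recall) - np.asarray(cost_ffq)
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # e.g. daily costs (USD) for the same participants by the two methods
    bias, (lo, hi) = bland_altman_limits([6.1, 7.0, 5.8], [5.7, 6.4, 6.0])
    ```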

  12. Automated In-Situ Laser Scanner for Monitoring Forest Leaf Area Index

    PubMed Central

    Culvenor, Darius S.; Newnham, Glenn J.; Mellor, Andrew; Sims, Neil C.; Haywood, Andrew

    2014-01-01

    An automated laser rangefinding instrument was developed to characterize overstorey and understorey vegetation dynamics over time. Design criteria were based on information needs within the statewide forest monitoring program in Victoria, Australia. The ground-based monitoring instrument captures the key vegetation structural information needed to overcome ambiguity in the estimation of forest Leaf Area Index (LAI) from satellite sensors. The scanning lidar instrument was developed primarily from low cost, commercially accessible components. While the 635 nm wavelength lidar is not ideally suited to vegetation studies, there was an acceptable trade-off between cost and performance. Tests demonstrated reliable range estimates to live foliage up to a distance of 60 m during night-time operation. Given the instrument's scan angle of 57.5 degrees zenith, the instrument is an effective tool for monitoring LAI in forest canopies up to a height of 30 m. An 18 month field trial of three co-located instruments showed consistent seasonal trends and mean LAI of between 1.32 to 1.56 and a temporal LAI variation of 8 to 17% relative to the mean. PMID:25196006
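    The choice of a 57.5-degree scan angle exploits the "hinge" region where the leaf projection function is nearly 0.5 regardless of leaf angle distribution, so gap fraction inverts to LAI with minimal assumptions. Below is a sketch of that standard inversion, not the instrument's actual processing chain.

    ```python
    import numpy as np

    def lai_from_gap_fraction(p_gap, theta_deg=57.5, G=0.5):
        """Beer-Lambert inversion of gap fraction at the hinge zenith angle:
        P_gap = exp(-G * LAI / cos(theta)), with G ~ 0.5 near 57.5 deg."""
        theta = np.radians(theta_deg)
        return -np.log(p_gap) * np.cos(theta) / G

    print(lai_from_gap_fraction(0.25))   # ~1.49
    ```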

  13. STV fueling options

    NASA Technical Reports Server (NTRS)

    Flemming, Ken

    1991-01-01

    Lunar vehicles that will be space based and reusable will require resupply of propellants in orbit. Approximately 75 pct. of the total mass delivered to low earth orbit will be propellants. Consequently, the propellant management techniques selected for Space Exploration Initiative (SEI) orbital operations will have a major influence on the overall SEI architecture. Five proposed propellant management facility (PMF) concepts were analyzed and compared in order to determine the best method of resupplying reusable, space based Lunar Transfer Vehicles (LTVs). The processing time needed at the Space Station to prepare the LTV for its next lunar mission was estimated for each of the PMF concepts. The estimated times required to assemble and maintain the different PMF concepts were also compared. The results of the maintenance analysis were similar, with co-orbiting depots needing 100 to 350 pct. more annual maintenance. The first few external tank mating operations at KSC encountered many problems that could cause serious lunar mission schedule delays. The use of drop tanks on lunar vehicles increases by a factor of four the number of critical propellant interface disturbances.

  14. Onboard Atmospheric Modeling and Prediction for Autonomous Aerobraking Missions

    NASA Technical Reports Server (NTRS)

    Tolson, Robert H.; Prince, Jill L. H.

    2011-01-01

    Aerobraking has proven to be an effective means of increasing the science payload for planetary orbiting missions and/or enabling the use of less expensive launch vehicles. Though aerobraking has numerous benefits, large operations costs have been incurred to maintain the aerobraking timeline without violating aerodynamic heating or other constraints. Two operations functions have been performed on an orbit-by-orbit basis to estimate atmospheric properties relevant to aerobraking. The navigation team typically solves for an atmospheric density scale factor using DSN tracking data, and the atmospheric modeling team uses telemetric accelerometer data to recover atmospheric density profiles. After some effort, decisions are made about the need for orbit trim maneuvers to adjust periapsis altitude to stay within the aerobraking corridor. Autonomous aerobraking would reduce the need for many ground-based tasks. To be successful, atmospheric modeling must be performed on the vehicle in near real time. This paper discusses the issues associated with estimating the planetary atmosphere onboard and evaluates a number of the options for Mars, Venus and Titan aerobraking missions.

  15. Estimating probable flaw distributions in PWR steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorman, J.A.; Turner, A.P.L.

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  16. Feeding methods and efficiencies of selected frugivorous birds

    USGS Publications Warehouse

    Foster, M.S.

    1987-01-01

    I report on handling methods and efficiencies of 26 species of Paraguayan birds feeding on fruits of Allophyllus edulis (Sapindaceae). A bird may swallow fruits whole (Type I: pluck and swallow feeders), hold a fruit and cut the pulp from the seed with the edge of the bill, swallowing the pulp but not the seed (Type II: cut or mash feeders), or take bites of pulp from a fruit that hangs from the tree or that is held and manipulated against a branch (Type III: push and bite feeders). In terms of the absolute amount of pulp obtained from a fruit, and the amount obtained per unit time, Type I species are far more efficient than Type II and III species. Bill morphology influences feeding methods but is not the only important factor. Diet breadth does not appear to be significant. Consideration of feeding efficiency relative to the needs of the birds indicates that these species need to spend relatively little time feeding to meet their estimated energetic needs, and that handling time has a relatively trivial effect on the time/energy budgets of the bird species observed.

  17. On Searching Available Channels with Asynchronous MAC-Layer Spectrum Sensing

    NASA Astrophysics Data System (ADS)

    Jiang, Chunxiao; Ma, Xin; Chen, Canfeng; Ma, Jian; Ren, Yong

    Dynamic spectrum access has become a focal issue recently, in which identifying the available spectrum plays a rather important role. Much work has been done concerning secondary users (SUs) synchronously accessing the primary user's (PU's) network. However, on one hand, an SU may have no idea about the PU's communication protocols; on the other, it is possible that communications among PUs are not based on a synchronous scheme at all. In order to address such problems, this paper advances a strategy for SUs to search available spectrum with asynchronous MAC-layer sensing. With this method, SUs need not know the communication mechanisms of the PU's network when accessing it dynamically. We focus on four aspects: 1) a strategy for searching available channels; 2) a vacating strategy for when PUs come back; 3) estimation of channel parameters; and 4) the impact of SUs' interference on the PU's data rate. The simulations show that our search strategy not only achieves nearly 50% lower interference probability than equal allocation of total search time, but also adapts well to time-varying channels. Moreover, access by our strategies can attain 150% more access time than random access. The moment matching estimator shows good performance in estimating and tracking time-varying channels.

  18. Estimating Highway Volumes Using Vehicle Probe Data - Proof of Concept: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Yi; Young, Stanley E; Sadabadi, Kaveh

    This paper examines the feasibility of using sampled commercial probe data in combination with validated continuous counter data to accurately estimate vehicle volume across the entire roadway network, for any hour during the year. Currently, both real-time and archived volume data for roadways at specific times are extremely sparse. Most volume data are average annual daily traffic (AADT) measures derived from the Highway Performance Monitoring System (HPMS). Although methods to factor the AADT to hourly averages for a typical day of week exist, actual volume data are limited to a sparse collection of locations in which volumes are continuously recorded. This paper explores the use of commercial probe data to generate accurate volume measures that span the highway network, providing ubiquitous coverage in space and specific point-in-time measures for a specific date and time. The paper examines the need for the data, fundamental accuracy limitations based on a basic statistical model that takes into account the sampling nature of probe data, and early results from a proof of concept exercise revealing the potential of probe-type data calibrated with public continuous count data to meet end-user expectations in terms of accuracy of volume estimates.
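    The basic statistical model alluded to treats the probe count as a binomial sample of the unknown volume; a minimal sketch, with an assumed penetration rate:

    ```python
    import math

    def volume_from_probes(n_probe, penetration):
        """Scale a probe-vehicle count by the penetration rate; binomial
        sampling gives an approximate standard error on the estimate."""
        v_hat = n_probe / penetration
        se = math.sqrt(n_probe * (1 - penetration)) / penetration
        return v_hat, se

    # e.g. 45 probes observed in an hour at an assumed 5% penetration rate
    v, se = volume_from_probes(45, 0.05)
    print(f"{v:.0f} veh/h +/- {1.96 * se:.0f} (95%)")
    ```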

  19. Time-dependent classification accuracy curve under marker-dependent sampling.

    PubMed

    Zhu, Zhaoyin; Wang, Xiaofei; Saha-Chaudhuri, Paramita; Kosinski, Andrzej S; George, Stephen L

    2016-07-01

    Evaluating the classification accuracy of a candidate biomarker signaling the onset of disease or disease status is essential for medical decision making. A good biomarker would accurately identify the patients who are likely to progress or die at a particular time in the future or who are in urgent need of active treatment. To assess the performance of a candidate biomarker, the receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC) are commonly used. In many cases, the standard simple random sampling (SRS) design used for biomarker validation studies is costly and inefficient. In order to improve the efficiency and reduce the cost of biomarker validation, marker-dependent sampling (MDS) may be used. In a MDS design, the selection of patients to assess true survival time depends on the result of a biomarker assay. In this article, we introduce a nonparametric estimator for time-dependent AUC under a MDS design. The consistency and asymptotic normality of the proposed estimator are established. Simulation shows the unbiasedness of the proposed estimator and a significant efficiency gain of the MDS design over the SRS design. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
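    For orientation, the cumulative/dynamic time-dependent AUC at horizon t compares marker values of subjects who have had the event by t against those still event-free. The sketch below handles only fully observed event times and omits the censoring adjustments and MDS reweighting that the paper's estimator provides.

    ```python
    import numpy as np

    def cumulative_dynamic_auc(time, marker, t):
        """P(marker_case > marker_control) for cases T<=t vs controls T>t,
        assuming no censoring (ties counted half)."""
        cases, controls = marker[time <= t], marker[time > t]
        diff = cases[:, None] - controls[None, :]
        return (diff > 0).mean() + 0.5 * (diff == 0).mean()

    rng = np.random.default_rng(2)
    m = rng.normal(size=300)
    T = rng.exponential(np.exp(-m))   # higher marker -> earlier event
    print(cumulative_dynamic_auc(T, m, t=1.0))
    ```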

  20. A three domain covariance framework for EEG/MEG data.

    PubMed

    Roś, Beata P; Bijma, Fetsje; de Gunst, Mathisca C M; de Munck, Jan C

    2015-10-01

    In this paper we introduce a covariance framework for the analysis of single subject EEG and MEG data that takes into account observed temporal stationarity on small time scales and trial-to-trial variations. We formulate a model for the covariance matrix, which is a Kronecker product of three components that correspond to space, time and epochs/trials, and consider maximum likelihood estimation of the unknown parameter values. An iterative algorithm that finds approximations of the maximum likelihood estimates is proposed. Our covariance model is applicable in a variety of cases where spontaneous EEG or MEG acts as source of noise and realistic noise covariance estimates are needed, such as in evoked activity studies, or where the properties of spontaneous EEG or MEG are themselves the topic of interest, like in combined EEG-fMRI experiments in which the correlation between EEG and fMRI signals is investigated. We use a simulation study to assess the performance of the estimator and investigate the influence of different assumptions about the covariance factors on the estimated covariance matrix and on its components. We apply our method to real EEG and MEG data sets. Copyright © 2015 Elsevier Inc. All rights reserved.
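    The structure of the model is easy to state in code: the full covariance is the Kronecker product of an epoch factor, a time factor and a space factor, which collapses the parameter count dramatically relative to an unstructured covariance. A minimal sketch with random positive definite factors (sizes are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def random_spd(n):
        """A random symmetric positive definite factor."""
        x = rng.normal(size=(n, n))
        return x @ x.T / n + np.eye(n)

    A, B, C = random_spd(4), random_spd(10), random_spd(8)  # epochs, time, space
    sigma = np.kron(A, np.kron(B, C))                       # (320, 320)

    # The Kronecker structure needs 4*5/2 + 10*11/2 + 8*9/2 = 101 parameters
    # instead of 320*321/2 = 51360 for an unstructured covariance.
    print(sigma.shape)
    ```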

  1. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic seismic loss estimation is a methodology for expressing building performance quantitatively and explicitly, in terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses and in turn hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose and their appropriateness was evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions of 34 code-conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 methodology, and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  2. Observatory geoelectric fields induced in a two-layer lithosphere during magnetic storms

    USGS Publications Warehouse

    Love, Jeffrey J.; Swidinsky, Andrei

    2015-01-01

    We report on the development and validation of an algorithm for estimating geoelectric fields induced in the lithosphere beneath an observatory during a magnetic storm. To accommodate induction in three-dimensional lithospheric electrical conductivity, we analyze a simple nine-parameter model: two horizontal layers, each with uniform electrical conductivity properties given by independent distortion tensors. With Laplace transformation of the induction equations into the complex frequency domain, we obtain a transfer function describing induction of observatory geoelectric fields having frequency-dependent polarization. Upon inverse transformation back to the time domain, the convolution of the corresponding impulse-response function with a geomagnetic time series yields an estimated geoelectric time series. We obtain an optimized set of conductivity parameters using 1-s resolution geomagnetic and geoelectric field data collected at the Kakioka, Japan, observatory for five different intense magnetic storms, including the October 2003 Halloween storm; our estimated geoelectric field accounts for 93% of that measured during the Halloween storm. This work demonstrates the need for detailed modeling of the Earth’s lithospheric conductivity structure and the utility of co-located geomagnetic and geoelectric monitoring.
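    A minimal sketch of the frequency-domain transfer-function idea, reduced to the simplest case of a uniform half-space (the paper's two-layer model with distortion tensors generalizes this); the conductivity value is an assumption.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi

    def geoelectric_halfspace(b_nt, dt, sigma=1e-3):
        """Estimate a geoelectric component from an orthogonal geomagnetic
        component using the plane-wave surface impedance of a uniform
        half-space, Z(w) = sqrt(i*w*mu0/sigma).

        b_nt: geomagnetic series (nT), dt: sample interval (s); returns mV/km.
        """
        n = len(b_nt)
        B = np.fft.rfft(b_nt * 1e-9)                 # nT -> T
        w = 2 * np.pi * np.fft.rfftfreq(n, dt)
        Z = np.sqrt(1j * w * MU0 / sigma)            # ohms; Z(0) = 0
        E = np.fft.irfft(Z * B / MU0, n)             # V/m, since H = B/mu0
        return E * 1e6                               # V/m -> mV/km

    e = geoelectric_halfspace(np.random.default_rng(3).normal(size=86400), 1.0)
    ```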

  3. By-passing the sign-problem in Fermion Path Integral Monte Carlo simulations by use of high-order propagators

    NASA Astrophysics Data System (ADS)

    Chin, Siu A.

    2014-03-01

    The sign-problem in PIMC simulations of non-relativistic fermions increases in severity with the number of fermions and the number of beads (or time-slices) of the simulation. A large number of beads is usually needed, because the conventional primitive propagator is only second-order and the usual thermodynamic energy-estimator converges very slowly from below with the total imaginary time. The Hamiltonian energy-estimator, while more complicated to evaluate, is a variational upper-bound and converges much faster with the total imaginary time, thereby requiring fewer beads. This work shows that when the Hamiltonian estimator is used in conjunction with fourth-order propagators with optimizable parameters, the ground state energies of 2D parabolic quantum-dots with approximately 10 completely polarized electrons can be obtained with only 3-5 beads, before the onset of severe sign problems. This work was made possible by NPRP GRANT #5-674-1-114 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the author.

  4. How large a dataset should be in order to estimate scaling exponents and other statistics correctly in studies of solar wind turbulence

    NASA Astrophysics Data System (ADS)

    Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.

    2009-12-01

    Quantitative analyses of solar wind fluctuations are often performed in the context of intermittent turbulence and center around methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large-scale secular changes, so the question arises as to whether the time series of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed. Natural systems such as the solar wind thus unavoidably provide observations over restricted intervals. Consequently, due to a reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known for finite variance processes to vary as ~1/N as N becomes large for certain statistical estimators; however, the convergence to this behavior depends on the details of the process, and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and "real world" time series, and we find that, in particular for heavy-tailed processes, for realizable N one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some "real world" time series from the solar wind. With fewer data points, a stationary time series becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
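    The slow convergence for heavy-tailed processes is easy to demonstrate: the spread of the sample second moment across realizations shrinks like ~1/N for a Gaussian process but much more slowly when fourth moments are infinite, as for Student-t with 3 degrees of freedom. A small illustrative experiment (not the paper's estimator):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def var_of_second_moment(sampler, N, trials=500):
        """Spread of the sample second moment across realizations of length N."""
        return np.var([np.mean(sampler(N) ** 2) for _ in range(trials)])

    for N in (100, 1000, 10000):
        gauss = var_of_second_moment(lambda n: rng.standard_normal(n), N)
        heavy = var_of_second_moment(lambda n: rng.standard_t(3, size=n), N)
        print(N, f"gaussian {gauss:.2e}", f"student-t(3) {heavy:.2e}")
    # The Gaussian case shrinks ~1/N; the heavy-tailed case converges far
    # more slowly, so much longer records are needed for stable estimates.
    ```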

  5. Fractional Brownian motion time-changed by gamma and inverse gamma process

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Wyłomańska, A.; Połoczański, R.; Sundar, S.

    2017-02-01

    Many real time series exhibit behavior characteristic of long-range dependent data. Additionally, these time series often contain constant time periods and have characteristics similar to Gaussian processes, although they are not Gaussian. Therefore there is a need to consider new classes of systems to model these kinds of empirical behavior. Motivated by this fact, in this paper we analyze two processes which exhibit the long range dependence property and have additional interesting characteristics which may be observed in real phenomena. Both of them are constructed as the superposition of fractional Brownian motion (FBM) and another process. In the first case the internal process, which plays the role of time, is the gamma process, while in the second case the internal process is its inverse. We present their main properties in detail, paying particular attention to the long range dependence property. Moreover, we show how to simulate these processes and estimate their parameters. We propose a novel method based on the rescaled modified cumulative distribution function for estimation of the parameters of the second considered process. This method is very useful in the description of rounded data, like waiting times of subordinated processes delayed by inverse subordinators. By using the Monte Carlo method we show the effectiveness of the proposed estimation procedures. Finally, we present applications of the proposed models to real time series.
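    A minimal sketch of simulating the first process, FBM subordinated by a gamma clock: generate FBM exactly via Cholesky of its covariance, build a non-decreasing gamma random time, and read the FBM path at those times. Sizes and parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fbm(n, H, T):
        """Exact FBM on a regular grid via Cholesky of its covariance
        (O(n^2) memory; add a tiny diagonal jitter if Cholesky complains)."""
        t = np.linspace(T / n, T, n)
        cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                     - np.abs(t[:, None] - t[None, :]) ** (2 * H))
        return t, np.linalg.cholesky(cov) @ rng.normal(size=n)

    # Gamma subordinator: Gamma(nu*dt, 1/nu) increments give a non-decreasing
    # random clock S with E[S(t)] = t.
    n, H, nu = 800, 0.7, 5.0
    s = np.cumsum(rng.gamma(shape=nu / n, scale=1.0 / nu, size=n))

    grid, bh = fbm(n, H, T=float(s[-1]))  # simulate FBM over the clock's range
    x = np.interp(s, grid, bh)            # the subordinated sample path
    ```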

  6. Models of Wake-Vortex Spreading Mechanisms and Their Estimated Uncertainties

    NASA Technical Reports Server (NTRS)

    Rossow, Vernon J.; Hardy, Gordon H.; Meyn, Larry A.

    2006-01-01

    One of the primary constraints on the capacity of the nation's air transportation system is the landing capacity at its busiest airports. Many airports with nearly simultaneous operations on closely spaced parallel runways (i.e., as close as 750 ft (229 m)) suffer a severe decrease in runway acceptance rate when weather conditions do not allow full utilization. The objective of a research program at NASA Ames Research Center is to develop the technologies needed for traffic management in the airport environment so that operations now allowed on closely spaced parallel runways under Visual Meteorological Conditions can also be carried out under Instrument Meteorological Conditions. As part of this overall research objective, the study reported here has developed improved models for the various aerodynamic mechanisms that spread and transport wake vortices. The purpose of the study is to continue the development of relationships that increase the accuracy of estimates for the along-trail separation distances available before the vortex wake of a leading aircraft intrudes into the airspace of a following aircraft. Details of the models used and their uncertainties are presented in the appendices to the paper. Suggestions are made as to the theoretical and experimental research needed to increase the accuracy of, and confidence level in, the models presented, and as to the instrumentation required for more precise estimates of the motion and spread of vortex wakes. The improved wake models indicate that, if the following aircraft is upwind of the leading aircraft, the vortex wakes of the leading aircraft will not intrude into the airspace of the following aircraft for about 7 s (based on pessimistic assumptions) for most atmospheric conditions. The wake-spreading models also indicate that longer time intervals before wake intrusion are available when atmospheric turbulence levels are mild or moderate. However, if the estimates for those time intervals are to be reliable, further study is necessary to develop the instrumentation and procedures needed to accurately define when the more benign atmospheric conditions exist.

  7. A timeline for predicting durable medical equipment needs and interventions for amyotrophic lateral sclerosis patients.

    PubMed

    Bromberg, Mark B; Brownell, Alexander A; Forshew, Dallas A; Swenson, Michael

    2010-01-01

    ALS is progressive, with increasing patient needs for durable medical equipment (DME) and interventions (gastric feeding tube, PEG; and non-invasive ventilation, NIV). We performed a chart review of deceased patients to determine the time-course of needs and their estimated costs. A timeline of needs was based on when clinic personnel felt an item was necessary. The point in time when an item or intervention was needed was expressed as a percentage of a patient's total disease duration. A wide range of DME and interventions was needed irrespective of the site of ALS symptom onset (bulbar, upper extremity, lower extremity), beginning at 10% of disease duration for lower extremity onset and increasing thereafter for all sites. Cumulative costs of items and interventions began to accrue at 25%-50% of disease duration and increased to between $18,000 and $32,000 (USD), highest for lower extremity onset due to the cost of wheelchairs. We conclude that a high percentage of ALS patients will need a full spectrum of major DME items and interventions during the second half of the disease duration, resulting in a linear rise in costs over that period.

  8. State of charge monitoring of vanadium redox flow batteries using half cell potentials and electrolyte density

    NASA Astrophysics Data System (ADS)

    Ressel, Simon; Bill, Florian; Holtz, Lucas; Janshen, Niklas; Chica, Antonio; Flower, Thomas; Weidlich, Claudia; Struckmann, Thorsten

    2018-02-01

    The operation of vanadium redox flow batteries requires reliable in situ state of charge (SOC) monitoring. In this study, two SOC estimation approaches for the negative half cell are investigated. First, in situ open circuit potential measurements are combined with Coulomb counting in a one-step calibration of SOC and Nernst potential which does not need additional reference SOCs. In-sample and out-of-sample SOCs are estimated and analyzed, and estimation errors ≤ 0.04 are obtained. In the second approach, temperature-corrected in situ electrolyte density measurements are used for the first time in vanadium redox flow batteries for SOC estimation. In-sample and out-of-sample SOC estimation errors ≤ 0.04 demonstrate the feasibility of this approach. Both methods allow recalibration during battery operation. The actual capacity obtained from SOC calibration can be used in a state-of-health model.
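
    A hedged sketch of the first approach, assuming a Nernst relation of the form E = E0 + (RT/F) ln((1 - SOC)/SOC) for the negative half cell and synthetic constant-current data; the paper's exact sign conventions and calibration details may differ:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    F, R, T = 96485.0, 8.314, 298.15  # C/mol, J/(mol K), K

    def residuals(p, t, current, ocp):
        """One-step calibration: fit formal potential E0, capacity Q (A s)
        and initial SOC so Coulomb counting matches the Nernst potential."""
        E0, Q, soc0 = p
        soc = np.clip(soc0 + np.cumsum(current * np.gradient(t)) / Q, 1e-3, 0.999)
        return ocp - (E0 + (R * T / F) * np.log((1 - soc) / soc))

    # Synthetic constant-current charge data (all values illustrative):
    t = np.linspace(0.0, 3600.0, 200)
    current = np.full_like(t, 1.0)
    true_E0, true_Q, true_soc0 = -0.25, 7200.0, 0.2
    soc = true_soc0 + np.cumsum(current * np.gradient(t)) / true_Q
    ocp = true_E0 + (R * T / F) * np.log((1 - soc) / soc)
    fit = least_squares(residuals, x0=[-0.2, 6000.0, 0.3], args=(t, current, ocp))
    print(fit.x)  # recovers E0, Q and the initial SOC without reference SOCs
    ```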

  9. A New Formulation of the Filter-Error Method for Aerodynamic Parameter Estimation in Turbulence

    NASA Technical Reports Server (NTRS)

    Grauer, Jared A.; Morelli, Eugene A.

    2015-01-01

    A new formulation of the filter-error method for estimating aerodynamic parameters in nonlinear aircraft dynamic models during turbulence was developed and demonstrated. The approach uses an estimate of the measurement noise covariance to identify the model parameters, their uncertainties, and the process noise covariance, in a relaxation method analogous to the output-error method. Prior information on the model parameters and uncertainties can be supplied, and a post-estimation correction to the uncertainty was included to account for colored residuals not considered in the theory. No tuning parameters that need adjustment by the analyst are used in the estimation. The method was demonstrated in simulation using the NASA Generic Transport Model and then applied to flight data from the subscale T-2 jet-engine transport aircraft. Modeling results in different levels of turbulence were compared with results from time-domain output-error and frequency-domain equation-error methods to demonstrate the effectiveness of the approach.

  10. Channel Training for Analog FDD Repeaters: Optimal Estimators and Cramér-Rao Bounds

    NASA Astrophysics Data System (ADS)

    Wesemann, Stefan; Marzetta, Thomas L.

    2017-12-01

    For frequency division duplex channels, a simple pilot loop-back procedure has been proposed that allows the estimation of the UL & DL channels at an antenna array without relying on any digital signal processing at the terminal side. For this scheme, we derive the maximum likelihood (ML) estimators for the UL & DL channel subspaces, formulate the corresponding Cramér-Rao bounds and show the asymptotic efficiency of both (SVD-based) estimators by means of Monte Carlo simulations. In addition, we illustrate how to compute the underlying (rank-1) SVD with quadratic time complexity by employing the power iteration method. To enable power control for the data transmission, knowledge of the channel gains is needed. Assuming that the UL & DL channels have on average the same gain, we formulate the ML estimator for the channel norm, and illustrate its robustness against strong noise by means of simulations.
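
    The rank-1 SVD via power iteration mentioned above can be sketched in a few lines; this generic version (iteration count and starting vector are arbitrary choices) is not the authors' implementation:

    ```python
    import numpy as np

    def rank1_svd(Y, iters=50):
        """Dominant singular triplet of Y by power iteration, avoiding a
        full SVD: each step costs two matrix-vector products."""
        v = np.random.default_rng(0).standard_normal(Y.shape[1])
        for _ in range(iters):
            v = Y.conj().T @ (Y @ v)   # one step of power iteration on Y^H Y
            v = v / np.linalg.norm(v)
        sigma = np.linalg.norm(Y @ v)
        u = (Y @ v) / sigma
        return u, sigma, v
    ```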

  11. Estimating effect of environmental contaminants on women's subfecundity for the MoBa study data with an outcome-dependent sampling scheme

    PubMed Central

    Ding, Jieli; Zhou, Haibo; Liu, Yanyan; Cai, Jianwen; Longnecker, Matthew P.

    2014-01-01

    Motivated by the needs of our ongoing environmental study within the Norwegian Mother and Child Cohort (MoBa) study, we consider an outcome-dependent sampling (ODS) scheme for failure-time data with censoring. Like the case-cohort design, the ODS design enriches the observed sample by selectively including certain failure subjects. We present an estimated maximum semiparametric empirical likelihood estimation (EMSELE) approach under the proportional hazards model framework. The asymptotic properties of the proposed estimator were derived. Simulation studies were conducted to evaluate the small-sample performance of our proposed method. Our analyses show that the proposed estimator and design are more efficient than the current default approach and other competing approaches. Applying the proposed approach to the data set from the MoBa study, we found a significant effect of an environmental contaminant on fecundability. PMID:24812419

  12. Online Estimation of Model Parameters of Lithium-Ion Battery Using the Cubature Kalman Filter

    NASA Astrophysics Data System (ADS)

    Tian, Yong; Yan, Rusheng; Tian, Jindong; Zhou, Shijie; Hu, Chao

    2017-11-01

    Online estimation of state variables, including state-of-charge (SOC), state-of-energy (SOE) and state-of-health (SOH), is crucial for the operational safety of lithium-ion batteries. In order to improve the estimation accuracy of these state variables, a precise battery model needs to be established. As the lithium-ion battery is a nonlinear time-varying system, the model parameters vary significantly with many factors, such as ambient temperature, discharge rate and depth of discharge. This paper presents an online estimation method of model parameters for lithium-ion batteries based on the cubature Kalman filter. The commonly used first-order resistor-capacitor equivalent circuit model is selected as the battery model, based on which the model parameters are estimated online. Experimental results show that the presented method can accurately track the parameter variations in different scenarios.
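
    For reference, a sketch of the commonly used first-order RC model update that such a filter would embed as its process model (sign convention, parameter names and the OCV function are illustrative, not the paper's):

    ```python
    import numpy as np

    def rc_model_step(soc, v_rc, i, dt, p, ocv):
        """One discrete step of the first-order RC (Thevenin) model.
        Terminal voltage: v = ocv(soc) - R0*i - v_rc; positive i = discharge."""
        tau = p["R1"] * p["C1"]
        v_rc = v_rc * np.exp(-dt / tau) + p["R1"] * (1 - np.exp(-dt / tau)) * i
        soc = soc - dt * i / p["Q"]                  # Coulomb counting
        v_term = ocv(soc) - p["R0"] * i - v_rc
        return soc, v_rc, v_term
    ```

    In the online scheme described, R0, R1, C1 and the like would be augmented into the filter state so the cubature Kalman filter tracks them alongside the electrical states.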

  13. Estimating chronic hepatitis C prognosis using transient elastography-based liver stiffness: A systematic review and meta-analysis.

    PubMed

    Erman, A; Sathya, A; Nam, A; Bielecki, J M; Feld, J J; Thein, H-H; Wong, W W L; Grootendorst, P; Krahn, M D

    2018-05-01

    Chronic hepatitis C (CHC) is a leading cause of hepatic fibrosis and cirrhosis. The level of fibrosis is traditionally established by histology, and prognosis is estimated using fibrosis progression rates (FPRs; the annual probability of progressing across histological stages). However, newer noninvasive alternatives are quickly replacing biopsy. One alternative, transient elastography (TE), quantifies fibrosis by measuring liver stiffness (LSM). Given these developments, the purpose of this study was (i) to estimate prognosis in treatment-naïve CHC patients using TE-based liver stiffness progression rates (LSPRs) as an alternative to FPRs and (ii) to compare consistency between LSPRs and FPRs. A systematic literature search was performed using multiple databases (January 1990 to February 2016). LSPRs were calculated using either a direct method (given the difference in serial LSMs and the time elapsed) or an indirect method (given a single LSM and the estimated duration of infection) and pooled using random-effects meta-analyses. For validation purposes, FPRs were also estimated. Heterogeneity was explored by random-effects meta-regression. Twenty-seven studies reporting on 39 groups of patients (N = 5874) were identified, with 35 groups allowing indirect and 8 direct estimation of LSPRs. The majority (~58%) of patients were HIV/HCV-coinfected. The estimated time-to-cirrhosis based on TE vs biopsy was 39 and 38 years, respectively. In univariate meta-regressions, male sex and HIV coinfection were positively, and age at assessment negatively, associated with LSPRs. Noninvasive prognosis of HCV is consistent with FPRs in predicting time-to-cirrhosis, but more longitudinal studies of liver stiffness are needed to obtain refined estimates. © 2017 John Wiley & Sons Ltd.
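
    The two progression-rate definitions reduce to simple arithmetic; the healthy-baseline stiffness subtracted in the indirect case is an assumption of this sketch, not a value taken from the paper:

    ```python
    def lspr_direct(lsm1, lsm2, years_between):
        """Direct LSPR: change between serial stiffness measurements (kPa)
        divided by the time elapsed."""
        return (lsm2 - lsm1) / years_between

    def lspr_indirect(lsm, infection_years, baseline=5.0):
        """Indirect LSPR from a single measurement and the estimated
        infection duration; the 5.0 kPa healthy baseline is an assumption."""
        return (lsm - baseline) / infection_years

    print(lspr_indirect(9.5, 15.0))  # 0.3 kPa/year for a 15-year infection
    ```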

  14. Blind source separation and localization using microphone arrays

    NASA Astrophysics Data System (ADS)

    Sun, Longji

    The blind source separation and localization problem for audio signals is studied using microphone arrays. Pure-delay mixtures of source signals, typically encountered in outdoor environments, are considered. Our proposed approach utilizes subspace methods, including the multiple signal classification (MUSIC) and estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithms, to estimate the directions of arrival (DOAs) of the sources from the collected mixtures. Since audio signals are generally considered broadband, the DOA estimates at frequencies with the largest sums of squared amplitudes are combined to obtain the final DOA estimates. Using the estimated DOAs, the corresponding mixing and demixing matrices are computed, and the source signals are recovered using the inverse short-time Fourier transform. Subspace methods take advantage of the spatial covariance matrix of the collected mixtures to achieve robustness to noise. While subspace methods have been studied for localizing radio frequency signals, audio signals have their own special properties: they are nonstationary, naturally broadband and analog. All of these make the separation and localization of audio signals more challenging. Moreover, our algorithm is essentially equivalent to the beamforming technique, which suppresses the signals in unwanted directions and recovers only the signals in the estimated DOAs. Several crucial issues related to our algorithm and their solutions are discussed, including source number estimation, spatial aliasing, artifact filtering, different ways of mixture generation, and source coordinate estimation using multiple arrays. Additionally, comprehensive simulations and experiments have been conducted to examine various aspects of the algorithm. Unlike existing blind source separation and localization methods, which are generally time consuming, our algorithm needs signal mixtures of only a short duration and therefore supports real-time implementation.
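
    A minimal narrowband MUSIC sketch for a uniform linear array; in the broadband audio setting described above it would be applied per STFT frequency bin and the estimates from high-energy bins combined (the angle grid, half-wavelength spacing and the crude peak picking are assumptions of this sketch):

    ```python
    import numpy as np

    def music_doa(X, n_sources, d=0.5):
        """Narrowband MUSIC for a uniform linear array; X is the
        (n_mics, n_snapshots) matrix of complex snapshots at one frequency,
        d the element spacing in wavelengths."""
        m = X.shape[0]
        Rxx = X @ X.conj().T / X.shape[1]      # spatial covariance estimate
        _, vecs = np.linalg.eigh(Rxx)          # eigenvalues in ascending order
        En = vecs[:, : m - n_sources]          # noise subspace
        grid = np.linspace(-90.0, 90.0, 361)
        spec = np.empty(grid.size)
        for k, theta in enumerate(grid):
            a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(np.deg2rad(theta)))
            spec[k] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
        return grid[np.argsort(spec)[-n_sources:]]   # crude peak picking
    ```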

  15. Reaction time, inhibition, working memory and ‘delay aversion’ performance: genetic influences and their interpretation

    PubMed Central

    KUNTSI, JONNA; ROGERS, HANNAH; SWINARD, GREER; BÖRGER, NORBERT; van der MEERE, JAAP; RIJSDIJK, FRUHLING; ASHERSON, PHILIP

    2013-01-01

    Background: For candidate endophenotypes to be useful for psychiatric genetic research, they first of all need to show significant genetic influences. To address the relative lack of previous data, we set out to investigate the extent of genetic and environmental influences on performance in a set of theoretically driven cognitive-experimental tasks in a large twin sample. We further aimed to illustrate how the test–retest reliability of the measures affects the estimates. Method: Four hundred 7- to 9-year-old twin pairs were assessed individually on tasks measuring reaction time, inhibition, working memory and ‘delay aversion’ performance. Test–retest reliability data on some of the key measures were available from a previous study. Results: Several key measures of reaction time, inhibition and working-memory performance indicated a moderate degree of genetic influence. Combining data across theoretically related tasks increased the heritability estimates, as illustrated by the heritability estimates of 60% for mean reaction time and 50% for reaction-time variability. Psychometric properties (reliability or ceiling effects) had a substantial influence on the estimates for some measures. Conclusions: The data support the usefulness of several of the variables for endophenotype studies that aim to link genes to cognitive and motivational processes. Importantly, the data also illustrate specific conditions under which the true extent of genetic influences may be underestimated, and hence the usefulness for genetic mapping studies compromised, and suggest ways to address this. PMID:16882357

  16. Preliminary Outcomes from an Integrated Pediatric Mental Health Outpatient Clinic.

    PubMed

    Maslow, Gary R; Banny, Adrienne; Pollock, McLean; Stefureac, Kristen; Rosa, Kendra; Walter, Barbara Keith; Hobbs Knutson, Katherine; Lucas, Joseph; Heilbron, Nicole

    2017-10-01

    An estimated 1 in 5 children in the United States meet criteria for a diagnosable mental disorder, yet fewer than 20% receive mental health services. Unmet need for psychiatric treatment may contribute to patterns of increasing use of the emergency department. This article describes an integrated pediatric evaluation center designed to prevent the need for treatment in emergency settings by increasing access to timely and appropriate care for emergent and critical mental health needs. Preliminary results showed that the center provided rapid access to assessment and treatment services for children and adolescents presenting with a wide range of psychiatric concerns. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A parallel implementation of a multisensor feature-based range-estimation method

    NASA Technical Reports Server (NTRS)

    Suorsa, Raymond E.; Sridhar, Banavar

    1993-01-01

    Many vision-based methods have been proposed to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. All such methods, however, require very high processing rates to achieve real-time performance. A system capable of supporting autonomous helicopter navigation will need to extract obstacle information from imagery at rates varying from ten frames per second to thirty or more frames per second, depending on the vehicle speed. Such a system will need to sustain billions of operations per second. To reach such high processing rates using current technology, a parallel implementation of the obstacle detection/ranging method is required. This paper describes an efficient and flexible parallel implementation of a multisensor feature-based range-estimation algorithm, targeted for helicopter flight, realized on both a distributed-memory and a shared-memory parallel computer.

  18. Application of minidisk infiltrometer to estimate soil water repellency

    NASA Astrophysics Data System (ADS)

    Alagna, Vincenzo; Iovino, Massimo; Bagarello, Vincenzo; Mataix-Solera, Jorge; Lichner, Ľubomír

    2016-04-01

    Soil water repellency (SWR) reduces the affinity of soils for water, with detrimental implications for plant growth as well as for hydrological processes. During the last decades it has become clear that SWR is much more widespread than formerly thought, having been reported for a wide variety of soils, land uses and climatic conditions. The repellency index (RI), based on the soil-ethanol to soil-water sorptivity ratio, was proposed to characterize subcritical SWR, that is, the situation where a low degree of repellency impedes infiltration but does not prevent it. The minidisk infiltrometer allows adequate field assessment of RI, inherently scaled to account for soil physical properties other than hydrophobicity (e.g., the volume, connectivity and geometry of pores) that directly influence the hydrological processes. There are, however, some issues that still need consideration. For example, use of a fixed time for both water and ethanol sorptivity estimation may lead to inaccurate RI values, given that water infiltration could be negligible whereas ethanol sorptivity could be overestimated due to the influence of gravity and lateral diffusion, which rapidly come into play when the infiltration process is very fast. Moreover, water and ethanol sorptivity values need to be determined at different infiltration sites, implying that a large number of replicated runs should be carried out to obtain a reliable estimate of RI for a given area. Minidisk infiltrometer tests, conducted under different initial soil moisture and management conditions at the experimental sites of Ciavolo, Trapani (Italy) and Javea, Alicante (East Spain), were used to investigate the best procedure for estimating RI. In particular, different techniques to estimate the water, Sw, and ethanol, Se, sorptivities were compared, including (i) a fixed 1-min time interval, (ii) the slope of the early-time 1D infiltration equation, and (iii) the two-term transient 3D infiltration equation that explicitly accounts for the effects of gravity and lateral expansion. According to Pekárová et al. (2015), the combination of all the ethanol and water sorptivities was used to calculate an aggregated repellency index, RIa, that accounts for the influence of spatial variability. Alternatively, the plot of cumulative water infiltration vs. the square root of time, exhibiting a clear "hockey-stick-like" shape, was used to estimate a single-test repellency index, RI∗, that overcomes the limitations of the traditional approach, given that information on both the hydrophobic and the wettable states of the soil is gathered from a single infiltration test. The mean RI values were affected by the technique used to estimate Sw and Se. In particular, the choice of a fixed time interval led to overestimation of RI by up to a factor of 3.2 compared with the other techniques. The RIa yielded unbiased estimates of the mean RI values and also allowed the variability of SWR within a given area to be quantified. A statistically significant relationship was found between RI∗ and RI, but also between RI∗ and the water repellency cessation time, that is, the time at which the hydrophobic soil turns wettable, indicating that RI∗ can potentially detect both the degree and the persistence of SWR. Pekárová P., Pekár J., Lichner Ľ. 2015. A new method for estimating soil water repellency index. Biologia, 70(11):1450-1455.
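
    For orientation, the classical RI computation reduces to two sorptivity fits and a ratio; the 1.95 factor follows Tillman et al.'s formulation, and the regression-through-zero estimator below is one simple choice among the techniques compared above:

    ```python
    import numpy as np

    def sorptivity(t, cum_infiltration):
        """Early-time sorptivity S from I = S*sqrt(t): regression of
        cumulative infiltration on sqrt(time), forced through the origin."""
        x = np.sqrt(np.asarray(t, dtype=float))
        I = np.asarray(cum_infiltration, dtype=float)
        return float(x @ I / (x @ x))

    def repellency_index(S_water, S_ethanol, scale=1.95):
        """RI = 1.95 * Se/Sw; the factor corrects for the viscosity and
        surface tension differences between ethanol and water."""
        return scale * S_ethanol / S_water
    ```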

  19. Inferring invasive species abundance using removal data from management actions

    USGS Publications Warehouse

    Davis, Amy J.; Hooten, Mevin B.; Miller, Ryan S.; Farnsworth, Matthew L.; Lewis, Jesse S.; Moxcey, Michael; Pepin, Kim M.

    2016-01-01

    Evaluation of the progress of management programs for invasive species is crucial for demonstrating impacts to stakeholders and for strategic planning of resource allocation. Estimates of abundance before and after management activities can serve as a useful metric of population management programs. However, many methods of estimating population size are too labor intensive and costly to implement, posing restrictive levels of burden on operational programs. Removal models are a reliable method for estimating abundance before and after management using data from the removal activities exclusively, thus requiring no work in addition to management. We developed a Bayesian hierarchical model to estimate abundance from removal data accounting for varying levels of effort, and used simulations to assess the conditions under which reliable population estimates are obtained. We applied this model to estimate site-specific abundance of an invasive species, feral swine (Sus scrofa), using removal data from aerial gunning in 59 site/time-frame combinations (480–19,600 acres) throughout Oklahoma and Texas, USA. Simulations showed that abundance estimates were generally accurate when effective removal rates (the removal rate accounting for total effort) were above 0.40. However, when abundances were small (<50), the effective removal rate needed to accurately estimate abundance was considerably higher (0.70). Based on our post-validation method, 78% of our site/time-frame estimates were accurate. To use this modeling framework it is important to have multiple removals (more than three) within a time frame during which demographic changes are minimized (i.e., a closed population; ≤3 months for feral swine). Our results show that the probability of accurately estimating abundance from this model improves with increased sampling effort (8+ flight hours across the 3-month window is best) and increased removal rate. Based on the inverse relationship between inaccurate abundances and inaccurate removal rates, we suggest auxiliary information that could be collected and included in the model as covariates (e.g., habitat effects, differences between pilots) to improve the accuracy of removal rates and hence abundance estimates.
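
    A simplified, non-Bayesian stand-in for the removal estimator (constant removal probability, uniform effort, grid search over N) to illustrate the underlying likelihood; the paper's hierarchical model additionally handles varying effort:

    ```python
    import numpy as np
    from scipy.stats import binom

    def removal_mle(counts, n_max=2000):
        """Grid MLE for initial abundance N under a constant per-pass
        removal probability p (closed population, uniform effort)."""
        counts = np.asarray(counts)
        cum = np.concatenate(([0], np.cumsum(counts)[:-1]))  # removed so far
        best = (-np.inf, None, None)
        for N in range(int(counts.sum()), n_max):
            p = counts.sum() / (N - cum).sum()  # conditional MLE of p given N
            ll = binom.logpmf(counts, N - cum, p).sum()
            if ll > best[0]:
                best = (ll, N, p)
        return best[1], best[2]

    print(removal_mle([120, 80, 50]))  # e.g. three aerial-gunning passes
    ```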

  20. Design and Implementation of Real-Time Vehicular Camera for Driver Assistance and Traffic Congestion Estimation

    PubMed Central

    Son, Sanghyun; Baek, Yunju

    2015-01-01

    As society has developed, the number of vehicles has increased and road conditions have become complicated, increasing the risk of crashes. Therefore, a service that provides safe vehicle control and various types of information to the driver is urgently needed. In this study, we designed and implemented a real-time traffic information system and a smart camera device for smart driver assistance systems. We selected a commercial device for the smart driver assistance systems, and applied a computer vision algorithm to perform image recognition. For application to the dynamic region of interest, dynamic frame skip methods were implemented to perform parallel processing in order to enable real-time operation. In addition, we designed and implemented a model to estimate congestion by analyzing traffic information. The performance of the proposed method was evaluated using images of a real road environment. We found that the processing time improved by 15.4 times when all the proposed methods were applied in the application. Further, we found experimentally that there was little or no change in the recognition accuracy when the proposed method was applied. Using the traffic congestion estimation model, we also found that the average error rate of the proposed model was 5.3%. PMID:26295230

  2. Improved efficiency of maximum likelihood analysis of time series with temporally correlated errors

    USGS Publications Warehouse

    Langbein, John O.

    2017-01-01

    Most time series of geophysical phenomena have temporally correlated errors. From these measurements, various parameters are estimated. For instance, from geodetic measurements of positions, the rates and changes in rates are often estimated and are used to model tectonic processes. Along with the estimates of the size of the parameters, the error in these parameters needs to be assessed. If temporal correlations are not taken into account, or each observation is assumed to be independent, it is likely that any estimate of the error of these parameters will be too low and the estimated value of the parameter will be biased. Inclusion of better estimates of uncertainties is limited by several factors, including selection of the correct model for the background noise and the computational requirements to estimate the parameters of the selected noise model for cases where there are numerous observations. Here, I address the second problem of computational efficiency using maximum likelihood estimates (MLE). Most geophysical time series have background noise processes that can be represented as a combination of white and power-law noise, 1/f^α with frequency, f. With missing data, standard spectral techniques involving FFTs are not appropriate. Instead, time domain techniques involving construction and inversion of large data covariance matrices are employed. Bos et al. (J Geod, 2013. doi:10.1007/s00190-012-0605-0) demonstrate one technique that substantially increases the efficiency of the MLE methods, yet is only an approximate solution for power-law indices >1.0 since they require the data covariance matrix to be Toeplitz. That restriction can be removed by simply forming a data filter that adds noise processes rather than combining them in quadrature. Consequently, the inversion of the data covariance matrix is simplified yet provides robust results for a wider range of power-law indices.
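
    A sketch of the additive construction referred to above: power-law noise generated by a fractional-integration filter (Hosking/Kasdin recursion for the coefficients) is simply added to white noise rather than combined in quadrature; amplitudes and series length are arbitrary choices here:

    ```python
    import numpy as np

    def powerlaw_noise(n, alpha, rng):
        """1/f^alpha noise: white noise passed through the fractional
        integration filter with recursively computed impulse response."""
        h = np.empty(n)
        h[0] = 1.0
        d = alpha / 2.0
        for k in range(1, n):
            h[k] = h[k - 1] * (k - 1 + d) / k
        return np.convolve(h, rng.standard_normal(n))[:n]

    rng = np.random.default_rng(2)
    # White and power-law processes added (not combined in quadrature):
    x = 0.5 * rng.standard_normal(4096) + powerlaw_noise(4096, 1.0, rng)
    ```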

  3. Bayesian Non-Stationary Flood Frequency Estimation at Ungauged Basins Using Climate Information and a Scaling Model

    NASA Astrophysics Data System (ADS)

    Lima, C. H.; Lall, U.

    2010-12-01

    Flood frequency statistical analysis most often relies on stationary assumptions, where distribution moments (e.g., mean, standard deviation) and associated flood quantiles do not change over time. In this sense, one expects that flood magnitudes and their frequency of occurrence will remain constant as observed in the historical record. However, evidence of inter-annual and decadal climate variability and anthropogenic change, as well as an apparent increase in the number and magnitude of flood events across the globe, have made the stationary assumption questionable. Here, we show how to estimate flood quantiles (e.g., the 100-year flood) at ungauged basins without needing to assume stationarity. A statistical model based on the well-known flow-area scaling law is proposed to estimate flood flows at ungauged basins. The slope and intercept scaling-law coefficients are assumed time-varying, and a hierarchical Bayesian model is used to include climate information and reduce parameter uncertainties. Cross-validated results from 34 streamflow gauges located in a nested basin in Brazil show that the proposed model is able to estimate flood quantiles at ungauged basins with remarkable skill compared with data-based estimates using the full record. The model as developed in this work is also able to simulate sequences of flood flows under global climate change, provided an appropriate climate index derived from a General Circulation Model is used as a predictor. The time-varying flood frequency estimates can be used for pricing insurance, in a forecast mode for flood preparations, and for the timing and location of infrastructure investments. [Figure: Non-stationary 95% interval estimate for the 100-year flood (shaded gray region) and 95% interval for the 100-year flood estimated from data (horizontal dashed and solid lines); the average distribution of the 100-year flood is shown in green on the right.]
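
    The backbone of the approach is the flow-area scaling law log Q = log c + θ log A; a static least-squares version (toy numbers, no hierarchy, no climate index) looks like this:

    ```python
    import numpy as np

    def fit_scaling_law(areas_km2, floods_m3s):
        """Fit log Q = log c + theta * log A across gauged basins; the law
        then transfers flood quantiles to ungauged basins in the region."""
        theta, logc = np.polyfit(np.log(areas_km2), np.log(floods_m3s), 1)
        return np.exp(logc), theta

    c, theta = fit_scaling_law([120.0, 800.0, 2500.0], [95.0, 420.0, 1050.0])
    print(c * 5000.0 ** theta)  # flood estimate for an ungauged 5000 km^2 basin
    ```

    In the paper's formulation the intercept and slope vary in time with hierarchical, climate-informed priors; the static fit above is only the deterministic core of that model.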

  5. Pearson correlation estimation for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Marwan, N.; Heitzig, J.; Kurths, J.

    2012-04-01

    Many applications in the geosciences call for the joint and objective analysis of irregular time series. For automated processing, robust measures of linear and nonlinear association are needed. Up to now, the standard approach would have been to reconstruct the time series on a regular grid, using linear or spline interpolation. Interpolation, however, comes with systematic side-effects, as it increases the auto-correlation in the time series. We have searched for the best method to estimate Pearson correlation for irregular time series, i.e. the one with the lowest estimation bias and variance. We adapted a kernel-based approach, using Gaussian weights. Pearson correlation is calculated, in principle, as a mean over products of previously centralized observations. In the regularly sampled case, observations in both time series were observed at the same time and thus the allocation of measurement values into pairs of products is straightforward. In the irregularly sampled case, however, measurements were not necessarily observed at the same time. Now, the key idea of the kernel-based method is to calculate weighted means of products, with the weight depending on the time separation between the observations. If the lagged correlation function is desired, the weights depend on the absolute difference between observation time separation and the estimation lag. To assess the applicability of the approach we used extensive simulations to determine the extent of interpolation side-effects with increasing irregularity of time series. We compared different approaches, based on (linear) interpolation, the Lomb-Scargle Fourier Transform, the sinc kernel and the Gaussian kernel. We investigated the role of kernel bandwidth and signal-to-noise ratio in the simulations. We found that the Gaussian kernel approach offers significant advantages and low Root-Mean Square Errors for regular, slightly irregular and very irregular time series. We therefore conclude that it is a good (linear) similarity measure that is appropriate for irregular time series with skewed inter-sampling time distributions.
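
    The Gaussian-kernel estimator described above can be written compactly; the bandwidth and the use of global standardization are choices of this sketch:

    ```python
    import numpy as np

    def gaussian_kernel_corr(tx, x, ty, y, lag=0.0, h=1.0):
        """Kernel-based Pearson correlation for irregular sampling: a
        weighted mean of products of standardized observations, weighted
        by a Gaussian in the observation-time mismatch (bandwidth h)."""
        tx, x = np.asarray(tx, float), np.asarray(x, float)
        ty, y = np.asarray(ty, float), np.asarray(y, float)
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        dt = tx[:, None] - ty[None, :] - lag   # all pairwise time offsets
        w = np.exp(-0.5 * (dt / h) ** 2)
        return float(np.sum(w * np.outer(x, y)) / np.sum(w))
    ```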

  6. Development of an extended Kalman filter for the self-sensing application of a spring-biased shape memory alloy wire actuator

    NASA Astrophysics Data System (ADS)

    Gurung, H.; Banerjee, A.

    2016-02-01

    This report presents the development of an extended Kalman filter (EKF) to harness the self-sensing capability of a shape memory alloy (SMA) wire, actuating a linear spring. The stress and temperature of the SMA wire, constituting the state of the system, are estimated using the EKF, from the measured change in electrical resistance (ER) of the SMA. The estimated stress is used to compute the change in length of the spring, eliminating the need for a displacement sensor. The system model used in the EKF comprises the heat balance equation and the constitutive relation of the SMA wire coupled with the force-displacement behavior of a spring. Both explicit and implicit approaches are adopted to evaluate the system model at each time-update step of the EKF. Next, in the measurement-update step, estimated states are updated based on the measured electrical resistance. It has been observed that for the same time step, the implicit approach consumes less computational time than the explicit method. To verify the implementation, EKF estimated states of the system are compared with those of an established model for different inputs to the SMA wire. An experimental setup is developed to measure the actual spring displacement and ER of the SMA, for any time-varying voltage applied to it. The process noise covariance is decided using a heuristic approach, whereas the measurement noise covariance is obtained experimentally. Finally, the EKF is used to estimate the spring displacement for a given input and the corresponding experimentally obtained ER of the SMA. The qualitative agreement between the EKF estimated displacement with that obtained experimentally reveals the true potential of this approach to harness the self-sensing capability of the SMA.
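
    A generic EKF cycle of the kind described, with f and h standing in for the coupled heat-balance/constitutive model and the stress-temperature-to-resistance map (the paper's specific model equations are not reproduced here):

    ```python
    import numpy as np

    def ekf_step(x, P, u, z, f, h, F, H, Q, Rm):
        """One EKF cycle: time update through the process model f, then a
        measurement update from the measured quantity via h (F, H return
        the Jacobians of f and h at the current state)."""
        x_pred = f(x, u)                      # e.g. stress and temperature
        P_pred = F(x, u) @ P @ F(x, u).T + Q
        y = z - h(x_pred)                     # innovation: measured ER - model ER
        S = H(x_pred) @ P_pred @ H(x_pred).T + Rm
        K = P_pred @ H(x_pred).T @ np.linalg.inv(S)
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
        return x_new, P_new
    ```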

  7. Imaging tooth enamel using zero echo time (ZTE) magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Rychert, Kevin M.; Zhu, Gang; Kmiec, Maciej M.; Nemani, Venkata K.; Williams, Benjamin B.; Flood, Ann B.; Swartz, Harold M.; Gimi, Barjor

    2015-03-01

    In an event where many thousands of people may have been exposed to levels of radiation that are sufficient to cause the acute radiation syndrome, we need technology that can estimate the absorbed dose on an individual basis for triage and meaningful medical decision making. Such dose estimates may be achieved using in vivo electron paramagnetic resonance (EPR) tooth biodosimetry, which measures the number of persistent free radicals that are generated in tooth enamel following irradiation. However, the accuracy of dose estimates may be impacted by individual variations in teeth, especially the amount and distribution of enamel in the inhomogeneous sensitive volume of the resonator used to detect the radicals. In order to study the relationship between interpersonal variations in enamel and EPR-based dose estimates, it is desirable to estimate these parameters nondestructively and without adding radiation to the teeth. Magnetic Resonance Imaging (MRI) is capable of acquiring structural and biochemical information without imparting additional radiation, which may be beneficial for many EPR dosimetry studies. However, the extremely short T2 relaxation time in tooth structures precludes tooth imaging using conventional MRI methods. Therefore, we used zero echo time (ZTE) MRI to image teeth ex vivo to assess enamel volumes and spatial distributions. Using these data in combination with the data on the distribution of the transverse radio frequency magnetic field from electromagnetic simulations, we then can identify possible sources of variations in radiation-induced signals detectable by EPR. Unlike conventional MRI, ZTE applies spatial encoding gradients during the RF excitation pulse, thereby facilitating signal acquisition almost immediately after excitation, minimizing signal loss from short T2 relaxation times. ZTE successfully provided volumetric measures of tooth enamel that may be related to variations that impact EPR dosimetry and facilitate the development of analytical procedures for individual dose estimates.

  8. Societal economic costs and benefits from death: another look.

    PubMed

    Stack, Steven

    2007-04-01

    B. Yang and D. Lester (2007) have produced an innovative contribution to the relevant literature. Unlike previous studies, they incorporate estimates of cost savings from suicide. Their argument could be strengthened in 3 ways. First, they may have underestimated some of the cost savings by relying on inflated estimates of mental health usage by suicidal persons. The present analysis shows that only 20% of suicidal individuals see a mental health professional during the last year of life, much lower than previous estimates. Further, persons dying of cancer are 4 times more likely than suicides to report high usage of medical services. Second, our economy relies heavily on the health care sector for job creation, so that we need to exercise caution in interpreting savings in medical care; such savings may also represent costs in employment opportunities for nurses, doctors, and other medical personnel. Third, an anticipated criticism, the costs of the grieving of significant others, needs to be considered. Suicidal persons are shown to have less dense social networks, a sign of fewer potential grievers than in the case of natural deaths. Future work is needed to adjust lost earnings for the lower occupational status of suicides; this is another reason why Yang and Lester may be underestimating cost savings from suicide.

  9. Different approaches to valuing the lost productivity of patients with migraine.

    PubMed

    Lofland, J H; Locklear, J C; Frick, K D

    2001-01-01

    To calculate and compare the human capital approach (HCA) and friction cost approach (FCA) methods for estimating the cost of lost productivity of migraineurs after the initiation of sumatriptan, from a US societal perspective. Secondary retrospective analysis of a prospective observational study. A mixed-model managed care organisation in western Pennsylvania, USA. Patients with migraine using sumatriptan therapy. Patient-reported questionnaires collected at baseline and at 3 and 6 months after initiation of sumatriptan therapy. The cost of lost productivity estimated with the HCA and FCA methods. Of the 178 patients who completed the study, 51% were full-time employees, 13% were part-time, 18% were not working and 17% changed work status. Twenty-four percent reported a clerical or administrative position. From the HCA, the estimated total cost of lost productivity for 6 months following the initiation of sumatriptan was $US117905 (1996 values). From the FCA, the six-month estimated total cost of lost productivity ranged from $US28329 to $US117905 (1996 values). This was the first study to retrospectively estimate the lost productivity of patients with migraine using the FCA methodology. Our results demonstrate that, depending on the assumptions employed, the FCA can yield lost productivity estimates that vary greatly as a percentage of the HCA estimate. Prospective investigations are needed to better determine the components and the nature of lost productivity for chronic episodic diseases such as migraine headache.
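
    The two valuation rules differ only in how absence time is capped and scaled; a toy comparison follows, in which the wage, friction period and the 0.8 elasticity are illustrative values (the elasticity following common FCA practice), not figures from the study:

    ```python
    def human_capital_cost(daily_wage, days_absent):
        """HCA: every day of absence is valued as lost output."""
        return daily_wage * days_absent

    def friction_cost(daily_wage, days_absent, friction_days, elasticity=0.8):
        """FCA: output is lost only until a replacement is productive (the
        friction period), scaled by a labour-time elasticity."""
        return daily_wage * min(days_absent, friction_days) * elasticity

    # Toy numbers: the FCA estimate never exceeds the HCA one, consistent
    # with the range of FCA results reported above.
    print(human_capital_cost(160.0, 3), friction_cost(160.0, 3, 90))
    ```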

  10. Exploring the use of WRF-3DVar for Estimating reference evapotranspiration in semi arid regions

    NASA Astrophysics Data System (ADS)

    Bray, Michaela; Liu, Jia; Abdulhamza, Ali; Bocklemann-Evans, Bettina

    2013-04-01

    Evapotranspiration is an important process in hydrology and is central to the analysis of water balances and water resource management. Significant water losses can occur in large drainage basins under semi-arid climate conditions; moreover, with the lack of measured data, the exact losses are hard to quantify. Since direct measurements of evapotranspiration are difficult to obtain, it is common to estimate the process using evapotranspiration models such as the Priestley-Taylor model, the Shuttleworth-Wallace model and the FAO Penman-Monteith equation. However, these models depend on several atmospheric variables such as atmospheric pressure, wind speed, air temperature, net radiation and relative humidity. Some of these variables are also difficult to acquire from in-situ measurements; in addition, these measurements provide local information which needs to be interpolated to cover larger catchment areas over long time scales. Mesoscale Numerical Weather Prediction (NWP) modelling has become more accessible to the hydrometeorological community in recent years and is frequently used for modelling precipitation at the catchment scale. However, NWP models can also provide the atmospheric variables needed for evapotranspiration estimation at finer resolutions than can be attained from in-situ measurements, offering a practical water resource tool. Moreover, there is evidence that assimilation of real-time observations can help improve the accuracy of mesoscale weather modelling, which in turn would improve the overall evapotranspiration estimate. This study explores the effect of data assimilation in the Weather Research and Forecasting (WRF) model to derive evapotranspiration estimates for the Tigris water basin, Iraq. Two types of traditional observations, SYNOP and SOUND, which contain surface and upper-level measurements of pressure, temperature, humidity and wind, are assimilated by WRF-3DVAR. The downscaled weather variables are used to determine evapotranspiration estimates, which are compared with observed evapotranspiration data measured by a Class A evaporation pan.
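
    For context, the standard FAO-56 Penman-Monteith reference evapotranspiration that such downscaled variables would feed can be computed directly; this is the textbook formulation and is not tied to the study's configuration:

    ```python
    import numpy as np

    def fao56_et0(T, u2, Rn, RH, P=101.3, G=0.0):
        """FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
        T: mean air temperature (C), u2: wind at 2 m (m/s), Rn: net
        radiation (MJ m-2 day-1), RH: relative humidity (%), P: pressure (kPa)."""
        es = 0.6108 * np.exp(17.27 * T / (T + 237.3))  # saturation vapour pressure
        ea = es * RH / 100.0                           # actual vapour pressure
        delta = 4098.0 * es / (T + 237.3) ** 2         # slope of the es curve
        gamma = 0.000665 * P                           # psychrometric constant
        return ((0.408 * delta * (Rn - G)
                 + gamma * 900.0 / (T + 273.0) * u2 * (es - ea))
                / (delta + gamma * (1.0 + 0.34 * u2)))

    print(fao56_et0(T=28.0, u2=2.5, Rn=18.0, RH=35.0))  # a semi-arid summer day
    ```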

  11. Water consumption by nuclear powerplants and some hydrological implications

    USGS Publications Warehouse

    Giusti, Ennio V.; Meyer, E.L.

    1977-01-01

    Published data show that estimated water consumption varies with the cooling system adopted, being least in once-through cooling (about 18 cubic feet per second per 1,000 megawatts electrical) and greatest in closed cooling with mechanical draft towers (about 30 cubic feet per second per 1,000 megawatts electrical). When freshwater is used at this magnitude, water-resources economy may be affected in a given region. The critical need for cooling water at all times by the nuclear powerplant industry, coupled with the knowledge that water withdrawal in the basin will generally increase with time and will be at a maximum during low-flow periods, indicates a need for reexamination of the design low flow currently adopted and the methods used to estimate it. The amount of power generated, the name of the cooling water source, and the cooling method adopted for all nuclear powerplants projected to be in operation by 1985 in the United States are tabulated and the estimated annual evaporation at each powerplant site is shown on a map of the conterminous United States. Another map is presented that shows all nuclear powerplants located on river sites as well as stream reaches in the United States where the 7-day, 10-year low flow is at least 300 cubic feet per second or where this amount of flow can be developed with storage. (Woodard-USGS)

  12. A Statistical Weather-Driven Streamflow Model: Enabling future flow predictions in data-scarce headwater streams

    NASA Astrophysics Data System (ADS)

    Rosner, A.; Letcher, B. H.; Vogel, R. M.

    2014-12-01

    Predicting streamflow in headwaters and over broad spatial scales poses unique challenges due to limited data availability. Flow observation gages for headwater streams are less common than for larger rivers, and gages with record lengths of ten years or more are even more scarce. Thus, there is a great need for estimating streamflows in ungaged or sparsely gaged headwaters. Further, there is often insufficient basin information to develop rainfall-runoff models that could be used to predict future flows under various climate scenarios. Headwaters in the northeastern U.S. are of particular concern to aquatic biologists, as these streams serve as essential habitat for native coldwater fish. In order to understand fish response to past or future environmental drivers, estimates of seasonal streamflow are needed. While flow data are limited, there is a wealth of data on historic weather conditions; observed data have been modeled to interpolate a spatially continuous historic weather dataset (Maurer et al. 2002). We present a statistical model developed by pairing streamflow observations with precipitation and temperature information for the same and preceding time steps, and demonstrate its use to predict flow metrics at the seasonal time step. While not a physical model, this statistical model represents the weather drivers. Since the model can predict flows not directly tied to reference gages, we can generate flow estimates for historic as well as potential future conditions.
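
    A minimal version of such a weather-driven statistical model, using same-season and one-season-lagged predictors in ordinary least squares; the actual model's form and predictors are not specified in the abstract, so this is only a plausible sketch:

    ```python
    import numpy as np

    def fit_flow_model(precip, temp, flow):
        """Least squares with same-season and one-season-lagged weather
        predictors; returns the coefficient vector beta."""
        precip, temp, flow = (np.asarray(a, float) for a in (precip, temp, flow))
        X = np.column_stack([np.ones(len(flow) - 1),
                             precip[1:], precip[:-1],   # current, lagged precip
                             temp[1:], temp[:-1]])      # current, lagged temp
        beta, *_ = np.linalg.lstsq(X, flow[1:], rcond=None)
        return beta
    # Applied to gridded historic weather, the fitted model can then predict
    # seasonal flows at ungaged sites or under scenario weather inputs.
    ```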

  13. Virtual water trade and time scales for loss of water sustainability: a comparative regional analysis.

    PubMed

    Goswami, Prashant; Nishad, Shiv Narayan

    2015-03-20

    Assessment and policy design for sustainability in primary resources like arable land and water need to adopt a long-term perspective; even small but persistent effects like the net export of water may influence sustainability through irreversible losses. With growing consumption, this virtual water trade has become an important element in the water sustainability of a nation. We estimate and contrast the virtual (embedded) water trades of two populous nations, India and China, to present certain quantitative measures and time scales. Estimates show that the export of embedded water alone can lead to loss of water sustainability. At the current rate of net export of water embedded in end products, India is poised to lose its entire available water in less than 1000 years; much shorter time scales are implied in terms of water for production. The two cases contrast and exemplify sustainable and non-sustainable virtual water trade in a long-term perspective.

  15. Real-time stereo matching using orthogonal reliability-based dynamic programming.

    PubMed

    Gong, Minglun; Yang, Yee-Hong

    2007-03-01

    A novel algorithm is presented in this paper for estimating reliable stereo matches in real time. Based on the dynamic programming-based technique we previously proposed, the new algorithm can generate semi-dense disparity maps using as few as two dynamic programming passes. The iterative best path tracing process used in traditional dynamic programming is replaced by a local minimum searching process, making the algorithm suitable for parallel execution. Most computations are implemented on programmable graphics hardware, which improves the processing speed and makes real-time estimation possible. The experiments on the four new Middlebury stereo datasets show that, on an ATI Radeon X800 card, the presented algorithm can produce reliable matches for 60%–80% of pixels at rates of 10–20 frames per second. If needed, the algorithm can be configured for generating full-density disparity maps.

  16. Understanding unmet need: history, theory, and measurement.

    PubMed

    Bradley, Sarah E K; Casterline, John B

    2014-06-01

    During the past two decades, estimates of unmet need have become an influential measure for assessing population policies and programs. This article recounts the evolution of the concept of unmet need, describes how demographic survey data have been used to generate estimates of its prevalence, and tests the sensitivity of these estimates to various assumptions in the unmet need algorithm. The algorithm uses a complex set of assumptions to identify women: who are sexually active, who are infecund, whose most recent pregnancy was unwanted, who wish to postpone their next birth, and who are postpartum amenorrheic. The sensitivity tests suggest that defensible alternative criteria for identifying four out of five of these subgroups of women would increase the estimated prevalence of unmet need. The exception is identification of married women who are sexually active; more accurate measurement of this subgroup would reduce the estimated prevalence of unmet need in most settings. © 2013 The Population Council, Inc.

  17. Estimating crop net primary production using inventory data and MODIS-derived parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandaru, Varaprasad; West, Tristram O.; Ricciuto, Daniel M.

    2013-06-03

    National estimates of spatially-resolved cropland net primary production (NPP) are needed for diagnostic and prognostic modeling of carbon sources, sinks, and net carbon flux. Cropland NPP estimates that correspond with existing cropland cover maps are needed to drive biogeochemical models at the local scale and over national and continental extents. Existing satellite-based NPP products tend to underestimate NPP on croplands. A new Agricultural Inventory-based Light Use Efficiency (AgI-LUE) framework was developed to estimate individual crop biophysical parameters for use in estimating crop-specific NPP. The method is documented here and evaluated for corn and soybean crops in Iowa and Illinois in years 2006 and 2007. The method includes a crop-specific enhanced vegetation index (EVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS), shortwave radiation data estimated using the Mountain Climate Simulator (MTCLIM) algorithm, and crop-specific LUE per county. The combined aforementioned variables were used to generate spatially-resolved, crop-specific NPP that correspond to the Cropland Data Layer (CDL) land cover product. The modeling framework represented well the gradient of NPP across Iowa and Illinois, and also well represented the difference in NPP between years 2006 and 2007. Average corn and soybean NPP from AgI-LUE was 980 g C m-2 yr-1 and 420 g C m-2 yr-1, respectively. This was 2.4 and 1.1 times higher, respectively, for corn and soybean compared to the MOD17A3 NPP product. Estimated gross primary productivity (GPP) derived from AgI-LUE was in close agreement with eddy flux tower estimates. The combination of new inputs and improved datasets enabled the development of spatially explicit and reliable NPP estimates for individual crops over large regional extents.
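
    A schematic of the light-use-efficiency chain described above; the linear EVI-to-fPAR mapping and all coefficient values here are illustrative assumptions rather than the AgI-LUE calibration:

    ```python
    def npp_lue(evi, par, lue, fpar_scale=1.25, fpar_offset=-0.1):
        """Light-use-efficiency NPP: fPAR approximated as a linear function
        of the crop-specific EVI (coefficients illustrative only), times
        incident PAR (MJ m-2) and a crop/county LUE (g C per MJ)."""
        fpar = min(max(fpar_scale * evi + fpar_offset, 0.0), 1.0)
        return fpar * par * lue

    # e.g. a corn pixel: seasonal PAR ~2400 MJ m-2, LUE ~0.9 g C/MJ (assumed)
    print(npp_lue(evi=0.55, par=2400.0, lue=0.9))
    ```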

  18. Recent im/migration to Canada linked to unmet health needs among sex workers in Vancouver, Canada: Findings of a longitudinal study

    PubMed Central

    Sou, Julie; Goldenberg, Shira M.; Duff, Putu; Nguyen, Paul; Shoveller, Jean; Shannon, Kate

    2017-01-01

    Despite universal health care in Canada, sex workers (SW) and im/migrants experience suboptimal health care access. In this analysis, we examined the correlates of unmet health needs among SWs in Metro Vancouver over time. Data from a longitudinal cohort of women SWs (AESHA) was used. Of 742 SWs, 25.5% reported unmet health needs at least once over the 4-year study period. In multivariable logistic regression using generalized estimating equations, recent im/migration had the strongest impact on unmet health needs; long-term im/migration, policing, and trauma were also important determinants. Legal and social supports to promote im/migrant SWs’ access to health care are recommended. PMID:28300492

  19. Why is quality estimation judgment fast? Comparison of gaze control strategies in quality and difference estimation tasks

    NASA Astrophysics Data System (ADS)

    Radun, Jenni; Leisti, Tuomas; Virtanen, Toni; Nyman, Göte; Häkkinen, Jukka

    2014-11-01

    To understand the viewing strategies employed in a quality estimation task, we compared two visual tasks: quality estimation and difference estimation. The estimation was done for pairs of natural images having small global changes in quality. Two groups of observers estimated the same set of images, but with different instructions: one group estimated the difference in quality and the other the difference between image pairs. The results demonstrated the use of different visual strategies in the two tasks. The quality estimation was found to include more visual planning during the first fixation than the difference estimation, but afterward needed only a few long fixations on the semantically important areas of the image. The difference estimation used many short fixations. Salient image areas were mainly attended to when these areas were also semantically important. The results support the hypothesis that these tasks' general characteristics (evaluation time, number of fixations, area fixated on) show differences in processing, but also suggest that examining only single fixations when comparing tasks is too narrow a view. When planning a subjective experiment, one must remember that a small change in the instructions might lead to a noticeable change in viewing strategy.

  20. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    NASA Astrophysics Data System (ADS)

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study the intra- and inter-site variability of five early drug metabolism and pharmacokinetics in vitro assays over time. First, a tool for process control is presented. It shows the overall assay variability, allows changes due to assay adjustments to be followed, and can additionally highlight other, potentially unexpected variations. Second, we define the minimum discriminatory difference/ratio to help projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Here again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.
