A Semantics-Based Approach to Construction Cost Estimating
ERIC Educational Resources Information Center
Niknam, Mehrdad
2015-01-01
A construction project requires collaboration among different organizations, such as the owner, designer, contractor, and resource suppliers. These organizations need to exchange information to improve their teamwork. Understanding the information created in other organizations requires specialized human resources. Construction cost estimating is one of…
MARC ES: a computer program for estimating medical information storage requirements.
Konoske, P J; Dobbins, R W; Gauker, E D
1998-01-01
During combat, documentation of medical treatment information is critical for maintaining continuity of patient care. However, knowledge of prior status and treatment of patients is limited to the information noted on a paper field medical card. The Multi-technology Automated Reader Card (MARC), a smart card, has been identified as a potential storage mechanism for casualty medical information. Focusing on data capture and storage technology, this effort developed a Windows program, MARC ES, to estimate storage requirements for the MARC. The program calculates storage requirements for a variety of scenarios using medical documentation requirements, casualty rates, and casualty flows and provides the user with a tool to estimate the space required to store medical data at each echelon of care for selected operational theaters. The program can also be used to identify the point at which data must be uploaded from the MARC if size constraints are imposed. Furthermore, this model can be readily extended to other systems that store or transmit medical information.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-21
.... Estimated Cost (Operation and Maintenance): $0. IV. Public Participation--Submission of Comments on This... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... of OSHA's estimate of the burden (time and costs) of the information collection requirements...
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy HR is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process.
The information I increases as HR and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. The HR may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on HR, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in HR and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
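The framework's core quantity is an entropy over the distribution of requirements among quality levels. As an illustration only (the REF's exact estimator uses quantum statistics to count the arrangements P, which is not reproduced here), a minimal Shannon-entropy sketch over quality-level assignments; the function name is hypothetical:

```python
import math
from collections import Counter

def requirements_entropy(quality_levels):
    """Shannon entropy (bits) of a requirement-quality distribution.

    quality_levels: iterable giving the quality level assigned to each
    requirement. Entropy is maximal when requirements are spread evenly
    across levels and zero when all sit at one level (the desired state).
    """
    counts = Counter(quality_levels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

As engineering effort moves requirements toward the highest quality level, the distribution concentrates and this entropy falls toward zero, mirroring the REF's transition from disorder to the desired state.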
76 FR 60853 - Agency Information Collection Activities: Documents Required Aboard Private Aircraft
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... respondents or record keepers from the collection of information (a total of capital/startup costs and.... Estimated Number of Respondents: 120,000. Estimated Number of Annual Responses: 120,000. Estimated Time per...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
.... Estimated Total Burden Hours: 222,924. Estimated Cost (Operation and Maintenance): $0. IV. Public... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... of OSHA's estimate of the burden (time and costs) of the information collection requirements...
Improving size estimates of open animal populations by incorporating information on age
Manly, Bryan F.J.; McDonald, Trent L.; Amstrup, Steven C.; Regehr, Eric V.
2003-01-01
Around the world, a great deal of effort is expended each year to estimate the sizes of wild animal populations. Unfortunately, population size has proven to be one of the most intractable parameters to estimate. The capture-recapture estimation models most commonly used (of the Jolly-Seber type) are complicated and require numerous, sometimes questionable, assumptions. The derived estimates usually have large variances and lack consistency over time. In capture–recapture studies of long-lived animals, the ages of captured animals can often be determined with great accuracy and relative ease. We show how to incorporate age information into size estimates for open populations, where the size changes through births, deaths, immigration, and emigration. The proposed method allows more precise estimates of population size than the usual models, and it can provide these estimates from two sample occasions rather than the three usually required. Moreover, this method does not require specialized programs for capture-recapture data; researchers can derive their estimates using the logistic regression module in any standard statistical package.
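The paper's estimator is built on logistic regression over capture histories with age. As a simpler, generic illustration of two-occasion, age-stratified abundance estimation (not the authors' exact method), a Chapman-adjusted Lincoln-Petersen sketch, with the function name hypothetical:

```python
def stratified_lincoln_petersen(strata):
    """Chapman-adjusted Lincoln-Petersen estimate summed over age strata.

    strata: (n1, n2, m) per age class -- n1 animals marked on the first
    occasion, n2 captured on the second, m of those already marked.
    The Chapman adjustment keeps the estimate finite when m is small.
    """
    return sum((n1 + 1) * (n2 + 1) / (m + 1) - 1 for n1, n2, m in strata)
```

Stratifying by age lets capture probabilities differ across age classes, which is one way age information can sharpen a two-occasion estimate.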
78 FR 23944 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
...) will publish periodic summaries of proposed projects. To request more information on the proposed... estimates for these requirements are summarized in the table below: Annualized Burden Estimates Annual...
Kevin Megown; Andy Lister; Paul Patterson; Tracey Frescino; Dennis Jacobs; Jeremy Webb; Nicholas Daniels; Mark Finco
2015-01-01
The Image-based Change Estimation (ICE) protocols have been designed to respond to several Agency and Department information requirements. These include provisions set forth by the 2014 Farm Bill, the Forest Service Action Plan and Strategic Plan, the 2012 Planning Rule, and the 2015 Planning Directives. ICE outputs support the information needs by providing estimates...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
.... Estimated Cost (Operation and Maintenance): $54,197 IV. Public Participation--Submission of Comments on This... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... accuracy of OSHA's estimate of the burden (time and costs) of the information collection requirements...
A Role for Memory in Prospective Timing informs Timing in Prospective Memory
Waldum, Emily R; Sahakyan, Lili
2014-01-01
Time-based prospective memory (TBPM) tasks require the estimation of time in passing – known as prospective timing. Prospective timing is said to depend on an attentionally-driven internal clock mechanism, and is thought to be unaffected by memory for interval information (for reviews see, Block, Hancock, & Zakay, 2010; Block & Zakay, 1997). A prospective timing task that required a verbal estimate following the entire interval (Experiment 1) and a TBPM task that required production of a target response during the interval (Experiment 2) were used to test an alternative view that episodic memory does influence prospective timing. In both experiments, participants performed an ongoing lexical decision task of fixed duration while a varying number of songs were played in the background. Experiment 1 results revealed that verbal time estimates became longer the more songs participants remembered from the interval, suggesting that memory for interval information influences prospective time estimates. In Experiment 2, participants who were asked to perform the TBPM task without the aid of an external clock made their target responses earlier as the number of songs increased, indicating that prospective estimates of elapsed time increased as more songs were experienced. For participants who had access to a clock, changes in clock-checking coincided with the occurrence of song boundaries, indicating that participants used both song information and clock information to estimate time. Finally, ongoing task performance and verbal reports in both experiments further substantiate a role for episodic memory in prospective timing. PMID:22984950
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed in quantification of all the required parameters. The physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use the information obtained from aerial and ground data through an appropriate statistical sampling design.
Approximate sample sizes required to estimate length distributions
Miranda, L.E.
2007-01-01
The sample sizes required to estimate fish length were determined by bootstrapping from reference length distributions. Depending on population characteristics and species-specific maximum lengths, 1-cm length-frequency histograms required 375-1,200 fish to estimate within 10% with 80% confidence, 2.5-cm histograms required 150-425 fish, proportional stock density required 75-140 fish, and mean length required 75-160 fish. In general, smaller species, smaller populations, populations with higher mortality, and simpler length statistics required fewer samples. Indices that require low sample sizes may be suitable for monitoring population status, and when large changes in length are evident, additional sampling effort may be allocated to more precisely define length status with more informative estimators. © Copyright by the American Fisheries Society 2007.
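The bootstrap procedure described above can be sketched as follows for the mean-length statistic; the reference distribution, step size, and stopping rule here are illustrative assumptions, not the study's exact protocol:

```python
import random
import statistics

def required_sample_size(reference_lengths, rel_error=0.10, confidence=0.80,
                         n_boot=500, step=10, seed=1):
    """Smallest sample size (searched in increments of `step`) whose
    bootstrapped mean length falls within rel_error of the reference
    mean in at least `confidence` of the resamples."""
    rng = random.Random(seed)
    true_mean = statistics.fmean(reference_lengths)
    n = step
    while True:
        hits = sum(
            abs(statistics.fmean(rng.choices(reference_lengths, k=n)) - true_mean)
            <= rel_error * true_mean
            for _ in range(n_boot)
        )
        if hits / n_boot >= confidence:
            return n
        n += step
```

Tightening the relative-error target drives the required sample size up, which is the trade-off the abstract's 1-cm versus 2.5-cm histogram figures reflect.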
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors, using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
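A minimal bootstrap SISR filter for a scalar population model can be sketched as follows. The exponential-growth state model, noise levels, and function name are illustrative assumptions, and the kernel-smoothing step the authors use against particle depletion is omitted:

```python
import math
import random
import statistics

def particle_filter(counts, growth_prior, n_particles=2000, obs_cv=0.10, seed=0):
    """Bootstrap SISR filter for a simple population model N_t = r * N_{t-1}.

    Each particle carries (population N, growth rate r); r is drawn once
    from growth_prior = (low, high), so resampling gradually concentrates
    particles on growth rates consistent with the counts. Returns
    posterior-mean estimates of the final population size and of r.
    """
    rng = random.Random(seed)
    lo, hi = growth_prior
    parts = [(counts[0] * rng.uniform(0.8, 1.2), rng.uniform(lo, hi))
             for _ in range(n_particles)]
    for y in counts[1:]:
        # Propagate each particle through the state model with process noise.
        parts = [(n * r * rng.gauss(1.0, 0.02), r) for n, r in parts]
        # Weight by a Gaussian likelihood of the observed count.
        weights = []
        for n, _ in parts:
            sd = obs_cv * max(n, 1e-9)
            weights.append(math.exp(-0.5 * ((y - n) / sd) ** 2) / sd)
        # Resample (the "R" in SISR) to combat weight degeneracy.
        parts = rng.choices(parts, weights=weights, k=n_particles)
    return (statistics.fmean(n for n, _ in parts),
            statistics.fmean(r for _, r in parts))
```

A tighter growth_prior plays the role of the informative prior in the study: it starts the particles nearer the truth, so fewer are wasted on implausible growth rates.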
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple, fully validated cost models that provide estimation uncertainty along with the cost estimate, based on the COCOMO variable set. Use machine learning techniques to determine: (a) the minimum number of cost drivers required for NASA domain-based cost models; (b) the minimum number of data records required; and (c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: (a) tasks funded by PA&E Cost Analysis; (b) the IV&V Effort Estimation Task; and (c) NASA SEPG activities.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-03
... worker to obtain and post information for hoists. Total Burden Hours: 20,957. Estimated Cost (Operation... information is in the desired format, reporting burden (time and costs) is minimal, collection instruments are... accuracy of OSHA's estimate of the burden (time and costs) of the information collection requirements...
42 CFR 431.970 - Information submission requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... for Estimating Improper Payments in Medicaid and CHIP § 431.970 Information submission requirements... payments in Medicaid and CHIP, that include but are not limited to— (1) Adjudicated fee-for-service (FFS... contracts, rate information, and any quarterly updates applicable to the review year for CHIP and, as...
42 CFR 431.970 - Information submission requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... for Estimating Improper Payments in Medicaid and CHIP § 431.970 Information submission requirements... payments in Medicaid and CHIP, that include but are not limited to— (1) Adjudicated fee-for-service (FFS... contracts, rate information, and any quarterly updates applicable to the review year for CHIP and, as...
42 CFR 431.970 - Information submission requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for Estimating Improper Payments in Medicaid and CHIP § 431.970 Information submission requirements... payments in Medicaid and CHIP, that include but are not limited to— (1) Adjudicated fee-for-service (FFS... contracts, rate information, and any quarterly updates applicable to the review year for CHIP and, as...
42 CFR 431.970 - Information submission requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... for Estimating Improper Payments in Medicaid and CHIP § 431.970 Information submission requirements... payments in Medicaid and CHIP, that include but are not limited to— (1) Adjudicated fee-for-service (FFS... contracts, rate information, and any quarterly updates applicable to the review year for CHIP and, as...
42 CFR 431.970 - Information submission requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for Estimating Improper Payments in Medicaid and CHIP § 431.970 Information submission requirements... payments in Medicaid and CHIP, that include but are not limited to— (1) Adjudicated fee-for-service (FFS... contracts, rate information, and any quarterly updates applicable to the review year for CHIP and, as...
A Doppler centroid estimation algorithm for SAR systems optimized for the quasi-homogeneous source
NASA Technical Reports Server (NTRS)
Jin, Michael Y.
1989-01-01
Radar signal processing applications frequently require an estimate of the Doppler centroid of a received signal. The Doppler centroid estimate is required for synthetic aperture radar (SAR) processing. It is also required for some applications involving target motion estimation and antenna pointing direction estimation. In some cases, the Doppler centroid can be accurately estimated based on available information regarding the terrain topography, the relative motion between the sensor and the terrain, and the antenna pointing direction. Often, the accuracy of the Doppler centroid estimate can be improved by analyzing the characteristics of the received SAR signal. This kind of signal processing is also referred to as clutterlock processing. A Doppler centroid estimation (DCE) algorithm is described which contains a linear estimator optimized for the type of terrain surface that can be modeled by a quasi-homogeneous source (QHS). Information on the following topics is presented: (1) an introduction to the theory of Doppler centroid estimation; (2) analysis of the performance characteristics of previously reported DCE algorithms; (3) comparison of these analysis results with experimental results; (4) a description and performance analysis of a Doppler centroid estimator which is optimized for a QHS; and (5) comparison of the performance of the optimal QHS Doppler centroid estimator with that of previously reported methods.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-16
... information. \\5\\ Estimated number of hours an employee works each year = 2080, estimated average annual cost..., along, from, or in any of the streams or other bodies of water over which Congress has jurisdiction... water or water power from any Government dam. FERC-515: The information collected under the requirements...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-19
... date, we estimate that no more than one (1) person per year would be subject to this collection of information, and we do not anticipate receiving more than one report a year from any particular person...: Using the above estimate of one (1) affected person a year, with an estimated two (2) hours of...
Intertemporal consumption with directly measured welfare functions and subjective expectations
Kapteyn, Arie; Kleinjans, Kristin J.; van Soest, Arthur
2010-01-01
Euler equation estimation of intertemporal consumption models requires many, often unverifiable assumptions. These include assumptions on expectations and preferences. We aim at reducing some of these requirements by using direct subjective information on respondents’ preferences and expectations. The results suggest that individually measured welfare functions and expectations have predictive power for the variation in consumption across households. Furthermore, estimates of the intertemporal elasticity of substitution based on the estimated welfare functions are plausible and of a similar order of magnitude as other estimates found in the literature. The model favored by the data only requires cross-section data for estimation. PMID:20442798
Using Robust Variance Estimation to Combine Multiple Regression Estimates with Meta-Analysis
ERIC Educational Resources Information Center
Williams, Ryan
2013-01-01
The purpose of this study was to explore the use of robust variance estimation for combining commonly specified multiple regression models and for combining sample-dependent focal slope estimates from diversely specified models. The proposed estimator obviates traditionally required information about the covariance structure of the dependent…
Estimating sawmill processing capacity for Tongass timber: 2005 and 2006 update
Allen M. Brackley; Lisa K. Crone
2009-01-01
In spring 2006 and 2007, sawmill capacity and wood utilization information was collected for selected mills in southeast Alaska. The collected information is required to prepare information for compliance with Section 705(a) of the Tongass Timber Reform Act. The total estimated design capacity in the region (active and inactive mills) was 289,850 thousand board feet (...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-18
... import explosive materials or ammonium nitrate must, when required by the Director, furnish samples of such explosive materials or ammonium nitrate; information on chemical composition of those products... ammonium nitrate. (5) An estimate of the total number of respondents and the amount of time estimated for...
78 FR 54513 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-04
.... Estimated Time per Respondent: 1 hr. Estimated Total Annual Burden Hours: 1. Title: Indoor Tanning Services... (124 Stat. 119 (2010)) to impose an excise tax on indoor tanning services. This information is required to be maintained in order for providers to accurately calculate the tax on indoor tanning services...
Local Estimators for Spacecraft Formation Flying
NASA Technical Reports Server (NTRS)
Fathpour, Nanaz; Hadaegh, Fred Y.; Mesbahi, Mehran; Nabi, Marzieh
2011-01-01
A formation estimation architecture for formation flying builds upon the local information exchange among multiple local estimators. Spacecraft formation flying involves the coordination of states among multiple spacecraft through relative sensing, inter-spacecraft communication, and control. Most existing formation flying estimation algorithms can only be supported via highly centralized, all-to-all, static relative sensing. New algorithms are needed that are scalable, modular, and robust to variations in the topology and link characteristics of the formation exchange network. These distributed algorithms should rely on a local information-exchange network, relaxing the assumptions of existing algorithms. In this research, it was shown that only local observability is required to design a formation estimator and control law. The approach relies on breaking up the overall information-exchange network into a sequence of local subnetworks, and invoking an agreement-type filter to reach consensus among local estimators within each local network. State estimates were obtained by a set of local measurements that were passed through a set of communicating Kalman filters to reach an overall state estimation for the formation. An optimization approach was also presented by means of which diffused estimates over the network can be incorporated in the local estimates obtained by each estimator via local measurements. This approach compares favorably with that obtained by a centralized Kalman filter, which requires complete knowledge of the raw measurements available to each estimator.
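The agreement-type filtering idea can be illustrated with a plain averaging consensus over a local exchange network. This sketch omits the Kalman filtering and covariance handling of the actual architecture; the function names and step size are illustrative:

```python
def consensus_step(estimates, neighbors, epsilon=0.2):
    """One agreement-filter iteration: each node nudges its estimate toward
    those of its neighbors on the local information-exchange network.
    epsilon must be below 1/max_degree for the iteration to converge."""
    return [x + epsilon * sum(estimates[j] - x for j in neighbors[i])
            for i, x in enumerate(estimates)]

def run_consensus(estimates, neighbors, steps=100):
    """Iterate until the local estimates (approximately) agree."""
    for _ in range(steps):
        estimates = consensus_step(estimates, neighbors)
    return estimates
```

On a connected network with symmetric links, the nodes converge to the average of their initial estimates using only neighbor-to-neighbor exchanges, which is the property that removes the need for all-to-all sensing.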
Methods to estimate irrigated reference crop evapotranspiration - a review.
Kumar, R; Jat, M K; Shankar, V
2012-01-01
Efficient water management of crops requires accurate irrigation scheduling which, in turn, requires accurate measurement of crop water requirements. Irrigation is applied to replenish depleted moisture for optimum plant growth. Reference evapotranspiration plays an important role in the determination of water requirements for crops and irrigation scheduling. Various models/approaches, ranging from empirical to physically based, distributed ones, are available for the estimation of reference evapotranspiration. Mathematical models are useful tools to estimate the evapotranspiration and water requirement of crops, which is essential information for designing or choosing the best water management practices. In this paper the most commonly used models/approaches, which are suitable for the estimation of daily water requirements for agricultural crops grown in different agro-climatic regions, are reviewed. Further, an effort has been made to compare the accuracy of various widely used methods under different climatic conditions.
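As one example from the empirical end of that model spectrum, a sketch of the Hargreaves-Samani (1985) temperature-based reference evapotranspiration equation; the coefficient and units are as commonly published, and values should be verified against the primary source before use:

```python
def hargreaves_et0(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration, mm/day.

    t_mean, t_max, t_min: daily air temperatures in deg C.
    ra: extraterrestrial radiation expressed as equivalent
    evaporation in mm/day.
    """
    if t_max < t_min:
        raise ValueError("t_max must be >= t_min")
    return 0.0023 * (t_mean + 17.8) * (t_max - t_min) ** 0.5 * ra
```

Because it needs only temperature and tabulated radiation, this class of model suits data-sparse regions, at the cost of the accuracy that physically based methods can offer.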
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
...] General Services Administration Acquisition Regulation; Information Collection; Zero Burden Information... information collection requirement regarding zero burden information collection reports. Public comments are... utility; whether our estimate of the public burden of this collection of information is accurate and based...
Koltun, G.F.
2001-01-01
This report provides data and methods to aid in the hydrologic design or evaluation of impounding reservoirs and side-channel reservoirs used for water supply in Ohio. Data from 117 streamflow-gaging stations throughout Ohio were analyzed by means of nonsequential-mass-curve-analysis techniques to develop relations between storage requirements, water demand, duration, and frequency. Information also is provided on minimum runoff for selected durations and frequencies. Systematic record lengths for the streamflow-gaging stations ranged from about 10 to 75 years; however, in many cases, additional streamflow record was synthesized. For impounding reservoirs, families of curves are provided to facilitate the estimation of storage requirements as a function of demand and the ratio of the 7-day, 2-year low flow to the mean annual flow. Information is provided with which to evaluate separately the effects of evaporation on storage requirements. Comparisons of storage requirements for impounding reservoirs determined by nonsequential-mass-curve-analysis techniques with storage requirements determined by annual-mass-curve techniques that employ probability routing to account for carryover-storage requirements indicate that large differences in computed required storages can result from the two methods, particularly for conditions where demand cannot be met from within-year storage. For side-channel reservoirs, tables of demand-storage-frequency information are provided for a primary pump relation consisting of one variable-speed pump with a pumping capacity that ranges from 0.1 to 20 times demand. Tables of adjustment ratios are provided to facilitate determination of storage requirements for 19 other pump sets consisting of assorted combinations of fixed-speed pumps or variable-speed pumps with aggregate pumping capacities smaller than or equal to the primary pump relation. 
The effects of evaporation on side-channel reservoir storage requirements are incorporated into the storage-requirement estimates. The effects of an instream-flow requirement equal to the 80-percent-duration flow are also incorporated into the storage-requirement estimates.
NASA Technical Reports Server (NTRS)
Dickinson, William B.
1995-01-01
An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
.... Estimated Time per Response: 2 hours. Frequency of Response: On occasion reporting requirement. Obligation...,200 responses. Estimated Time per Response: 1 to 1.5 hours. Frequency of Response: On occasion... Time per Response: 2 to 5 hours. Frequency of Response: On occasion reporting requirement; Third party...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... Cost (Operation and Maintenance): $0. IV. Public Participation--Submission of Comments on This Notice... and costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the... information is useful; The accuracy of OSHA's estimate of the burden (time and costs) of the information...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
... instructional literature requirements in the Safety Standard for Infant walkers. DATES: Submit written or... for marking and instructional literature. We estimate the burden of this collection of information as... literature that are disclosure requirements, thus falling within the definition of ``collections of...
78 FR 47317 - Agency Information Collection Activities; Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-05
... availability of warranty terms and minimum standards for informal dispute settlement mechanisms that are...--requires a mix of legal analysis (50%), legal support (paralegals) (25%) and clerical help (25%). Staff estimates that half of the total burden hours (58,064 hours) requires legal analysis at an average hourly...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... estimated. The proposed rule will require that State agencies provide descriptive information regarding the... Burden on Respondents Section 272.12(3) requires that States provide both descriptive and analytic... analysis in the normal course of their own planning and decisionmaking. The descriptive information should...
Laurence, Caroline O; Heywood, Troy; Bell, Janice; Atkinson, Kaye; Karnon, Jonathan
2018-03-27
Health workforce planning models have been developed to estimate future health workforce requirements for the population they serve and have been used to inform policy decisions. The aim was to adapt and further develop a need-based GP workforce simulation model to incorporate the current and estimated geographic distribution of patients and GPs. A need-based simulation model that estimates the supply of GPs and levels of services required in South Australia (SA) was adapted and applied to the Western Australian (WA) workforce. The main outcome measure was the difference between the number of full-time equivalent (FTE) GPs supplied and required from 2013 to 2033. The base scenario estimated a shortage of GPs in WA from 2019 onwards, reaching 493 FTE GPs in 2033, while for SA, estimates showed an oversupply over the projection period. The WA urban and rural models estimated an urban shortage of GPs over this period. A reduced international medical graduate recruitment scenario resulted in estimated shortfalls of GPs by 2033 for both WA and SA. The WA-specific scenarios of lower population projections and registrar work value resulted in a reduced shortage of FTE GPs in 2033, while unfilled training places increased the shortfall of FTE GPs in 2033. The simulation model incorporates contextual differences in its structure that allow within- and cross-jurisdictional comparisons of workforce estimates. It also provides greater insight into the drivers of supply and demand and the impact of changes in workforce policy, promoting more informed decision-making.
78 FR 33116 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-03
... action to submit an information collection request to the Office of Management and Budget (OMB) and... report system.'' The total number of reports is estimated to be 350 per year. 4. Who is required or asked... practical utility? 2. Is the burden estimate accurate? 3. Is there a way to enhance the quality, utility...
NASA Astrophysics Data System (ADS)
Park, Jin-Young; Lee, Dong-Eun; Kim, Byung-Soo
2017-10-01
Due to increasing concern about climate change, efforts to reduce environmental load are continuously being made in the construction industry, and life cycle assessment (LCA) has been presented as an effective method for assessing environmental load. However, because LCA requires information on the construction quantities used for environmental load estimation, it is not utilized in environmental reviews during the early design phase, when such information is difficult to obtain. In this study, a computation system for construction quantities based on the standard cross sections of road drainage facilities was developed, so that the quantities required for LCA can be computed from only the information available in the early design phase, and a model performing environmental load estimation was developed and verified. With a mean absolute error rate of 13.39%, the results showed the model to be effective for use in the early design phase.
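The reported verification metric, a 13.39% mean absolute error rate, corresponds to a mean absolute percentage error between estimated and reference quantities; a minimal sketch with hypothetical values:

```python
def mean_absolute_error_rate(estimated, actual):
    """Mean absolute percentage error (as a percent) over paired values."""
    if len(estimated) != len(actual):
        raise ValueError("paired series required")
    return 100 * sum(abs(e - a) / a for e, a in zip(estimated, actual)) / len(actual)

# Hypothetical early-design estimates vs. detailed-design reference loads
est = [120.0, 95.0, 210.0]
act = [110.0, 100.0, 200.0]
rate = mean_absolute_error_rate(est, act)   # about 6.36 here
```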
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... clearly the definition of capital used in any aspect of its internal capital adequacy assessment process (ICAAP) and document any changes in the internal definition of capital. Section 41 requires banks to... regulatory or accounting). The agencies' burden estimates for these information collection requirements are...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... to total protein. FDA estimates the burden of this collection of information as follows: Table 1... Requirements for the Soy Protein and Risk of Coronary Heart Disease Health Claim AGENCY: Food and Drug... notice solicits comments on the record retention requirements for the soy protein and coronary heart...
NASA Technical Reports Server (NTRS)
Klich, P. J.; Macconochie, I. O.
1979-01-01
A study of an array of advanced earth-to-orbit space transportation systems with a focus on mass properties and technology requirements is presented. Methods of estimating weights of these vehicles differ from those used for commercial and military aircraft; the new techniques emphasizing winged horizontal and vertical takeoff advanced systems are described utilizing the space shuttle subsystem data base for the weight estimating equations. The weight equations require information on mission profile, the structural materials, the thermal protection system, and the ascent propulsion system, allowing for the type of construction and various propellant tank shapes. The overall system weights are calculated using this information and incorporated into the Systems Engineering Mass Properties Computer Program.
50 CFR 36.3 - Information collection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Information collection. 36.3 Section 36.3... Information collection. The information collection requirements contained in this part have been approved by... collection of information is estimated to average 1.5 hours each for 150 non-competitively awarded permits...
M. A. Wulder; J. C. White; B. J. Bentz
2005-01-01
Estimates of the location and extent of the red attack stage of mountain pine beetle (Dendroctonus ponderosae Hopkins) infestations are critical for forest management. The degree of spatial and temporal precision required for these estimates varies according to the management objectives and the nature of the infestation. This paper outlines a hierarchy of information…
Dispersion of Sound in Marine Sediments
2015-09-30
primary objective of this work is to investigate the approach to use the information in the extracted mode amplitudes to invert for sound attenuation...marine sediment. APPROACH Previous work carried out on the use of modal amplitude information for estimating sound attenuation in the sediments...investigate the intrinsic modal interference. Estimation of sound attenuation in marine sediments from modal amplitudes requires knowledge of the
78 FR 62932 - Agency Information Collection Activities: Proposed Request and Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
... the agency's burden estimate; the need for the information; its practical utility; ways to enhance its...-Employment Information, Employee Information, Employer Information--20 CFR 422.120-0960-0508. When SSA cannot... employers. While we need this information to ensure the correct payment of benefits, we do not require...
Coding “What” and “When” in the Archer Fish Retina
Vasserman, Genadiy; Shamir, Maoz; Ben Simon, Avi; Segev, Ronen
2010-01-01
Traditionally, the information content of the neural response is quantified using statistics of the responses relative to stimulus onset time with the assumption that the brain uses onset time to infer stimulus identity. However, stimulus onset time must also be estimated by the brain, making the utility of such an approach questionable. How can stimulus onset be estimated from the neural responses with sufficient accuracy to ensure reliable stimulus identification? We address this question using the framework of colour coding by the archer fish retinal ganglion cell. We found that stimulus identity, “what”, can be estimated from the responses of best single cells with an accuracy comparable to that of the animal's psychophysical estimation. However, to extract this information, an accurate estimation of stimulus onset is essential. We show that stimulus onset time, “when”, can be estimated using a linear-nonlinear readout mechanism that requires the response of a population of 100 cells. Thus, stimulus onset time can be estimated using a relatively simple readout. However, large nerve cell populations are required to achieve sufficient accuracy. PMID:21079682
48 CFR 252.215-7002 - Cost estimating system requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Department of Defense to rely upon information produced by the system that is needed for management purposes... management systems; and (4) Is subject to applicable financial control systems. Estimating system means the... estimates of costs and other data included in proposals submitted to customers in the expectation of...
Information systems - Issues in global habitability
NASA Technical Reports Server (NTRS)
Norman, S. D.; Brass, J. A.; Jones, H.; Morse, D. R.
1984-01-01
The present investigation is concerned with fundamental issues, related to information considerations, which arise in an interdisciplinary approach to questions of global habitability. Information system problems and issues are illustrated with the aid of an example involving biochemical cycling and biochemical productivity. The estimation of net primary production (NPP) as an important consideration in the overall global habitability issue is discussed. The NPP model requires three types of data, related to meteorological information, a land surface inventory, and the vegetation structure. Approaches for obtaining and processing these data are discussed. Attention is given to user requirements, information system requirements, workstations, network communications, hardware/software access, and data management.
Mathematical Modeling of Programmatic Requirements for Yaws Eradication
Mitjà, Oriol; Fitzpatrick, Christopher; Asiedu, Kingsley; Solomon, Anthony W.; Mabey, David C.W.; Funk, Sebastian
2017-01-01
Yaws is targeted for eradication by 2020. The mainstay of the eradication strategy is mass treatment followed by case finding. Modeling has been used to inform programmatic requirements for other neglected tropical diseases and could provide insights into yaws eradication. We developed a model of yaws transmission varying the coverage and number of rounds of treatment. The estimated number of cases arising from an index case (basic reproduction number [R0]) ranged from 1.08 to 3.32. To have 80% probability of achieving eradication, 8 rounds of treatment with 80% coverage were required at low estimates of R0 (1.45). This requirement increased to 95% at high estimates of R0 (2.47). Extending the treatment interval to 12 months increased requirements at all estimates of R0. At high estimates of R0 with 12 monthly rounds of treatment, no combination of variables achieved eradication. Models should be used to guide the scale-up of yaws eradication. PMID:27983500
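A deliberately crude deterministic caricature (not the authors' stochastic transmission model) shows why higher R0 estimates demand more treatment rounds at a fixed coverage:

```python
# Toy caricature of repeated mass treatment: each round removes a fraction
# `coverage` of infections; between rounds infections regrow by a factor tied
# to R0. This is NOT the paper's model, just an illustration of the tradeoff.

def rounds_to_elimination(cases0, r0, coverage, interval_growth=1.0,
                          max_rounds=50):
    cases = cases0
    for k in range(1, max_rounds + 1):
        cases *= (1 - coverage)          # mass treatment round
        if cases < 1:                    # fewer than one expected case left
            return k
        cases *= r0 * interval_growth    # regrowth before the next round
    return None                          # elimination not reached

low = rounds_to_elimination(cases0=10_000, r0=1.45, coverage=0.8)
high = rounds_to_elimination(cases0=10_000, r0=2.47, coverage=0.8)
```

At 80% coverage the toy model needs more rounds at the high R0 estimate than at the low one; raising `interval_growth` (a longer gap between rounds) or lowering `coverage` pushes the requirement up further, mirroring the qualitative findings above.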
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-21
... Officer, (202) 452-3829, Division of Research and Statistics, Board of Governors of the Federal Reserve... the information collection, including the validity of the methodology and assumptions used; (c) Ways... measures (such as regulatory or accounting). The agencies' burden estimates for these information...
77 FR 62273 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-12
... automating regulatory filings and business information processing. We estimate that 10,229 respondents per... Budget (``OMB'') this request for extension of the previously approved collection of information discussed below. The ``Interactive Data'' collection of information requires issuers filing registration...
Methods of adjusting the stable estimates of fertility for the effects of mortality decline.
Abou-Gamrah, H
1976-03-01
Summary: The paper shows how stable population methods, based on the age structure and the rate of increase, may be used to estimate the demographic measures of a quasi-stable population. After a discussion of known methods for adjusting the stable estimates to allow for the effects of mortality decline, two new methods are presented, the application of which requires less information. The first method does not need any supplementary information, and the second method requires an estimate of the difference between the last two five-year intercensal rates of increase, i.e. five times the annual change of the rate of increase during the last ten years. For these new methods we do not need to know the onset year of mortality decline, as in the Coale-Demeny method, or a long series of rates of increase, as in Zachariah's method.
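The arithmetic behind the second method's input is simple; with hypothetical numbers:

```python
# If the annual rate of increase r changes roughly linearly, the difference
# between two successive five-year intercensal (average annual) rates of
# increase is about five times the annual change in r. Illustrative values:

annual_change_in_r = 0.0004                           # r rises 0.04 pp per year
diff_of_intercensal_rates = 5 * annual_change_in_r    # about 0.002

# e.g. r averaged 0.020 over one intercensal period and 0.022 over the next:
r_prev, r_last = 0.020, 0.022
implied_annual_change = (r_last - r_prev) / 5         # recovers 0.0004 per year
```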
76 FR 53420 - Information Collection; Submission for OMB Review, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... requirements of successful applicants. Type of Review: New Information Collection. Agency: Corporation for... providers. Estimated Total Burden Hours: 658 hours. Total Burden Cost (capital/startup): None. Total Burden...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-12
...: Required to obtain or retain a benefit. Frequency of Collection: On occasion. Estimated Number of Annual... Form is required when the applicant has requested a solo hike, an itinerary considered overly ambitious for the typical hiker, or a summer hike outside the Corridor Zone. The form asks for information that...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-17
....209, 53.211, and 53.213, Accounting Safeguards and Sections 260 and 271-276 of the Communications Act... reporting requirements, third party disclosure requirement, and recordkeeping requirement. Obligation to... party disclosure requirements). There is no change in the Commission's burden estimates. A Bell...
1980-02-26
months estimated to be required in some areas), and more direct involvement of information users in long-range planning of information requirements (with...most people, there is a definite need to educate the members of the organization as to the implications of the IRM approach. Emphasis should be placed...from information sharing and a coordinated approach. Such an educational process has already begun with the execution of this study, but more must be
77 FR 10761 - Agency Information Collection Activities: Screening Requirements for Carriers
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... the collection of information (a total capital/ startup costs and operations and maintenance costs... per Respondent: 100 hours. Estimated Total Annual Burden Hours: 6,500. Dated: February 17, 2012...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... viewing on http://www.regulations.gov without change. All Personal Identifying Information (for example... Confidential Business Information or otherwise sensitive or protected information. NMFS will accept anonymous... burden-hour estimates or other aspects of the collection-of-information requirements contained in this...
78 FR 13157 - Proposed Collection; Comment Request for Form 13803
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
... submit the required information necessary to complete the e-services enrollment process for IVES users... information displays a valid OMB control number. Books or records relating to a collection of information must... forms of information technology; and (e) estimates of capital or start-up costs and costs of operation...
77 FR 38144 - Proposed Collection; Comment Request for Form 8594
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance... proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995... INFORMATION CONTACT: Requests for additional information or copies of the form and instructions should be...
76 FR 13447 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... forms of information technology; and (e) estimates of capital or start-up costs and costs of operation... opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork...., Washington, DC 20224. FOR FURTHER INFORMATION CONTACT: Requests for additional information or copies of the...
77 FR 4883 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... information technology; and (e) Estimates of capital or start-up costs and costs of operation, maintenance... DEPARTMENT OF THE TREASURY Agency Information Collection Activities: Submission for OMB Review... continuing information collection, as required by the Paperwork Reduction Act of 1995. An agency may not...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... information technology, e.g., permitting electronic submission of responses. Summary of Information Collection... specific medical standards and physical requirements. The information will be used to make a recommendation on either hiring or not hiring an applicant. (5) An estimate of the total number of respondents and...
75 FR 9003 - Agency Information Collection Activities; Request for Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... a previously approved information collection consisting of a customer survey form. OSC is required... practical utility; (b) the accuracy of OSC's estimate of the burden of the proposed collections of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-24
... zero hours and zero costs. Number of Respondents: We estimate that there will be no more than one per... our experience to date, we estimate that no more than one (1) person per year would be subject to this collection of information, and we do not anticipate receiving more than one report a year from any particular...
Rabideau, Dustin J; Pei, Pamela P; Walensky, Rochelle P; Zheng, Amy; Parker, Robert A
2018-02-01
The expected value of sample information (EVSI) can help prioritize research but its application is hampered by computational infeasibility, especially for complex models. We investigated an approach by Strong and colleagues to estimate EVSI by applying generalized additive models (GAM) to results generated from a probabilistic sensitivity analysis (PSA). For 3 potential HIV prevention and treatment strategies, we estimated life expectancy and lifetime costs using the Cost-effectiveness of Preventing AIDS Complications (CEPAC) model, a complex patient-level microsimulation model of HIV progression. We fitted a GAM (a flexible regression model that estimates the functional form as part of the model-fitting process) to the incremental net monetary benefits obtained from the CEPAC PSA. For each case study, we calculated the expected value of partial perfect information (EVPPI) using both the conventional nested Monte Carlo approach and the GAM approach. EVSI was calculated using the GAM approach. For all 3 case studies, the GAM approach consistently gave similar estimates of EVPPI compared with the conventional approach. The EVSI behaved as expected: it increased and converged to EVPPI for larger sample sizes. For each case study, generating the PSA results for the GAM approach required 3 to 4 days on a shared cluster, after which EVPPI and EVSI across a range of sample sizes were evaluated in minutes. The conventional approach required approximately 5 weeks for the EVPPI calculation alone. Estimating EVSI using the GAM approach with results from a PSA dramatically reduced the time required to conduct a computationally intense project, which would otherwise have been impractical. Using the GAM approach, we can efficiently provide policy makers with EVSI estimates, even for complex patient-level microsimulation models.
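For contrast, the conventional nested Monte Carlo EVPPI computation that the GAM approach replaces can be sketched on a toy decision model (the strategies, noise levels, and efficacy parameter below are invented for the sketch, not CEPAC outputs):

```python
import random

# Toy nested Monte Carlo EVPPI: two hypothetical strategies whose net monetary
# benefit depends on an uncertain efficacy parameter theta (the "partial"
# parameter of interest) plus residual PSA noise. All numbers are invented.

random.seed(1)

def net_benefit(strategy, theta, noise):
    base = {"A": 0.0, "B": -50.0}[strategy]          # strategy B costs more
    effect = 200.0 * theta if strategy == "B" else 0.0
    return base + effect + noise

def evppi_nested(n_outer=1000, n_inner=100):
    expected_max = 0.0                   # E_theta[ max_s E[NB | theta] ]
    totals = {"A": 0.0, "B": 0.0}        # for max_s E[NB] (current information)
    for _ in range(n_outer):
        theta = random.gauss(0.3, 0.1)   # outer loop: parameter of interest
        means = {s: sum(net_benefit(s, theta, random.gauss(0.0, 30.0))
                        for _ in range(n_inner)) / n_inner
                 for s in ("A", "B")}    # inner loop: remaining uncertainty
        expected_max += max(means.values())
        for s in means:
            totals[s] += means[s]
    expected_max /= n_outer
    value_current = max(t / n_outer for t in totals.values())
    return expected_max - value_current

evppi = evppi_nested()   # nonnegative by construction, up to rounding
```

Even this toy version needs n_outer × n_inner model evaluations per strategy, which is the cost the GAM regression on existing PSA output avoids.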
Filtering observations without the initial guess
NASA Astrophysics Data System (ADS)
Chin, T. M.; Abbondanza, C.; Gross, R. S.; Heflin, M. B.; Parker, J. W.; Soja, B.; Wu, X.
2017-12-01
Noisy geophysical observations sampled irregularly over space and time are often numerically "analyzed" or "filtered" before scientific usage. The standard analysis and filtering techniques based on the Bayesian principle require an "a priori" joint distribution of all the geophysical parameters of interest. However, such prior distributions are seldom known fully in practice, and best-guess mean values (e.g., "climatology" or "background" data if available) accompanied by some arbitrarily set covariance values are often used in their place. It is therefore desirable to be able to exploit efficient (time-sequential) Bayesian algorithms like the Kalman filter without being forced to provide a prior distribution (i.e., initial mean and covariance). An example of this is the estimation of the terrestrial reference frame (TRF), where the requirement for numerical precision is such that any use of a priori constraints on the observation data needs to be minimized. We will present the Information Filter algorithm, a variant of the Kalman filter that does not require an initial distribution, and apply the algorithm (and an accompanying smoothing algorithm) to the TRF estimation problem. We show that the information filter allows temporal propagation of partial information on the distribution (the marginal distribution of a transformed version of the state vector), instead of the full distribution (mean and covariance) required by the standard Kalman filter. The information filter appears to be a natural choice for the task of filtering observational data in general cases where a prior assumption on the initial estimate is not available or not desirable. For application to data assimilation problems, reduced-order approximations of both the information filter and square-root information filter (SRIF) have been published, and the former has previously been applied to a regional configuration of the HYCOM ocean general circulation model.
These approximation approaches are also summarized in the presentation.
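The filter's defining trick (additive measurement updates in information form, with zero information expressing a complete absence of a prior) can be shown in a scalar sketch; the noise values here are illustrative only:

```python
# Minimal scalar information filter: propagate the information Y = 1/P and the
# information state y = Y*x. Setting Y = 0 encodes "no initial guess", which a
# standard Kalman filter cannot do because it needs a finite initial covariance.

def information_filter(measurements, q=0.01, r=0.25):
    """Random-walk state, direct measurements with variance r."""
    Y, y = 0.0, 0.0                      # zero information: no prior at all
    x = 0.0                              # placeholder; unused until Y > 0
    estimates = []
    for z in measurements:
        if Y > 0:                        # prediction step (skipped while Y = 0)
            P = 1.0 / Y + q              # covariance grows by process noise q
            Y = 1.0 / P
            y = Y * x
        Y += 1.0 / r                     # update: measurement information adds
        y += z / r
        x = y / Y                        # recover the state estimate
        estimates.append(x)
    return estimates

est = information_filter([1.1, 0.9, 1.05, 0.98])
```

The first estimate equals the first measurement exactly, since no prior pulled it anywhere; later estimates blend measurements as information accumulates.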
Hierarchical models and Bayesian analysis of bird survey information
Sauer, J.R.; Link, W.A.; Royle, J. Andrew; Ralph, C. John; Rich, Terrell D.
2005-01-01
Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) Estimating attributes of collections of species estimates, including ranking of trends, estimating number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimation of regional population change, aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.
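The shrinkage behavior that makes hierarchical models useful for ranking noisy trends can be illustrated with a tiny empirical-Bayes sketch (the survey work itself uses fully Bayesian hierarchical models; every number below is invented):

```python
# Species trend estimates with differing precisions are shrunk toward a common
# mean, so imprecise extreme trends no longer dominate rankings. Hypothetical
# trends (%/yr) and standard errors; tau is an assumed between-species SD.

trends = [-6.0, -1.2, 0.5, 3.9]     # estimated trends
ses    = [ 4.0,  0.4, 0.6, 5.0]     # their standard errors

mu  = sum(trends) / len(trends)     # crude hyper-mean
tau = 1.5                           # assumed between-species SD

def shrink(est, se, mu, tau):
    w = (1 / se**2) / (1 / se**2 + 1 / tau**2)   # precision weight
    return w * est + (1 - w) * mu

shrunk = [shrink(t, s, mu, tau) for t, s in zip(trends, ses)]
# The poorly estimated extremes (-6.0, 3.9) move strongly toward mu,
# while the precisely estimated trends barely move.
```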
76 FR 44659 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-26
... information collection requirements related to new technologies in retirement plans. DATES: Written comments....gov . SUPPLEMENTARY INFORMATION: Title: New Technologies in Retirement Plans. OMB Number: 1545-1632... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance...
77 FR 59455 - Internal Revenue Service
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-27
...-37 describes documentation and information a taxpayer that uses the fair market value method of... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance... on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of...
NASA Technical Reports Server (NTRS)
Rediess, Herman A.; Ramnath, Rudrapatna V.; Vrable, Daniel L.; Hirvo, David H.; Mcmillen, Lowell D.; Osofsky, Irving B.
1991-01-01
The results are presented of a study to identify potential real time remote computational applications to support monitoring HRV flight test experiments along with definitions of preliminary requirements. A major expansion of the support capability available at Ames-Dryden was considered. The focus is on the use of extensive computation and data bases together with real time flight data to generate and present high level information to those monitoring the flight. Six examples were considered: (1) boundary layer transition location; (2) shock wave position estimation; (3) performance estimation; (4) surface temperature estimation; (5) critical structural stress estimation; and (6) stability estimation.
Estimating risk reduction required to break even in a health promotion program.
Ozminkowski, Ronald J; Goetzel, Ron Z; Santoro, Jan; Saenz, Betty-Jo; Eley, Christine; Gorsky, Bob
2004-01-01
To illustrate a formula to estimate the amount of risk reduction required to break even on a corporate health promotion program. A case study design was implemented. Base year (2001) health risk and medical expenditure data from the company, along with published information on the relationships between employee demographics, health risks, and medical expenditures, were used to forecast demographics, risks, and expenditures for 2002 through 2011 and estimate the required amount of risk reduction. Motorola. 52,124 domestic employees. Demographics included age, gender, race, and job type. Health risks for 2001 were measured via health risk appraisal. Risks were noted as either high or low and related to exercise/eating habits, body weight, blood pressure, blood sugar levels, cholesterol levels, depression, stress, smoking/drinking habits, and seat belt use. Medical claims for 2001 were used to calculate medical expenditures per employee. Assuming a $282 per-employee program cost, Motorola employees would need to reduce their lifestyle-related health risks by 1.08% to 1.42% per year to break even on health promotion programming, depending upon the discount rate. Higher or lower program investments would change the risk reduction percentages. Employers can use information from published studies, along with their own data, to estimate the amount of risk reduction required to break even on their health promotion programs.
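The break-even logic can be sketched as a small root-finding problem: find the annual risk-reduction rate at which the present value of medical-cost savings offsets the program cost. The $282 cost comes from the study; the savings-per-risk-point figure, horizon, and discount rate below are hypothetical placeholders:

```python
# Break-even sketch: discounted savings grow with cumulative risk reduction;
# solve for the annual reduction (%) at which NPV crosses zero. Only the $282
# program cost is from the study; everything else is a placeholder.

def npv_gap(annual_reduction, years=10, program_cost=282.0,
            saving_per_pct=50.0, discount=0.03):
    """PV(savings) - PV(costs) per employee; savings accrue cumulatively."""
    pv = 0.0
    for t in range(1, years + 1):
        cumulative_pct = annual_reduction * t           # risk falls each year
        cash = cumulative_pct * saving_per_pct - program_cost
        pv += cash / (1 + discount) ** t
    return pv

def breakeven_reduction(lo=0.0, hi=20.0):
    for _ in range(60):                                 # bisection on the gap
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv_gap(mid) < 0 else (lo, mid)
    return (lo + hi) / 2

rate = breakeven_reduction()
```

With these placeholder savings assumptions the break-even rate lands near 1% per year, the same order of magnitude as the study's 1.08% to 1.42% range; changing the discount rate or program cost shifts it, as the abstract notes.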
USDA registration and rectification requirements
NASA Technical Reports Server (NTRS)
Allen, R.
1982-01-01
Some of the requirements of the United States Department of Agriculture for accuracy of aerospace acquired data, and specifically, requirements for registration and rectification of remotely sensed data are discussed. Particular attention is given to foreign and domestic crop estimation and forecasting, forestry information applications, and rangeland condition evaluations.
A Qualitative Analysis of the Navy’s HSI Billet Structure
2008-06-01
of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...subspecialty code. The research results support the hypothesis that the work requirements of the July 2007 data set of 4600P-coded billets (billets
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-22
... mortgage insurance is terminated and no claim for insurance benefits will be filed. This information is... proposed information collection requirement described below will be submitted to the Office of Management... information will have practical utility; (2) evaluate the accuracy of the agency's estimate of the burden of...
76 FR 51126 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
....gov . SUPPLEMENTARY INFORMATION: Title: Disclosure of Relative Values of Optional Forms of Benefit... forms of information technology; and (e) estimates of capital or start-up costs and costs of operation... on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of...
Breast surface estimation for radar-based breast imaging systems.
Williams, Trevor C; Sill, Jeff M; Fear, Elise C
2008-06-01
Radar-based microwave breast-imaging techniques typically require the antennas to be placed at a certain distance from or on the breast surface. This requires prior knowledge of the breast location, shape, and size. The method proposed in this paper for obtaining this information is based on a modified tissue sensing adaptive radar algorithm. First, a breast surface detection scan is performed. Data from this scan are used to localize the breast by creating an estimate of the breast surface. If required, the antennas may then be placed at specified distances from the breast surface for a second tumor-sensing scan. This paper introduces the breast surface estimation and antenna placement algorithms. Surface estimation and antenna placement results are demonstrated on three-dimensional breast models derived from magnetic resonance images.
Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf
2015-03-01
We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
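One common formulation of these metrics (hedged: the authors' exact notation may differ) expresses PPV and NPV in terms of the prior probability R that a real effect exists, the significance level alpha, and statistical power:

```python
# With prior probability R of a real effect, significance level alpha, and
# power = 1 - beta:
#   PPV = power*R / (power*R + alpha*(1 - R))
#   NPV = (1 - alpha)*(1 - R) / ((1 - alpha)*(1 - R) + beta*R)

def ppv_npv(R, alpha=0.05, power=0.8):
    beta = 1 - power
    ppv = power * R / (power * R + alpha * (1 - R))
    npv = (1 - alpha) * (1 - R) / ((1 - alpha) * (1 - R) + beta * R)
    return ppv, npv

ppv, npv = ppv_npv(R=0.1)   # only 1 in 10 tested hypotheses is actually true
```

At R = 0.1 with conventional alpha and power, the PPV is only 0.64: roughly a third of "significant" results would be false positives, which is exactly why an estimate of R matters.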
Regional Input-Output Tables and Trade Flows: an Integrated and Interregional Non-survey Approach
Boero, Riccardo; Edwards, Brian Keith; Rivera, Michael Kelly
2017-03-20
Regional input–output tables and trade flows: an integrated and interregional non-survey approach. Regional Studies. Regional analyses require detailed and accurate information about dynamics happening within and between regional economies. However, regional input–output tables and trade flows are rarely observed and they must be estimated using up-to-date information. Common estimation approaches vary widely but consider tables and flows independently. Here, by using commonly used economic assumptions and available economic information, this paper presents a method that integrates the estimation of regional input–output tables and trade flows across regions. Examples of the method implementation are presented and compared with other approaches, suggesting that the integrated approach provides advantages in terms of estimation accuracy and analytical capabilities.
Reinecke, R D; Steinberg, T
1981-04-01
This is the second in the series of Ophthalmology Manpower Studies. Part I presented estimates of disease prevalence and incidence, the average amount of time required to care for such conditions, and based on that information, the total hours of ophthalmological services required to care for all the projected need in the population. Using different estimates of the average number of hours worked per year per ophthalmologist (based on a 35, 40 and 48 hours/week in patient care), estimates of the total number of ophthalmologists required were calculated. This method is basically similar to the method later adopted by the Graduate Medical Education National Advisory Committee (GMENAC) to arrive at estimates of hours of ophthalmological services required for 1990. However, instead of using all the need present in the population, the GMENAC panel chose to use an "adjusted-needs based" model as a compromise between total need and actual utilization, the former being an overestimation and the latter being an underestimation since it is in part a function of the barriers to medical care. Since some of these barriers to medical care include informational factors, as well as availability and accessibility, this study was undertaken to assess the utilization of these services and the adequacy of present ophthalmological manpower in the opinion of the consumer. Also, since the consumer's choice or behavior depends on being informed about the differences between optometrists and ophthalmologists, such knowledge was assessed and the responses further evaluated after explanatory statements were made to the responders.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-11
...The Food and Drug Administration (FDA) is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act of 1995 (the PRA), Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of an existing collection of information, and to allow 60 days for public comment in response to the notice. This notice solicits comments on the estimated reporting and recordkeeping burden associated with the Mammography Quality Standards Act requirements.
Harnessing the Power of Narratives to Understand User Requirements
ERIC Educational Resources Information Center
Hvalshagen, Merete
2011-01-01
It is estimated that $150 billion is wasted yearly on information and communications technology failures in the US. The most common reason for failure is deficiencies in the specified requirements. IS researchers and IS practitioners are therefore continuously exploring methods for effectively analyzing and capturing the user requirements. One…
78 FR 28020 - Proposed Collection: Comment Request for Form 5498.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance... on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of... Form 5498, IRA Contribution Information. DATES: Written comments should be received on or before July...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-12
... INFORMATION: Maritime Administration (MARAD). Title: Approval of Underwriters for Marine Hull Insurance. OMB... of information involves the approval of marine hull underwriters to insure MARAD program vessels... suitability for providing marine hull insurance on MARAD vessels. Annual Estimated Burden Hours: 46 hours...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-23
... a consumer report based on a direct request from a consumer. Request for Comment Comments are... information: Title: Procedures to Enhance the Accuracy and Integrity of Information Furnished to Consumer... received directly from consumers 8 hours to implement the new dispute notice requirement. Estimated burden...
Budgeting Facilities Operation Costs Using the Facilities Operation Model
2011-06-01
practices that today’s modern buildings have built into them. Several factors can change from the time the requirement is generated to when actual...information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources, gathering and...BOS required $4.2 billion.2 In FY2012, it is estimated it will reach $4.6 billion.3 Unlike sustainment and modernization , failure to fund facility
Tree biomass estimation of Chinese fir (Cunninghamia lanceolata) based on Bayesian method.
Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo
2013-01-01
Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production, with a huge distribution area in southern China. Accurate estimation of biomass is required for accounting and monitoring Chinese forest carbon stocking. In this study, the allometric equation W = a(D^2 H)^b was used to analyze tree biomass of Chinese fir. Common methods for estimating the allometric model take the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that its parameters are better represented by probability distributions than by the fixed values of the classical method. To deal with this problem, a Bayesian method was used for estimating the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative and informative. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature, and the parameter distributions from that literature were used as prior distributions in the Bayesian model. The Bayesian method with informative priors performed better than both the non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass. PMID:24278198
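The Bayesian update described above can be sketched in a few lines. The sketch below linearizes the allometric model W = a(D^2 H)^b as ln W = ln a + b ln(D^2 H) and applies a conjugate Gaussian update with a known residual variance; all data and prior values are synthetic stand-ins, not the paper's 32 published equations.

```python
import numpy as np

# Hedged sketch: Bayesian fit of the allometric biomass model
# W = a * (D^2 H)^b, linearized as ln W = ln a + b * ln(D^2 H).
# With a Gaussian prior on theta = (ln a, b) and a known noise variance,
# the posterior is Gaussian and available in closed form.
# All numbers below are synthetic, not taken from the paper.

rng = np.random.default_rng(0)

# Synthetic "field data": diameter D (cm), height H (m), log-biomass y.
n = 50
D = rng.uniform(5, 40, n)
H = rng.uniform(4, 25, n)
true_lna, true_b = -3.1, 0.95
x = np.log(D**2 * H)
y = true_lna + true_b * x + rng.normal(0, 0.15, n)

X = np.column_stack([np.ones(n), x])
sigma2 = 0.15**2                      # assumed known residual variance

# "Informative prior" standing in for parameter distributions pooled
# from published equations (hypothetical mean m0 and covariance S0).
m0 = np.array([-3.0, 0.9])
S0 = np.diag([0.5**2, 0.1**2])

# Conjugate Gaussian update: Sn = (S0^-1 + X'X / sigma2)^-1
S0_inv = np.linalg.inv(S0)
Sn = np.linalg.inv(S0_inv + X.T @ X / sigma2)
mn = Sn @ (S0_inv @ m0 + X.T @ y / sigma2)

print("posterior mean (ln a, b):", mn)
print("posterior sd:", np.sqrt(np.diag(Sn)))
```

A non-informative analysis corresponds to inflating S0; with abundant data the two agree, and the informative prior matters most for small samples, which is where the paper reports its advantage.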
NASA Technical Reports Server (NTRS)
Eckel, J. S.; Crabtree, M. S.
1984-01-01
Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-28
...The Federal Emergency Management Agency (FEMA) has submitted the following information collection to the Office of Management and Budget (OMB) for review and clearance in accordance with the requirements of the Paperwork Reduction Act of 1995. The submission describes the nature of the information collection, the categories of respondents, the estimated burden (i.e., the time, effort and resources used by respondents to respond) and cost, and includes the actual data collection instruments FEMA will use. There has been a change in the respondents, estimated burden, and estimated total annual burden hours from previous 30 day Notice. This change is a result of including the time, effort, and resources to collect information to be used by respondents as well as the significant decline in respondents expected.
2004-03-01
using standard Internet technologies with no additional client software required. Furthermore, using a portable...Wilkerson Computational and Information Sciences Directorate, ARL Approved for public release... information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and
12 CFR 611.515 - Information statement.
Code of Federal Regulations, 2010 CFR
2010-01-01
... estimated to be associated with the transfer. (7) A description of the type and dollar amount of any... require financial assistance during the first 3 years of operation, the estimated type and dollar amount... possible tax consequences to stockholders and whether any legal opinion, ruling or external auditor's...
78 FR 73876 - Agency Information Collection Activities: Passenger and Crew Manifest
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-09
... private aircraft flights. Specific data elements required for each passenger and crew member include: Full... expiration date; and alien registration number where applicable. APIS is authorized under the Aviation and...,937. Private Aircraft Pilots: Estimated Number of Respondents: 460,000. Estimated Number of Total...
Finite-error metrological bounds on multiparameter Hamiltonian estimation
NASA Astrophysics Data System (ADS)
Kura, Naoto; Ueda, Masahito
2018-01-01
Estimation of multiple parameters in an unknown Hamiltonian is investigated. We present upper and lower bounds on the time required to complete the estimation within a prescribed error tolerance δ . The lower bound is given on the basis of the Cramér-Rao inequality, where the quantum Fisher information is bounded by the squared evolution time. The upper bound is obtained by an explicit construction of estimation procedures. By comparing the cases with different numbers of Hamiltonian channels, we also find that the few-channel procedure with adaptive feedback and the many-channel procedure with entanglement are equivalent in the sense that they require the same amount of time resource up to a constant factor.
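The time-resource argument above can be made concrete in the single-parameter case (a simplified sketch using standard relations; the paper itself derives multiparameter bounds). With ν repetitions of evolution time t, the quantum Cramér-Rao inequality and the at-most-linear growth of the generator give

```latex
\delta \;\ge\; \frac{1}{\sqrt{\nu\, F_Q(t)}},
\qquad
F_Q(t) \;\le\; 4\, t^{2}\, \lVert \partial_\theta H \rVert^{2},
\qquad\Longrightarrow\qquad
\nu\, t^{2} \;\ge\; \frac{1}{4\,\delta^{2}\, \lVert \partial_\theta H \rVert^{2}},
```

which lower-bounds the time resource required to reach a prescribed tolerance δ, in the spirit of the bounds described in the abstract.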
Accurate position estimation methods based on electrical impedance tomography measurements
NASA Astrophysics Data System (ADS)
Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.
2017-08-01
Electrical impedance tomography (EIT) is a technology that estimates the electrical properties of a body or a cross section. Its main advantages are its non-invasiveness, low cost, and radiation-free operation. Estimation of the conductivity field yields low-resolution images compared with other technologies, at high computational cost. However, in many applications the target information lies in a low intrinsic dimensionality of the conductivity field, and estimating this low-dimensional information is the problem addressed in this work. It proposes optimization-based and data-driven approaches for estimating this low-dimensional information. The accuracy of the results obtained with these approaches depends on modelling and experimental conditions. Optimization approaches are sensitive to model discretization, the type of cost function, and the search algorithm. Data-driven methods are sensitive to the assumed model structure and to the data set used for parameter estimation. The system configuration and experimental conditions, such as the number of electrodes and the signal-to-noise ratio (SNR), also affect the results. To illustrate the effects of all these factors, the position estimation of a circular anomaly is addressed. Optimization methods based on weighted error cost functions and derivative-free optimization algorithms provided the best results. Data-driven approaches based on linear models provided, in this case, good estimates, but the use of nonlinear models enhanced the estimation accuracy. The results obtained by optimization-based algorithms were less sensitive to experimental conditions, such as the number of electrodes and SNR, than those of the data-driven approaches, although their position estimation mean squared errors, in both simulation and experiment, were more than twice those of the data-driven ones. The experimental position estimation mean squared error of the data-driven models using a 16-electrode setup was less than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object's position accurately from EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations, they can be used in real-time applications without high-performance computers.
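The data-driven idea above, learning a direct map from boundary measurements to anomaly position, can be sketched with ordinary least squares. Real EIT voltages come from a forward solver; the smooth function below is a hypothetical stand-in for them, so the numbers illustrate the workflow only, not the paper's accuracy figures.

```python
import numpy as np

# Hedged sketch of a data-driven EIT position estimator: fit a linear map
# from simulated boundary "voltages" to the (x, y) anomaly position.

rng = np.random.default_rng(1)

def fake_measurements(pos, n_electrodes=16):
    """Stand-in for EIT boundary voltages at a given anomaly position:
    electrodes on the unit circle respond more strongly when the anomaly
    is nearby. This replaces a physical forward model."""
    angles = np.linspace(0, 2 * np.pi, n_electrodes, endpoint=False)
    ex, ey = np.cos(angles), np.sin(angles)
    d2 = (ex - pos[0])**2 + (ey - pos[1])**2
    return 1.0 / (0.5 + d2)

# Training set: random anomaly positions inside the unit disc.
train_pos = rng.uniform(-0.6, 0.6, size=(200, 2))
V = np.array([fake_measurements(p) for p in train_pos])
V += rng.normal(0, 1e-3, V.shape)              # measurement noise (SNR knob)

# Linear model with intercept: pos ~ [1, V] @ B, fit by least squares.
A = np.column_stack([np.ones(len(V)), V])
B, *_ = np.linalg.lstsq(A, train_pos, rcond=None)

# Evaluate on held-out positions.
test_pos = rng.uniform(-0.6, 0.6, size=(50, 2))
Vt = np.array([fake_measurements(p) for p in test_pos])
pred = np.column_stack([np.ones(len(Vt)), Vt]) @ B
mse = np.mean((pred - test_pos)**2)
print("position MSE:", mse)
```

As the abstract notes, replacing the linear map with a nonlinear model (or an optimization over a forward model) is the route to higher accuracy.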
78 FR 15015 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
... performance of the Agency's function; (2) the accuracy of the estimated burden; (3) ways to enhance the... techniques or other forms of information technology to minimize the information collection burden. 1. Type of... help determine the medical necessity of certain items. The CMS requires CMNs where there may be a...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-13
... and reporting requirements (i.e., the burden and costs for complying with drinking water information... Activities; Submission to OMB for Review and Approval; Comment Request; Public Water System Supervision... abstracted below, describes the nature of the information collection and its estimated burden and cost. DATES...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... Act of 1973 (ESA; 16 U.S.C. 1531 et seq.) imposed prohibitions against the taking of endangered... approvals, 15 minutes; and inspection requests, 30 minutes. Estimated Total Annual Burden Hours: 7,698... collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be...
Reasoning and memory: People make varied use of the information available in working memory.
Hardman, Kyle O; Cowan, Nelson
2016-05-01
Working memory (WM) is used for storing information in a highly accessible state so that other mental processes, such as reasoning, can use that information. Some WM tasks require that participants not only store information, but also reason about that information to perform optimally on the task. In this study, we used visual WM tasks that had both storage and reasoning components to determine both how ideally people are able to reason about information in WM and if there is a relationship between information storage and reasoning. We developed novel psychological process models of the tasks that allowed us to estimate for each participant both how much information they had in WM and how efficiently they reasoned about that information. Our estimates of information use showed that participants are not all ideal information users or minimal information users, but rather that there are individual differences in the thoroughness of information use in our WM tasks. However, we found that our participants tended to be more ideal than minimal. One implication of this work is that to accurately estimate the amount of information in WM, it is important to also estimate how efficiently that information is used. This new analysis contributes to the theoretical premise that human rationality may be bounded by the complexity of task demands.
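For orientation, the simplest classical estimator of visual WM capacity can be written in one line. This is Cowan's k for single-probe change detection, a textbook point of reference only; the hierarchical process models fit in the study above are considerably richer, jointly estimating storage and reasoning efficiency.

```python
def cowan_k(set_size, hit_rate, false_alarm_rate):
    """Classic Cowan's k estimate of visual working-memory capacity for
    single-probe change detection: k = N * (H - F), where N is the set
    size, H the hit rate, and F the false-alarm rate. A far simpler
    baseline than the process models described in the paper."""
    return set_size * (hit_rate - false_alarm_rate)

# Example: set size 6, 80% hits, 20% false alarms.
print(cowan_k(6, 0.80, 0.20))  # -> 3.6 items
```

The paper's point is precisely that such capacity estimates can be biased if one does not also model how efficiently the stored information is used.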
Surviving an Information Systems Conversion.
ERIC Educational Resources Information Center
Neel, Don
1999-01-01
Prompted by the "millennium bug," many school districts are in the process of replacing non-Y2K-compliant information systems. Planners should establish a committee to develop performance criteria and select the winning proposal, estimate time requirements, and schedule retraining during low-activity periods. (MLH)
Siegel, Carole E.; Laska, Eugene; Meisner, Morris
2004-01-01
Objectives. We sought to estimate the extended mental health service capacity requirements of persons affected by the September 11, 2001, terrorist attacks. Methods. We developed a formula to estimate the extended mental health service capacity requirements following disaster situations and assessed availability of the information required by the formula. Results. Sparse data exist on current services and supports used by people with mental health problems outside of the formal mental health specialty sector. There also are few systematically collected data on mental health sequelae of disasters. Conclusions. We recommend research-based surveys to understand service usage in non–mental health settings and suggest that federal guidelines be established to promote uniform data collection of a core set of items in studies carried out after disasters. PMID:15054009
40 CFR 600.312-86 - Labeling, reporting, and recordkeeping; Administrator reviews.
Code of Federal Regulations, 2010 CFR
2010-07-01
... shall constitute the EPA fuel economy estimates unless the Administrator determines that they are not... Fuel Economy Estimates. (iii) If additional information is required, the Administrator shall request... new vehicles which are unsold beginning no later than 15 calendar days after the date of notification...
Estimating forest canopy fuel parameters using LIDAR data.
Hans-Erik Andersen; Robert J. McGaughey; Stephen E. Reutebuch
2005-01-01
Fire researchers and resource managers are dependent upon accurate, spatially-explicit forest structure information to support the application of forest fire behavior models. In particular, reliable estimates of several critical forest canopy structure metrics, including canopy bulk density, canopy height, canopy fuel weight, and canopy base height, are required to...
75 FR 57283 - Agency Information Collection Activities: Passenger and Crew Manifest
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
... private aircraft flights. Specific data elements required for each passenger and crew member include: Full... expiration date; and alien registration number where applicable. APIS is authorized under the Aviation and.... Estimated Time per Response: 1 minute. Estimated Total Annual Burden Hours: 3,128,861. Private Aircraft...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
...The Food and Drug Administration (FDA) is announcing an opportunity for public comment on the proposed collection of certain information by the Agency. Under the Paperwork Reduction Act of 1995 (the PRA), Federal Agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of an existing collection of information, and to allow 60 days for public comment in response to the notice. This notice solicits comments on the estimated reporting, recordkeeping, and third-party disclosure burden associated with the Mammography Quality Standards Act requirements.
Accounting for Incomplete Species Detection in Fish Community Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta
2013-01-01
Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed, and the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g., stratifying based on patch size) and determining the effort required (e.g., number of sites versus occasions).
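The core of the single-species model above is the standard single-season occupancy likelihood: a site is occupied with probability psi, and an occupied site yields a detection on each survey with probability p. The sketch below simulates detection histories and recovers (psi, p) by maximum likelihood over a grid; the covariates (patch size, reach, method) used in the paper are omitted for brevity.

```python
import numpy as np

# Hedged sketch of the single-season, single-species occupancy model
# (MacKenzie-style) underlying the analysis described above.

def occupancy_nll(psi, p, histories):
    """Negative log-likelihood of 0/1 detection histories
    (array of shape sites x occasions)."""
    histories = np.asarray(histories)
    K = histories.shape[1]
    det = histories.sum(axis=1)
    # Sites with at least one detection are certainly occupied.
    ll_det = np.log(psi) + det * np.log(p) + (K - det) * np.log(1 - p)
    # All-zero sites: occupied-but-missed, or truly unoccupied.
    ll_zero = np.log(psi * (1 - p) ** K + (1 - psi))
    ll = np.where(det > 0, ll_det, ll_zero)
    return -ll.sum()

# Simulate data and recover the parameters by a coarse grid search.
rng = np.random.default_rng(2)
true_psi, true_p, n_sites, K = 0.6, 0.4, 500, 4
z = rng.random(n_sites) < true_psi                  # latent occupancy
y = (rng.random((n_sites, K)) < true_p) & z[:, None]

grid = np.linspace(0.05, 0.95, 91)
nll = np.array([[occupancy_nll(a, b, y) for b in grid] for a in grid])
i, j = np.unravel_index(nll.argmin(), nll.shape)
print("psi_hat, p_hat:", grid[i], grid[j])
```

Adding method as a covariate of p (e.g., separate p for snorkeling versus electrofishing) is a small extension of the same likelihood, which is how the paper lets detection probability differ across gears.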
Knopman, Debra S.; Voss, Clifford I.
1988-01-01
Sensitivities of solute concentration to parameters associated with first-order chemical decay, boundary conditions, initial conditions, and multilayer transport are examined in one-dimensional analytical models of transient solute transport in porous media. A sensitivity is a change in solute concentration resulting from a change in a model parameter. Sensitivity analysis is important because the minimum information required for the estimation of model parameters by regression on chemical data is expressed in terms of sensitivities. Nonlinear regression models of solute transport were tested on sets of noiseless observations from known models that exceeded the minimum sensitivity information requirements. Results demonstrate that the regression models consistently converged to the correct parameters even when the initial sets of parameter values substantially deviated from the correct parameters. On the basis of the sensitivity analysis, several statements may be made about design of sampling for parameter estimation for the models examined: (1) estimation of parameters associated with solute transport in the individual layers of a multilayer system is possible even when solute concentrations in the individual layers are mixed in an observation well; (2) when estimating parameters in a decaying upstream boundary condition, observations are best made late in the passage of the front, near a time chosen by adding the inverse of a hypothesized value of the source decay parameter to the estimated mean travel time at a given downstream location; (3) estimation of a first-order chemical decay parameter requires observations to be made late in the passage of the front, preferably near a location corresponding to a travel time of √2 times the half-life of the solute; and (4) estimation of a parameter relating to spatial variability in an initial condition requires observations to be made early in time relative to passage of the solute front.
Perceptual constancy in auditory perception of distance to railway tracks.
De Coensel, Bert; Nilsson, Mats E; Berglund, Birgitta; Brown, A L
2013-07-01
Distance to a sound source can be accurately estimated solely from auditory information. For a sound source such as a train passing at a relatively large distance, the most important auditory information for estimating its distance consists of the intensity of the sound, spectral changes in the sound caused by air absorption, and the motion-induced rate of change of intensity. However, these cues are relative: prior information or experience of the sound source (its source power, its spectrum, and the typical speed at which it moves) is required for such distance estimates. This paper describes two listening experiments that investigate further prior contextual information taken into account by listeners, namely whether they are indoors or outdoors. When asked to estimate the distance to the track of a railway, listeners assessing sounds heard inside the dwelling based their distance estimates on the expected train passby sound level outdoors rather than on the passby sound level actually experienced indoors. This form of perceptual constancy may have consequences for the assessment of annoyance caused by railway noise.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-17
... supplemental funds have been offered: $75,000 per grantee in FY 2009; $100,000 in FY 2010; $150,000 in FY 2011... applicable. Total Burden Estimate: Number of Hours per Cost per Startup Requirement respondents respondent...
A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
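The "rollup" step mentioned above, combining peptide-level intensities into a protein-level estimate per sample, has a common simple baseline worth seeing concretely. The median-centered rollup below is that baseline, not the paper's censoring-aware model; the example data are invented, with NaN marking a missing peak of the kind the paper treats as informative.

```python
import numpy as np

# Hedged sketch of peptide-to-protein rollup: center each peptide on its
# own median (removing per-peptide ionization-efficiency offsets), then
# take the per-sample median across peptides. A simple baseline only;
# the paper instead fits a model that accounts for informative missingness.

def median_rollup(peptide_log_intensities):
    """peptide_log_intensities: peptides x samples array (log scale),
    NaN = missing peak. Returns one relative abundance per sample."""
    X = np.asarray(peptide_log_intensities, dtype=float)
    centered = X - np.nanmedian(X, axis=1, keepdims=True)
    return np.nanmedian(centered, axis=0)

# Three peptides from one protein, four samples; NaN = missing peak.
pep = [[20.0, 20.5, 21.0, 21.5],
       [18.0, 18.5, 19.0, np.nan],
       [22.1, 22.4, 23.1, 23.6]]
print(median_rollup(pep))
```

Note that the NaN is simply skipped here; the paper's point is that such missingness is often intensity-dependent, so ignoring it biases the protein-level estimate.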
A parametric method for determining the number of signals in narrow-band direction finding
NASA Astrophysics Data System (ADS)
Wu, Qiang; Fuhrmann, Daniel R.
1991-08-01
A novel and more accurate method to determine the number of signals in the multisource direction finding problem is developed. The information-theoretic criteria of Yin and Krishnaiah (1988) are applied to a set of quantities which are evaluated from the log-likelihood function. Based on proven asymptotic properties of the maximum likelihood estimation, these quantities have the properties required by the criteria. Since the information-theoretic criteria use these quantities instead of the eigenvalues of the estimated correlation matrix, this approach possesses the advantage of not requiring a subjective threshold, and also provides higher performance than when eigenvalues are used. Simulation results are presented and compared to those obtained from the nonparametric method given by Wax and Kailath (1985).
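The eigenvalue-based alternative that the parametric method above is compared against can be shown concretely. Below is a minimal implementation of the Wax and Kailath (1985) MDL criterion applied to the eigenvalues of an estimated array covariance matrix; the eigenvalues in the example are made up to represent two strong signals above a unit noise floor.

```python
import numpy as np

# Hedged sketch of the classical eigenvalue-based detector: the MDL
# criterion of Wax and Kailath, which the paper's parametric approach
# is designed to outperform.

def mdl_num_signals(eigvals, n_snapshots):
    """Estimate the number of signals from covariance eigenvalues.
    For each candidate k, compare the geometric and arithmetic means of
    the (p - k) smallest eigenvalues and add an MDL complexity penalty."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p, N = len(lam), n_snapshots
    scores = []
    for k in range(p):
        tail = lam[k:]
        m = p - k
        # log(GM / AM) of the presumed noise eigenvalues (<= 0).
        log_ratio = np.mean(np.log(tail)) - np.log(np.mean(tail))
        scores.append(-N * m * log_ratio + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(scores))

# Two strong signals plus four noise-level eigenvalues:
print(mdl_num_signals([9.0, 4.0, 1.05, 1.0, 0.95, 1.0], n_snapshots=200))  # -> 2
```

No subjective threshold appears anywhere, which is the property the paper retains while replacing the eigenvalues with likelihood-derived quantities to improve performance.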
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-21
... commercial launch site are required by 49 U.S.C. Subtitle IX, 701-- Commercial Space Launch Activities, 49 U.S.C. 70101-70119 (1994). The information is needed in order to demonstrate to the FAA Office of.... Frequency: Information is collected on occasion. Estimated Average Burden per Response: 2,322 hours...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-23
... Information Collection: DS-3035, J-1 Visa Waiver Recommendation Application ACTION: Notice of request for... accordance with the Paperwork Reduction Act of 1995. Title of Information Collection: J-1 Visa Waiver.... Respondents: J-1 visa holders applying for a waiver of the two-year foreign residence requirement. Estimated...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... request for comments. SUMMARY: As part of its continuing effort to reduce paperwork burden and as required... forms of information technology; and ways to further reduce the information burden for small business... responses. Estimated Time Per Response: 30 sec (.0084 hours). Frequency of Response: One time reporting...
Learning dependence from samples.
Seth, Sohan; Príncipe, José C
2014-01-01
Mutual information, conditional mutual information and interaction information have been widely used in scientific literature as measures of dependence, conditional dependence and mutual dependence. However, these concepts suffer from several computational issues; they are difficult to estimate in continuous domain, the existing regularised estimators are almost always defined only for real or vector-valued random variables, and these measures address what dependence, conditional dependence and mutual dependence imply in terms of the random variables but not finite realisations. In this paper, we address the issue that given a set of realisations in an arbitrary metric space, what characteristic makes them dependent, conditionally dependent or mutually dependent. With this novel understanding, we develop new estimators of association, conditional association and interaction association. Some attractive properties of these estimators are that they do not require choosing free parameter(s), they are computationally simpler, and they can be applied to arbitrary metric spaces.
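As a point of comparison for the estimation difficulties discussed above, the simplest plug-in estimator of mutual information for discrete data fits in a few lines. It is a baseline only: unlike the estimators proposed in the paper, it does not extend to arbitrary metric spaces and behaves poorly in continuous domains.

```python
import numpy as np

# Hedged sketch: plug-in mutual information (in nats) from empirical
# joint frequencies of two discrete sequences. A baseline illustrating
# the quantity the paper's association measures generalize.

def mutual_information(x, y):
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px, py = np.mean(x == xv), np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

rng = np.random.default_rng(3)
a = rng.integers(0, 2, 10000)
b = a.copy()                     # fully dependent copy
c = rng.integers(0, 2, 10000)    # independent sequence
print(mutual_information(a, b))  # near ln 2 (~0.693) for a fair coin
print(mutual_information(a, c))  # near 0
```

Even this toy already shows the issues the paper raises: the estimator requires a discretization choice in continuous domains and says nothing about what dependence means for finite realizations in a general metric space.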
A Height Estimation Approach for Terrain Following Flights from Monocular Vision.
Campos, Igor S G; Nascimento, Erickson R; Freitas, Gustavo M; Chaimowicz, Luiz
2016-12-06
In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80 % for positives and 90 % for negatives, while the height estimation algorithm presented good accuracy.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-09
... Activities; Proposed Collection; Comment Request; Facility Ground-Water Monitoring Requirements AGENCY...) concerning groundwater monitoring reporting and recordkeeping requirements. This ICR is scheduled to expire... arrived at the estimate that you provide. 5. Offer alternative ways to improve the collection activity. 6...
On-orbit calibration for star sensors without a priori information.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, Chengfen; Yang, Yanqiang
2017-07-24
The star sensor is a prerequisite navigation device for a spacecraft, and on-orbit calibration is an essential guarantee of its operational performance. However, traditional calibration methods rely on ground information and are invalid without a priori information. Uncertain on-orbit parameters will eventually degrade the performance of the guidance, navigation and control system. In this paper, a novel calibration method without a priori information for on-orbit star sensors is proposed. First, a simplified back-propagation neural network is designed for focal length and principal point estimation along with system property evaluation, called coarse calibration. Then the unscented Kalman filter is adopted for the precise calibration of all parameters, including focal length, principal point and distortion. The proposed method benefits from self-initialization: no attitude information or preinstalled sensor parameters are required. Precise star sensor parameter estimation can be achieved without a priori information, a significant improvement for on-orbit devices. Simulation and experimental results demonstrate that the calibration is easy to operate, with high accuracy and robustness. The proposed method can satisfy the stringent requirements of most star sensors.
Improving chemical species tomography of turbulent flows using covariance estimation.
Grauer, Samuel J; Hadwin, Paul J; Daun, Kyle J
2017-05-01
Chemical species tomography (CST) experiments can be divided into limited-data and full-rank cases. Both require solving ill-posed inverse problems, and thus the measurement data must be supplemented with prior information to carry out reconstructions. The Bayesian framework formalizes the role of this additional information, expressed as the mean and covariance of a joint-normal prior probability density function. We present techniques for estimating the spatial covariance of a flow under limited-data and full-rank conditions. Our results show that incorporating a covariance estimate into CST reconstruction via a Bayesian prior increases the accuracy of instantaneous estimates. Improvements are especially dramatic in real-time limited-data CST, which is directly applicable to many industrially relevant experiments.
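A scalar caricature shows how the prior mean and (co)variance enter a joint-normal Bayesian reconstruction. The real CST problem replaces the scalars with a forward matrix A and a spatial covariance matrix, but the structure of the MAP estimate is the same:

```python
def map_estimate(a, b, noise_var, prior_mean, prior_var):
    """MAP solution of the scalar inverse problem b = a*x + noise, with a
    normal prior x ~ N(prior_mean, prior_var). Generalises to the matrix
    case (forward model A, prior covariance Gamma) used in Bayesian CST."""
    precision = a * a / noise_var + 1.0 / prior_var
    return (a * b / noise_var + prior_mean / prior_var) / precision

# Noisy data pulls the estimate only partway from the prior mean (0) toward
# the data-only answer (4): with equal weights the result is halfway.
print(map_estimate(a=1.0, b=4.0, noise_var=1.0, prior_mean=0.0, prior_var=1.0))  # → 2.0
```

A better (tighter, correctly structured) prior covariance shifts more weight onto realistic flow structure, which is the mechanism behind the accuracy gains the abstract reports.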
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... DEPARTMENT OF LABOR Occupational Safety and Health Administration [Docket No. OSHA-2011-0008... of Information Collection (Paperwork) Requirements AGENCY: Occupational Safety and Health... OSHA's estimate of the information collection burden is accurate. The Occupational Safety and Health...
Annual Program, 1987. Texas State Library.
ERIC Educational Resources Information Center
Texas State Library, Austin.
This report provides information related to the Texas State Library's fiscal year 1987 Library Services and Construction Act (LSCA; Public Law 84-597, as amended) state-administered program. Information is included on: (1) Standard Form 424 for federal assistance; (2) fiscal breakdowns of estimated expenditures; (3) specific requirements for…
28 CFR 70.52 - Financial reporting.
Code of Federal Regulations, 2010 CFR
2010-07-01
... accounting system does not meet the standards in § 70.21, additional pertinent information to further monitor.... (ii) Reports must be on an accrual basis. Recipients are not required to convert their accounting system, but must develop such accrual information through best estimates based on an analysis of the...
30 CFR 772.10 - Information collection.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Specifically, OSM estimates that preparation of a notice of intent to explore under § 772.11 will require an average of 10 hours per notice, preparation and processing of an...
75 FR 71704 - Agency Information Collection Activities; Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-24
... for decisions, and follow-up), recordkeeping, and annual audits. The Rule requires that IDSMs... not include any sensitive personal information, such as any individual's Social Security number, date..., staff has adjusted its previous estimates based on the following two factors. First, the annual audits...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-29
... burden estimate. \\3\\ The rule became effective on September 24, 2009. Full compliance was required by... media notice. Such substitute notice must include a toll-free number for the purpose of allowing a..., web posting, or media.\\9\\ \\9\\ Staff's earlier estimate also included costs associated with obtaining a...
40 CFR 80.270 - Can a refiner seek temporary relief from the requirements of this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... following information: (i) Detailed description of efforts to obtain capital for refinery investments; (ii) Bond rating of entity that owns the refinery; and (iii) Estimated capital investment needed to comply... necessary for construction, a description of plans to obtain necessary capital, and a detailed estimate of...
Disposition of Depleted Uranium Oxide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crandall, J.L.
2001-08-13
This document summarizes environmental information collected up to June 1983 at the Savannah River Plant. Of particular interest are an updating of dose estimates from changes in the methodology of calculation, lower cesium transport estimates from Steel Creek, and new sport-fish consumption data for the Savannah River. The status of various permitting requirements is also discussed.
30 CFR 702.18 - Reporting requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... used or transferred by the operator or related entity and the estimated total fair market value of such... related entity and the estimated total fair market value of such minerals; and (6) The number of tons of... definition of Cumulative measurement period in § 702.5 of this part. (3) The information in the report shall...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-29
... respondents; 409,048 responses. Estimated Time Per Response: .033 hours Frequency of Response: Recordkeeping... previous estimates. Section 90.215 requires station licensees to measure the carrier frequency, output power, and modulation of each transmitter authorized to operate with power in excess of two watts when...
Implementing the measurement interval midpoint method for change estimation
James A. Westfall; Thomas Frieswyk; Douglas M. Griffith
2009-01-01
The adoption of nationally consistent estimation procedures for the Forest Inventory and Analysis (FIA) program mandates changes in the methods used to develop resource trend information. Particularly, it is prescribed that changes in tree status occur at the midpoint of the measurement interval to minimize potential bias. The individual-tree characteristics requiring...
Olson, Scott A.; Tasker, Gary D.; Johnston, Craig M.
2003-01-01
Estimates of the magnitude and frequency of streamflow are needed to safely and economically design bridges, culverts, and other structures in or near streams. These estimates also are used for managing floodplains, identifying flood-hazard areas, and establishing flood-insurance rates, but may be required at ungaged sites where no observed flood data are available for streamflow-frequency analysis. This report describes equations for estimating flow-frequency characteristics at ungaged, unregulated streams in Vermont. In the past, regression equations developed to estimate streamflow statistics required users to spend hours manually measuring basin characteristics for the stream site of interest. This report also describes the accompanying customized geographic information system (GIS) tool that automates the measurement of basin characteristics and calculation of corresponding flow statistics. The tool includes software that computes the accuracy of the results and adjustments for expected probability and for streamflow data of a nearby stream-gaging station that is either upstream or downstream and within 50 percent of the drainage area of the site where the flow-frequency characteristics are being estimated. The custom GIS can be linked to the National Flood Frequency program, adding the ability to plot peak-flow-frequency curves and synthetic hydrographs and to compute adjustments for urbanization.
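The regional regression idea can be sketched with a single basin characteristic. The actual Vermont equations use several basin characteristics and report prediction accuracy; this toy fits only a drainage-area power law, the classic form of such equations:

```python
import math

def fit_power_law(areas, flows):
    """Fit Q = a * A^b by least squares in log space, the usual form of a
    regional flood-frequency regression on drainage area A. Illustrative
    only; operational equations include several basin characteristics."""
    xs = [math.log(A) for A in areas]
    ys = [math.log(q) for q in flows]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic gauged basins following Q = 50 * A^0.8 exactly
areas = [10.0, 50.0, 100.0, 500.0]
a, b = fit_power_law(areas, [50.0 * A ** 0.8 for A in areas])
print(round(a, 6), round(b, 6))  # → 50.0 0.8
```

Once fitted, such an equation yields a flow estimate at an ungaged site from basin characteristics alone, which is exactly what the GIS tool automates.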
Garrard, Georgia E; McCarthy, Michael A; Vesk, Peter A; Radford, James Q; Bennett, Andrew F
2012-01-01
1. Informative Bayesian priors can improve the precision of estimates in ecological studies or allow estimation of parameters for which little or no information is available. While Bayesian analyses are becoming more popular in ecology, the use of strongly informative priors remains rare, perhaps because examples of informative priors are not readily available in the published literature. 2. Dispersal distance is an important ecological parameter, but is difficult to measure and estimates are scarce. General models that provide informative prior estimates of dispersal distances will therefore be valuable. 3. Using a world-wide data set on birds, we develop a predictive model of median natal dispersal distance that includes body mass, wingspan, sex and feeding guild. This model predicts median dispersal distance well when using the fitted data and an independent test data set, explaining up to 53% of the variation. 4. Using this model, we predict a priori estimates of median dispersal distance for 57 woodland-dependent bird species in northern Victoria, Australia. These estimates are then used to investigate the relationship between dispersal ability and vulnerability to landscape-scale changes in habitat cover and fragmentation. 5. We find evidence that woodland bird species with poor predicted dispersal ability are more vulnerable to habitat fragmentation than those species with longer predicted dispersal distances, thus improving the understanding of this important phenomenon. 6. The value of constructing informative priors from existing information is also demonstrated. When used as informative priors for four example species, predicted dispersal distances reduced the 95% credible intervals of posterior estimates of dispersal distance by 8-19%.
Further, should we have wished to collect information on avian dispersal distances and relate it to species' responses to habitat loss and fragmentation, data from 221 individuals across 57 species would have been required to obtain estimates with the same precision as those provided by the general model. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
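The mechanism by which an informative prior narrows a credible interval can be shown with a conjugate normal update. The numbers here are hypothetical, not those of the dispersal model:

```python
import math

def posterior_normal(prior_mean, prior_var, data_mean, data_var, n):
    """Conjugate normal update: combine a prior with n observations of
    known variance, returning the posterior mean and variance."""
    post_prec = 1.0 / prior_var + n / data_var
    post_mean = (prior_mean / prior_var + n * data_mean / data_var) / post_prec
    return post_mean, 1.0 / post_prec

# Informative prior (variance 4) vs a vague prior (variance 1e6), same 10 observations
m_inf, v_inf = posterior_normal(2.0, 4.0, 3.0, 9.0, 10)
m_vag, v_vag = posterior_normal(2.0, 1e6, 3.0, 9.0, 10)
ci_width = lambda v: 2 * 1.96 * math.sqrt(v)
print(ci_width(v_inf) < ci_width(v_vag))  # → True: the informative prior narrows the interval
```

Equivalently, matching a given interval width with a vague prior requires more data, which is the trade-off behind the abstract's "221 individuals across 57 species" calculation.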
NASA Astrophysics Data System (ADS)
Matthews, Thomas P.; Anastasio, Mark A.
2017-12-01
The initial pressure and speed of sound (SOS) distributions cannot both be stably recovered from photoacoustic computed tomography (PACT) measurements alone. Adjunct ultrasound computed tomography (USCT) measurements can be employed to estimate the SOS distribution. Under the conventional image reconstruction approach for combined PACT/USCT systems, the SOS is estimated from the USCT measurements alone and the initial pressure is estimated from the PACT measurements by use of the previously estimated SOS. This approach ignores the acoustic information in the PACT measurements and may require many USCT measurements to accurately reconstruct the SOS. In this work, a joint reconstruction method where the SOS and initial pressure distributions are simultaneously estimated from combined PACT/USCT measurements is proposed. This approach allows accurate estimation of both the initial pressure distribution and the SOS distribution while requiring few USCT measurements.
Image information content and patient exposure.
Motz, J W; Danos, M
1978-01-01
At present, patient exposure and x-ray tube kilovoltage are determined by image-visibility requirements on x-ray film. With the employment of image-processing techniques, image visibility may be manipulated, and the exposure may be determined only by the desired information content, i.e., by the required degree of tissue-density discrimination and spatial resolution. This work gives quantitative relationships between the image information content and the patient exposure, and gives estimates of the minimum exposures required for the detection of image signals associated with particular radiological exams. Also, for subject thicknesses larger than approximately 5 cm, the results show that the maximum information content may be obtained at a single kilovoltage and filtration with the simultaneous employment of image-enhancement and antiscatter techniques. This optimization may be used either to reduce the patient exposure or to increase the retrieved information.
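The trade-off between required exposure and detectable detail can be illustrated with the standard Rose-criterion relation SNR = C√(NA). This is a common simplification, not the paper's full treatment of kilovoltage, filtration and scatter:

```python
def min_photons_per_area(contrast, detail_area_mm2, k=5.0):
    """Rose-criterion sketch: minimum photon fluence (photons/mm^2) for a
    detail of given contrast and area to be detectable against quantum
    noise, from SNR = contrast * sqrt(N * area) >= k (k ≈ 5 is typical)."""
    return k * k / (contrast ** 2 * detail_area_mm2)

# Halving the required contrast (finer tissue-density discrimination)
# quadruples the minimum exposure.
print(min_photons_per_area(0.02, 1.0) / min_photons_per_area(0.04, 1.0))  # → 4.0
```

This inverse-square coupling between discrimination and fluence is why specifying the desired information content directly determines a minimum patient exposure.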
An estimator for the standard deviation of a natural frequency. II.
NASA Technical Reports Server (NTRS)
Schiff, A. J.; Bogdanoff, J. L.
1971-01-01
A method has been presented for estimating the variability of a system's natural frequencies arising from the variability of the system's parameters. The only information required to obtain the estimates is the member variability, in the form of second-order properties, and the natural frequencies and mode shapes of the mean system. It has also been established for the systems studied by means of Monte Carlo estimates that the specification of second-order properties is an adequate description of member variability.
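The estimator described, propagating second-order properties (variances) of member parameters into the variance of a natural frequency, can be sketched for a one-degree-of-freedom system and checked against a Monte Carlo estimate, as the paper does for larger systems:

```python
import math, random, statistics

def omega(k, m):
    """Natural frequency of a 1-DOF spring-mass system (rad/s)."""
    return math.sqrt(k / m)

# First-order propagation: Var(w) ≈ (dw/dk)^2 Var(k) + (dw/dm)^2 Var(m),
# with dw/dk = w/(2k) and dw/dm = -w/(2m).
k0, m0, sk, sm = 100.0, 1.0, 5.0, 0.02   # mean stiffness/mass and their std devs
w0 = omega(k0, m0)
sd_analytic = math.sqrt((w0 / (2 * k0) * sk) ** 2 + (w0 / (2 * m0) * sm) ** 2)

# Monte Carlo check of the propagated standard deviation
random.seed(1)
sd_mc = statistics.stdev(omega(random.gauss(k0, sk), random.gauss(m0, sm))
                         for _ in range(5000))
print(abs(sd_analytic - sd_mc) / sd_analytic < 0.05)  # → True
```

For a multi-degree-of-freedom system the derivatives are taken through the mean system's mode shapes, but only the members' second-order properties are needed, as the abstract states.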
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-09
... recommendation, and official transcripts. A personal interview must also be conducted. Eligibility requirements.../erecruit/login.jsp ) and then submit paper forms via mail. An in-person interview is also required. III... of Respondents: 1,800. Estimated Time per Response: written applications, 2 hours; interviews, 5...
Boundary methods for mode estimation
NASA Astrophysics Data System (ADS)
Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.
1999-08-01
This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable, in terms of both accuracy and computation, to other popular mode estimation techniques currently found in the literature and in automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation, briefly reviews other common mode estimation techniques, and describes the empirical investigation used to explore the relationship of the BM technique to these other techniques. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture-of-Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion for the MOG and k-means techniques is the Akaike Information Criterion (AIC).
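AIC-based model-order selection of the kind used as the stopping criterion can be sketched with a crude 1-D hard-assignment mixture fit. This stands in for the MOG/k-means machinery and is illustrative only; the parameter count p = 3k - 1 assumes k means, k variances and k - 1 free weights:

```python
import math, random

def fit_1d_mixture(data, k, iters=30):
    """Crude 1-D hard-assignment mixture fit (k-means style); returns the
    log-likelihood under per-cluster normal densities with mixing weights."""
    centers = sorted(random.sample(data, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            clusters[min(range(k), key=lambda j: abs(x - centers[j]))].append(x)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    n, loglik = len(data), 0.0
    for c in clusters:
        if not c:
            continue
        mu = sum(c) / len(c)
        var = max(sum((x - mu) ** 2 for x in c) / len(c), 1e-6)
        w = len(c) / n
        loglik += sum(math.log(w) - 0.5 * math.log(2 * math.pi * var)
                      - (x - mu) ** 2 / (2 * var) for x in c)
    return loglik

def aic_mode_count(data, max_k=4):
    """Choose the mode count k minimising AIC = 2p - 2 ln L with p = 3k - 1."""
    return min(range(1, max_k + 1),
               key=lambda k: 2 * (3 * k - 1) - 2 * fit_1d_mixture(data, k))

random.seed(0)
data = ([random.gauss(0, 1) for _ in range(200)]
        + [random.gauss(8, 1) for _ in range(200)])
print(aic_mode_count(data))  # two well-separated clusters → 2
```

The AIC penalty is what stops the likelihood from favouring ever-larger k, which is exactly the role it plays as the stopping criterion in the paper's MOG and k-means comparisons.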
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
... FEDERAL RESERVE SYSTEM Agency Information Collection Activities: Announcement of Board Approval... holding companies (SLHCs) and nongovernmental entities or persons (NGEPs). Estimated annual reporting... Insurance Act (FDI Act), 12 U.S.C. 1831y(b) and (c). The FDI Act authorizes the Federal Reserve to require...
22 CFR 8.8 - Chartering of committees.
Code of Federal Regulations, 2010 CFR
2010-04-01
... information set forth in the charter of the parent committee. (3) Informal subgroups may not require a charter... the Library of Congress. (b) Contents. Each committee charter shall contain: The official name and... maintained; the estimated annual operating costs in dollars and man-years, and the source and authority for...
75 FR 62401 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-08
... collection; Title of Information Collection: Clinical Laboratory Improvement Amendment (CLIA) of 1988 and... laboratories that perform testing on human beings to meet performance requirements (quality standards) in order... functions; (2) the accuracy of the estimated burden; (3) ways to enhance the quality, utility, and clarity...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-25
... Activities; Submission to OMB for Review and Approval; Comment Request; Emergency Planning and Release Notification Requirements Under Emergency Planning and Community Right- To-Know Act (Renewal) AGENCY... of the information collection and its estimated burden and cost. DATES: Additional comments may be...
75 FR 13312 - Notice of Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-19
... AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of information collection... offerors to award Purchase Orders and to use bank cards for required goods and services in support of NASA..., Purchase Orders and the use of bank cards for purchases with an estimated value less than $100,000. OMB...
Toward a Linguistically Realistic Assessment of Language Vitality: The Case of Jejueo
ERIC Educational Resources Information Center
Yang, Changyong; O'Grady, William; Yang, Sejung
2017-01-01
The assessment of language endangerment requires accurate estimates of speaker populations, including information about the proficiency of different groups within those populations. Typically, this information is based on self-assessments, a methodology whose reliability is open to question. We outline an approach that seeks to improve the…
Code of Federal Regulations, 2011 CFR
2011-10-01
... AK-2561-10. BLM uses this information to determine if using the public lands is appropriate. You must... follows: 28 hours per response to fill out form AK-2561-10. These estimates include the time for reviewing...
Code of Federal Regulations, 2013 CFR
2013-10-01
... AK-2561-10. BLM uses this information to determine if using the public lands is appropriate. You must... follows: 28 hours per response to fill out form AK-2561-10. These estimates include the time for reviewing...
Code of Federal Regulations, 2014 CFR
2014-10-01
... AK-2561-10. BLM uses this information to determine if using the public lands is appropriate. You must... follows: 28 hours per response to fill out form AK-2561-10. These estimates include the time for reviewing...
Code of Federal Regulations, 2012 CFR
2012-10-01
... AK-2561-10. BLM uses this information to determine if using the public lands is appropriate. You must... follows: 28 hours per response to fill out form AK-2561-10. These estimates include the time for reviewing...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... importers and persons who manufacture or import explosive materials or ammonium nitrate must, when required by the Director, furnish samples of such explosive materials or ammonium nitrate; information on... to the identification of the ammonium nitrate. (5) An estimate of the total number of respondents and...
75 FR 66134 - Agency Information Collection Activities: Proposed Collection; Comments Requested
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
... certificate required by section 21 of the Federal Water Pollution Control Act. (5) An estimate of the total... of Information Collection Under Review: Certification of Knowledge of State Laws, Submission of Water Pollution Act. The Department of Justice (DOJ), Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-23
... Collection; Comment Request; Western Region Vessel Monitoring System and Pre-Trip Reporting Requirements... information or new problems in the fisheries. Vessel Monitoring System (VMS) units will facilitate enforcement... monitoring system (VMS) activation reports, 15 minutes; pre-trip reports, 5 minutes. Estimated Burden Hours...
Estimating sawmill processing capacity for Tongass timber: 2003 and 2004 update.
Allen M. Brackley; Daniel J. Parrent; Thomas D. Rojas
2006-01-01
In spring 2004 and 2005, sawmill capacity and wood utilization information was collected for selected mills in southeast Alaska. The collected information is required to prepare information for compliance with Section 705(a) of the Tongass Timber Reform Act. The total capacity in the region (active and inactive mills) was 370,350 thousand board feet (mbf) Scribner log...
Dewey, H M; Thrift, A G; Mihalopoulos, C; Carter, R; Macdonell, R A L; McNeil, J J; Donnan, G A
2002-04-01
Informal caregivers play an important role in the lives of stroke patients, but the cost of providing this care has not been estimated. The purpose of this study was to determine the nature and amount of informal care provided to stroke patients and to estimate the economic cost of that care. The primary caregivers of stroke patients registered in the North East Melbourne Stroke Incidence Study (NEMESIS) were interviewed at 3, 6, and 12 months after stroke, and the nature and amount of informal care provided were documented. The opportunity and replacement costs of informal care for all first-ever-in-a-lifetime strokes (excluding subarachnoid hemorrhages) that occurred in 1997 in Australia were estimated. Among 3-month stroke survivors, 74% required assistance with activities of daily living and received informal care from family or friends. Two thirds of primary caregivers were women, and most primary caregivers (>90%) provided care during family or leisure time. Total first-year caregiver time costs for all first-ever-in-a-lifetime strokes were estimated to be A$21.7 million (opportunity cost approach) or A$42.5 million (replacement cost approach), and the present values of lifetime caregiver time costs were estimated to be A$171.4 million (opportunity cost approach) or A$331.8 million (replacement cost approach). Informal care for stroke survivors represents a significant hidden cost to Australian society. Because our community is rapidly aging, this informal care burden may increase significantly in the future.
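The two costing conventions in the abstract differ only in the rate applied to the same pool of care hours. A sketch with hypothetical figures (not the NEMESIS data):

```python
def informal_care_costs(hours_per_week, weeks, n_carers, forgone_wage, market_rate):
    """Value the same informal-care hours two ways: the opportunity cost
    (the carer's forgone wage) and the replacement cost (the market rate
    of a paid substitute), the two approaches used in such studies."""
    total_hours = hours_per_week * weeks * n_carers
    return total_hours * forgone_wage, total_hours * market_rate

# Hypothetical figures: 1,000 carers giving 20 h/week for a year,
# forgone wage A$15/h, market care rate A$30/h.
opportunity, replacement = informal_care_costs(20, 52, 1000, 15.0, 30.0)
print(opportunity, replacement)  # → 15600000.0 31200000.0
```

Since carer time is often drawn from unpaid family or leisure time, the opportunity-cost figure is typically the lower of the two, matching the roughly two-fold gap the study reports.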
Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information
NASA Technical Reports Server (NTRS)
Butts, Glenn
2007-01-01
Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide; such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
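The normalization step, escalating a historical cost to the target year and then scaling by a parameter ratio, can be sketched as follows. The inflation rate and power-law scaling exponent are illustrative assumptions, not values from the paper:

```python
def normalized_cost(hist_cost, hist_year, target_year, inflation,
                    hist_mass, target_mass, scaling_exp=0.7):
    """Parametric-estimating sketch: escalate a historical project cost to
    the target year at a flat inflation rate, then scale it by the mass
    ratio with a power-law exponent. The 0.7 exponent is illustrative."""
    escalated = hist_cost * (1 + inflation) ** (target_year - hist_year)
    return escalated * (target_mass / hist_mass) ** scaling_exp

# $10M project from 2000 escalated at 3%/yr; the new project is twice the mass.
print(round(normalized_cost(10e6, 2000, 2007, 0.03, 500, 1000)))
```

The sub-unity exponent encodes the usual economy of scale: doubling the mass less than doubles the cost. Calibrating such exponents against a company's own history is the essence of the normalization the paper teaches.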
Randomised cluster trial to support informed parental decision-making for the MMR vaccine
2011-01-01
Background: In the UK, public concern about the safety of the combined measles, mumps and rubella (MMR) vaccine continues to impact on MMR coverage. Whilst the sharp decline in uptake has begun to level out, first- and second-dose uptake rates remain short of that required for population immunity. Furthermore, international research consistently shows that some parents lack confidence in making a decision about MMR vaccination for their children. Together, this work suggests that effective interventions are required to support parents to make informed decisions about MMR. This trial assessed the impact of a parent-centred, multi-component intervention (balanced information, group discussion, coaching exercise) on informed parental decision-making for MMR. Methods: This was a two-arm, cluster randomised trial. One hundred and forty-two UK parents of children eligible for MMR vaccination were recruited from six primary healthcare centres and six childcare organisations. The intervention arm received an MMR information leaflet and participated in the intervention (parent meeting). The control arm received the leaflet only. The primary outcome was decisional conflict. Secondary outcomes were actual and intended MMR choice, knowledge, attitude, concern and necessity beliefs about MMR, and anxiety. Results: Decisional conflict decreased for both arms to a level at which an 'effective' MMR decision could be made one week (effect estimate = -0.54, p < 0.001) and three months (effect estimate = -0.60, p < 0.001) post-intervention. There was no significant difference between arms (effect estimate = 0.07, p = 0.215). Heightened decisional conflict was evident for parents making the MMR decision for their first child (effect estimate = -0.25, p = 0.003), who were concerned (effect estimate = 0.07, p < 0.001), had less positive attitudes (effect estimate = -0.20, p < 0.001) yet stronger intentions (effect estimate = 0.09, p = 0.006).
Significantly more parents in the intervention arm reported vaccinating their child (93% versus 73%, p = 0.04). Conclusions: Whilst both the leaflet and the parent meeting reduced parents' decisional conflict, the parent meeting appeared to enable parents to act upon their decision, leading to vaccination uptake. PMID:21679432
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-26
... estimate of 3 minutes per Medication Guide for pharmacists to comply with the requirements is miscalculated... as the estimated burden for pharmacists to distribute Medication Guides to patients. (Comment 2) One... should be a tool to enhance the level of care to consumers, rather than a hindrance to pharmacists in...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
... interested in being approved DV user auditors are required to submit an application to the Board on ABC Form No. 53, ``Application for Direct Verifiable (DV) Program Auditors.'' To ensure compliance with the... to qualify as auditors under the DV program. Estimated Number of Respondents: 175. Estimated Number...
Net Operating Working Capital, Capital Budgeting, and Cash Budgets: A Teaching Example
ERIC Educational Resources Information Center
Tuner, James A.
2016-01-01
Many introductory finance texts present information on the capital budgeting process, including estimation of project cash flows. Typically, estimation of project cash flows begins with a calculation of net income. Getting from net income to cash flows requires accounting for non-cash items such as depreciation. Also important is the effect of…
A method to combine remotely sensed and in situ measurements: Program documentation
NASA Technical Reports Server (NTRS)
Peck, E. L.; Johnson, E. R.; Wong, M. Y.
1984-01-01
All user and programmer information required for using the correlation area method (CAM) program is presented. This program combines measurements of hydrologic variables from all measurement technologies to produce estimated areal mean values. The method accounts for sampling geometries and measurement accuracies and provides a measure of the accuracy of the estimated mean areal value.
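Combining measurements of the same areal mean while accounting for their accuracies is, at its simplest, inverse-variance weighting. The full correlation area method additionally models sampling geometry, which this sketch omits:

```python
def combined_estimate(values, variances):
    """Inverse-variance weighting: fuse measurements of the same areal mean
    from technologies of differing accuracy, and report the accuracy
    (variance) of the combined estimate."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    return mean, 1.0 / sum(weights)

# Remotely sensed estimate (10.0, variance 4.0) fused with an
# in situ gauge measurement (12.0, variance 1.0)
mean, var = combined_estimate([10.0, 12.0], [4.0, 1.0])
print(mean, var)  # → 11.6 0.8
```

Note that the combined variance (0.8) is smaller than either input variance: fusing the two technologies yields a more accurate areal mean than either alone, which is the point of the method.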
Risk analysis for biological hazards: What we need to know about invasive species
Stohlgren, T.J.; Schnase, J.L.
2006-01-01
Risk analysis for biological invasions is similar to other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formulation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the “potential” distribution and abundance; estimates of the potential rate of spread; and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations and information science and technology needs).
Confidence in outcome estimates from systematic reviews used in informed consent.
Fritz, Robert; Bauer, Janet G; Spackman, Sue S; Bains, Amanjyot K; Jetton-Rangel, Jeanette
2016-12-01
Evidence-based dentistry now guides informed consent, in which clinicians are obliged to provide patients with the most current best evidence, or best estimates of outcomes, for regimens, therapies, treatments, procedures, materials, and equipment or devices when developing personal oral health care treatment plans. Yet clinicians require that the estimates provided by systematic reviews be verified for validity and reliability, and contextualized as to performance competency, so that clinicians may have confidence in explaining outcomes to patients in clinical practice. The purpose of this paper was to describe types of informed estimates from which clinicians may have confidence in their capacity to assist patients in competent decision-making, one of the most important concepts of informed consent. Using systematic review methodology, researchers provide clinicians with valid best estimates of outcomes regarding a subject of interest from best evidence. Best evidence is verified through critical appraisals using acceptable sampling methodology, either by scoring instrument (Timmer analysis) or checklist (GRADE), a Cochrane Collaboration standard that allows transparency in open reviews. These valid best estimates are then tested for reliability using large databases. Finally, valid and reliable best estimates are assessed for meaning using quantification of margins and uncertainties. Through manufacturer and researcher specifications, quantification of margins and uncertainties develops a performance-competency continuum by which valid, reliable best estimates may be contextualized: at a lowest-margin performance competency (structural failure), a high-margin performance competency (estimated true value of success), or clinically determined critical values (clinical failure).
Informed consent may be achieved when clinicians are confident of their ability to provide useful and accurate best estimates of outcomes regarding regimens, therapies, treatments, and equipment or devices to patients in their clinical practices and when developing personal, oral health care, treatment plans. Copyright © 2016 Elsevier Inc. All rights reserved.
Lightweight, Miniature Inertial Measurement System
NASA Technical Reports Server (NTRS)
Tang, Liang; Crassidis, Agamemnon
2012-01-01
A miniature, lightweight, and highly accurate inertial navigation system (INS) is coupled with GPS receivers to provide stable and highly accurate positioning, attitude, and inertial measurements while being subjected to highly dynamic maneuvers. In contrast to conventional methods that use extensive, ground-based, real-time tracking and control units that are expensive, large, and require excessive amounts of power to operate, this method focuses on the development of an estimator that makes use of a low-cost, miniature accelerometer array fused with traditional measurement systems and GPS. Through the use of a position tracking estimation algorithm, onboard accelerometers are numerically integrated and transformed using attitude information to obtain an estimate of position in the inertial frame. Position and velocity estimates are subject to drift due to accelerometer sensor bias and high vibration over time, and so require integration with GPS information using a Kalman filter to provide highly accurate and reliable inertial tracking estimations. The method implemented here uses the local gravitational field vector. Upon determining the location of the local gravitational field vector relative to two consecutive sensors, the orientation of the device may be estimated and the attitude determined. Improved attitude estimates further enhance the inertial position estimates. The device can be powered either by batteries or by the power source onboard its target platforms. A DB9 port provides the I/O to external systems, and the device is designed to be mounted in a waterproof case for all-weather conditions.
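The drift-correction scheme described in this abstract, dead reckoning corrected by intermittent GPS fixes through a Kalman filter, can be illustrated with a minimal one-dimensional sketch. All numbers below are hypothetical, chosen only to show the behavior; the flight system itself is multi-axis and far more elaborate.

```python
import random

def kalman_fuse(dead_reckoned, gps_fixes, q=1.0, r=1.0):
    """Estimate and remove dead-reckoning drift with a scalar Kalman filter.

    q: process noise variance (how fast drift can grow per step).
    r: GPS measurement noise variance.
    """
    x, p = 0.0, 1.0                  # drift estimate and its covariance
    corrected = []
    for dr, gps in zip(dead_reckoned, gps_fixes):
        p += q                       # predict: drift uncertainty grows
        if gps is not None:          # update only when a GPS fix arrives
            innovation = (dr - gps) - x   # observed drift minus estimate
            k = p / (p + r)               # Kalman gain
            x += k * innovation
            p *= 1.0 - k
        corrected.append(dr - x)     # subtract the estimated drift
    return corrected

random.seed(0)
truth = [0.1 * t for t in range(200)]            # true position (m)
drift = [0.0025 * t * t for t in range(200)]     # bias-induced drift (m)
dead_reckoned = [a + b for a, b in zip(truth, drift)]
gps = [pos + random.gauss(0.0, 1.0) if t % 10 == 0 else None
       for t, pos in enumerate(truth)]           # noisy fix every 10 steps
fused = kalman_fuse(dead_reckoned, gps)
raw_err = abs(dead_reckoned[-1] - truth[-1])     # ~99 m of accumulated drift
fused_err = abs(fused[-1] - truth[-1])
print(round(raw_err, 1), round(fused_err, 1))
```

With these invented values the uncorrected track drifts by roughly 99 m over 200 steps, while the fused track ends an order of magnitude closer to the truth.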
Buried transuranic wastes at ORNL: Review of past estimates and reconciliation with current data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trabalka, J.R.
1997-09-01
Inventories of buried (generally meaning disposed of) transuranic (TRU) wastes at Oak Ridge National Laboratory (ORNL) have been estimated for site remediation and waste management planning over a period of about two decades. Estimates were required because of inadequate waste characterization and incomplete disposal records. For a variety of reasons, including changing definitions of TRU wastes, differing objectives for the estimates, and poor historical data, the published results have sometimes been in conflict. The purpose of this review was (1) to attempt to explain both the rationale for and differences among the various estimates, and (2) to update the estimates based on more recent information obtained from waste characterization and from evaluations of ORNL waste databases and historical records. The latter included information obtained from an expert panel's review and reconciliation of inconsistencies in data identified during preparation of the ORNL input for the third revision of the Baseline Inventory Report for the Waste Isolation Pilot Plant. The results summarize current understanding of the relationships among past estimates of buried TRU wastes and provide the most up-to-date information on recorded burials. The limitations of available information on the latter, and thus the need for improved waste characterization, are highlighted.
NASA Technical Reports Server (NTRS)
Holdeman, J. D.
1979-01-01
Three analytical problems in estimating the frequency at which commercial airline flights will encounter high cabin ozone levels are formulated and solved: namely, estimating flight-segment mean levels, estimating maximum-per-flight levels, and estimating the maximum average level over a specified flight interval. For each problem, solution procedures are given for different levels of input information - from complete cabin ozone data, which provides a direct solution, to limited ozone information, such as ambient ozone means and standard deviations, with which several assumptions are necessary to obtain the required estimates. Each procedure is illustrated by an example case calculation that uses simultaneous cabin and ambient ozone data obtained by the NASA Global Atmospheric Sampling Program. Critical assumptions are discussed and evaluated, and the several solutions for each problem are compared. Example calculations are also performed to illustrate how variations in latitude, altitude, season, retention ratio, flight duration, and cabin ozone limits affect the estimated probabilities.
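The limited-information case mentioned above (only an ambient ozone mean and standard deviation, plus a cabin retention ratio) reduces, under a normality assumption, to a one-line tail-probability calculation. The numbers here are invented for illustration and are not GASP data.

```python
import math

def prob_cabin_exceeds(limit_ppb, ambient_mean, ambient_sd, retention_ratio):
    """P(cabin ozone > limit) assuming normally distributed ambient ozone
    and cabin level = retention_ratio * ambient level."""
    ambient_threshold = limit_ppb / retention_ratio  # limit mapped to ambient
    z = (ambient_threshold - ambient_mean) / ambient_sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))       # upper-tail probability

# Invented inputs: 100 ppb cabin limit, ambient mean 150 ppb with sd 80 ppb,
# and a cabin retention ratio of 0.8.
p = prob_cabin_exceeds(100.0, 150.0, 80.0, 0.8)
print(round(p, 2))  # about 0.62
```

Raising the limit or lowering the retention ratio drives the exceedance probability down, as expected.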
Mauro, Francisco; Monleon, Vicente J; Temesgen, Hailemariam; Ford, Kevin R
2017-01-01
Forest inventories require estimates and measures of uncertainty for subpopulations such as management units. These units often have small sample sizes, so they should be regarded as small areas. When auxiliary information is available, different small area estimation methods have been proposed to obtain reliable estimates for small areas. Unit level empirical best linear unbiased predictors (EBLUPs) based on plot or grid unit level models have been studied more thoroughly than area level EBLUPs, where the modelling occurs at the management unit scale. Area level EBLUPs do not require precise plot positioning and allow the use of variable radius plots, thus reducing fieldwork costs. However, their performance has not been examined thoroughly. We compared unit level and area level EBLUPs, using LiDAR auxiliary information collected for inventorying a 98,104 ha coastal coniferous forest. Unit level models were consistently more accurate than area level EBLUPs, and area level EBLUPs were consistently more accurate than field estimates except for large management units that held a large sample. For stand density, volume, basal area, quadratic mean diameter, mean height and Lorey's height, root mean squared errors (RMSEs) of estimates obtained using area level EBLUPs were, on average, 1.43, 2.83, 2.09, 1.40, 1.32 and 1.64 times larger than those based on unit level estimates, respectively. Similarly, direct field estimates had RMSEs that were, on average, 1.37, 1.45, 1.17, 1.17, 1.26, and 1.38 times larger than RMSEs of area level EBLUPs. Therefore, area level models can lead to substantial gains in accuracy compared to direct estimates, and unit level models lead to very important gains in accuracy compared to area level models, potentially justifying the additional costs of obtaining accurate field plot coordinates.
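The area level EBLUP's accuracy gain over direct field estimates comes from shrinkage, which the following Fay-Herriot-style sketch illustrates with hypothetical volume figures (a real implementation would also fit the LiDAR regression and estimate the model variance from the data):

```python
def area_level_eblup(direct, sampling_var, synthetic, model_var):
    """Composite estimate per unit: gamma weights the direct field estimate
    against the model-based (synthetic) prediction."""
    estimates = []
    for d, v, s in zip(direct, sampling_var, synthetic):
        gamma = model_var / (model_var + v)   # near 1 when the sample is large
        estimates.append(gamma * d + (1.0 - gamma) * s)
    return estimates

direct = [320.0, 280.0]        # direct volume estimates (m3/ha), invented
sampling_var = [25.0, 400.0]   # unit 2 has few plots, so high variance
synthetic = [300.0, 300.0]     # LiDAR regression predictions, invented
est = area_level_eblup(direct, sampling_var, synthetic, model_var=100.0)
print([round(e, 1) for e in est])  # [316.0, 296.0]
```

The well-sampled unit keeps most of its direct estimate, while the sparsely sampled unit is pulled strongly toward the LiDAR-based prediction.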
Monleon, Vicente J.; Temesgen, Hailemariam; Ford, Kevin R.
2017-01-01
Forest inventories require estimates and measures of uncertainty for subpopulations such as management units. These units often have small sample sizes, so they should be regarded as small areas. When auxiliary information is available, different small area estimation methods have been proposed to obtain reliable estimates for small areas. Unit level empirical best linear unbiased predictors (EBLUPs) based on plot or grid unit level models have been studied more thoroughly than area level EBLUPs, where the modelling occurs at the management unit scale. Area level EBLUPs do not require precise plot positioning and allow the use of variable radius plots, thus reducing fieldwork costs. However, their performance has not been examined thoroughly. We compared unit level and area level EBLUPs, using LiDAR auxiliary information collected for inventorying a 98,104 ha coastal coniferous forest. Unit level models were consistently more accurate than area level EBLUPs, and area level EBLUPs were consistently more accurate than field estimates except for large management units that held a large sample. For stand density, volume, basal area, quadratic mean diameter, mean height and Lorey's height, root mean squared errors (RMSEs) of estimates obtained using area level EBLUPs were, on average, 1.43, 2.83, 2.09, 1.40, 1.32 and 1.64 times larger than those based on unit level estimates, respectively. Similarly, direct field estimates had RMSEs that were, on average, 1.37, 1.45, 1.17, 1.17, 1.26, and 1.38 times larger than RMSEs of area level EBLUPs. Therefore, area level models can lead to substantial gains in accuracy compared to direct estimates, and unit level models lead to very important gains in accuracy compared to area level models, potentially justifying the additional costs of obtaining accurate field plot coordinates. PMID:29216290
Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Smith, Mark S.
2008-01-01
Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.
Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Smith, Mark S.
2010-01-01
Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors, prediction cases, and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N; White, Devin A; Urban, Marie L
2013-01-01
The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life, based largely on information available in open sources. Currently, activity-based density estimates are based on simple summary statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Because the PDT effort considers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty were well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution that serves as the prior for the Beta distribution's parameters. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that meets these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
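As a deliberately simplified stand-in for the elicitation idea (this is not the paper's bivariate Gaussian construction), a contributor's two non-statistical answers, a typical proportion and a rough spread, can be moment-matched to Beta parameters:

```python
def beta_from_elicited(mean, sd):
    """Moment-match a Beta(a, b) to an elicited mean and spread.
    Valid only when sd**2 < mean * (1 - mean)."""
    var = sd * sd
    if not 0.0 < mean < 1.0 or var >= mean * (1.0 - mean):
        raise ValueError("inconsistent elicited mean/spread")
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

# Invented elicitation: "about 30% of residents, give or take 10 points".
a, b = beta_from_elicited(0.3, 0.1)
print(round(a, 1), round(b, 1))  # 6.0 14.0
```

The resulting Beta(6, 14) has mean 0.3 and standard deviation 0.1, reproducing the contributor's answers.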
78 FR 33466 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-04
... who are vulnerable to being struck by moving cars as they inspect or service equipment on a particular... are being moved in transportation. Annual Estimated Burden: 15,750 hours. Addressee: Send comments... requirements are being submitted for clearance by OMB as required by the PRA. Title: Filing of Dedicated Cars...
Time On Station Requirements: Costs, Policy Change, and Perceptions
2016-12-01
Travel Management Office (2016). ... Table 3. Time it took spouses to find... NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. MBA PROFESSIONAL REPORT: TIME ON STATION REQUIREMENTS: COSTS, POLICY CHANGE, AND... reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching
High throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have no data due to the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vi...
Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrouchov, George; Doll, William E.; Beard, Les P.
2009-01-01
Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
Economic burden of informal care attributable to stroke among those aged 65 years or older in China.
Joo, Heesoo; Liang, Di
2017-02-01
Stroke is a leading cause of disability in China, frequently resulting in the need for informal care. No information, however, has been available on the costs of informal care associated with stroke, which is required to understand the true cost of stroke in China. Using the 2011 China Health and Retirement Longitudinal Study, we identified 4447 respondents aged ≥65 years suitable for analysis, including 184 stroke survivors. We estimated the economic burden of informal care associated with stroke using a two-part model. The monthly number of hours of informal caregiving associated with stroke was 29.2 h per stroke survivor, and the average annual cost of informal care associated with stroke was 10,612 RMB per stroke survivor. The findings stress the necessity of proper interventions to prevent stroke and will be useful for estimating the economic burden of stroke.
78 FR 37277 - Proposed Collection: Comment Request for Form 1099-S
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-20
...: 1099-S. Abstract: Internal Revenue Code section 6045(e) and the regulations there under require persons... displays a valid OMB control number. Books or records relating to a collection of information must be... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance...
78 FR 57219 - Proposed Collection; Comment Request for Form W-2G
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... Locality) at the request of TIGERS (Tax Information Group for E-Commerce Requirements Standardization) and the e-Channel Support e-Initiatives Group. The new boxes are added for the use of state and local... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance...
75 FR 76468 - Agency Information Collection Request; 30-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-08
... Coordinator for Health Information Technology (ONC), HHS. In compliance with the requirement of section 3506(c... members, and competency exam takers; and a Web-based survey of community college faculty. Estimated... Workforce program. Focus groups with Exam takers Competency exam 32 1 1.5 48 takers not enrolled in...
76 FR 76162 - Agency Information Collection Activities; Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-06
... Franchise Rule, and it mirrors the requirements and prohibitions of the original Franchise Rule. The FTC... Franchise Rule. Staff estimates that 250 or so new business opportunity sellers will enter the market each... x 3 hours per seller)). \\3\\ Based upon staff's informal discussions with several franchises in...
76 FR 81909 - Notice of Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
..., financial feasibility determinations and loan security determinations as required by the Con Act. Estimate... Industry Loan Program. DATES: Comments on this notice must be received by February 27, 2012 to be assured... for TDD users. SUPPLEMENTARY INFORMATION: Title: Business and Industry Loan Program. OMB Number: 0570...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... driving practices when these vehicles are operated. Estimated Annual Burden: 300 hours. Number of... information because of model changes. The light truck manufacturers gather only pre-existing data for the... average of $35.00 per hour for professional and clerical staff to gather data, distribute and print...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... paperwork requirements for the USGS Earthquake Report. We may not conduct or sponsor and a person is not.... SUPPLEMENTARY INFORMATION: OMB Control Number: 1028-0048. Title: USGS Earthquake Report. Type of Request...: Voluntary. Frequency of Collection: On occasion, after an earthquake. Estimated Completion Time: 6 minutes...
NASA Astrophysics Data System (ADS)
1980-09-01
The energy emergency management information system (EEMIS) has responsibility for providing special information and communication services to government officials at Federal and state levels, who must deal with energy emergencies. Because of proprietary information residing in the data base used for federal purposes, a special system (EEMIS-S) must be established for use by the states. It is planned to acquire teleprocessing services for EEMIS-S from a time-sharing commercial vendor, and the process for procurement must meet guidelines for approval. The work plan and schedule for meeting these guidelines are discussed. The tasks include estimates of the time, cost, and resources required, all of which are briefly described.
A decision directed detector for the phase incoherent Gaussian channel
NASA Technical Reports Server (NTRS)
Kazakos, D.
1975-01-01
A vector digital signalling scheme is proposed for simultaneous adaptive data transmission and phase estimation. The use of maximum likelihood estimation methods predicts a better performance than the phase-locked loop. The phase estimate is shown to converge to the true value, so that the adaptive nature of the detector effectively achieves phase acquisition and improvement in performance. No separate synchronization interval is required and phase fluctuations can be tracked simultaneously with the transmission of information.
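The maximum likelihood phase estimate at the heart of such a decision-directed detector can be sketched simply: for received samples r_k = s_k e^{jθ} + n_k with symbol decisions s_k, the ML estimate of θ is the angle of Σ r_k s_k*. The simulation below uses the true BPSK symbols as stand-ins for correct decisions, with invented noise levels.

```python
import cmath
import random

def ml_phase(received, decisions):
    """ML carrier-phase estimate: angle of sum(r_k * conj(s_k))."""
    acc = sum(r * complex(s).conjugate() for r, s in zip(received, decisions))
    return cmath.phase(acc)

random.seed(2)
theta = 0.7                                                 # true offset (rad)
symbols = [random.choice([1.0, -1.0]) for _ in range(500)]  # BPSK decisions
received = [s * cmath.exp(1j * theta)
            + complex(random.gauss(0.0, 0.3), random.gauss(0.0, 0.3))
            for s in symbols]
est = ml_phase(received, symbols)
print(round(est, 2))  # close to the true 0.7 rad
```

Because the estimate is a running sum over decisions, it can track slow phase fluctuations without a separate synchronization interval, which is the property the abstract highlights.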
Incorporation of MRI-AIF Information For Improved Kinetic Modelling of Dynamic PET Data
NASA Astrophysics Data System (ADS)
Sari, Hasan; Erlandsson, Kjell; Thielemans, Kris; Atkinson, David; Ourselin, Sebastien; Arridge, Simon; Hutton, Brian F.
2015-06-01
In the analysis of dynamic PET data, compartmental kinetic analysis methods require accurate knowledge of the arterial input function (AIF). Although arterial blood sampling is the gold standard among the methods used to measure the AIF, it is usually not preferred as it is an invasive method. An alternative is the simultaneous estimation method (SIME), where physiological parameters and the AIF are estimated together, using information from different anatomical regions. Due to the large number of parameters to estimate in its optimisation, SIME is a computationally complex method and may sometimes fail to give accurate estimates. In this work, we try to improve SIME by utilising an input function derived from a simultaneously obtained DSC-MRI scan. With the assumption that the true value of one of the six parameters of the PET-AIF model can be derived from an MRI-AIF, the method is tested using simulated data. The results indicate that SIME can yield more robust results when the MRI information is included, with a significant reduction in the absolute bias of Ki estimates.
A Height Estimation Approach for Terrain Following Flights from Monocular Vision
Campos, Igor S. G.; Nascimento, Erickson R.; Freitas, Gustavo M.; Chaimowicz, Luiz
2016-01-01
In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm presented good accuracy. PMID:27929424
ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES
LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.
2008-01-01
Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
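One simple way to combine per-source reliabilities into a confidence that an interaction is real is a noisy-OR rule, shown below as a hedged illustration (the paper compares more refined combination methods, and the reliability values here are invented):

```python
def combined_confidence(reliabilities):
    """Noisy-OR combination: probability the interaction is real given that
    several sources with the given reliabilities all report it, assuming
    the sources err independently."""
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= 1.0 - r
    return 1.0 - p_all_wrong

# Invented reliabilities: yeast two-hybrid 0.5, co-purification 0.8.
print(round(combined_confidence([0.5, 0.8]), 3))  # 0.9
```

Agreement between two mediocre sources thus yields higher confidence than either alone, which is the intuition behind weighting interactions by source reliability.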
Whiteford, Harvey; Buckingham, Bill; Harris, Meredith; Diminic, Sandra; Stockings, Emily; Degenhardt, Louisa
2017-08-01
A population health approach to mental health service planning requires estimates that align interventions with the needs of people with mental illness. The primary objective was to estimate the number of people in Australia living with severe and persistent mental illness who have complex, multi-agency needs. The secondary objective was to describe the possible service needs of individuals with severe mental illness. We disaggregated the estimated 12-month prevalence of adults with severe mental illness into needs-based sub-groups, using multiple data sources. Possible service needs of 1825 adults with psychotic disorders and 334 adults with severe past-year affective and/or anxiety disorders were described using data from the 2010 Survey of High Impact Psychosis and the 2007 National Survey of Mental Health and Wellbeing, respectively. Using the best available data, we estimated that 3.3% of adults experience a severe mental illness each year, of whom one-third (1.1% of adults) experience a persistent mental illness that requires ongoing services to address residual disability. Among those with severe and persistent mental illness, one-third (0.4% of adults, or 59,000 adults in 2015) have complex needs requiring multi-agency support to maximise their health, housing, social participation and personal functioning. Survey of High Impact Psychosis data indicated that among adults with psychotic disorders, use of accommodation services (40%) and non-government services (30%) and receipt of income support (85%) were common, as were possible needs for support with socialising, personal care and employment. National Survey of Mental Health and Wellbeing data indicated that among individuals with severe affective and anxiety disorders, receipt of income support (37%) was common (information on accommodation and non-government support services was not available), as were possible needs for financial management and employment support.
Agreed indicators of complex, multi-agency needs are required to refine these estimates. Closer alignment of information collected about possible service needs across epidemiological surveys is needed.
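The prevalence cascade in this abstract is straightforward multiplication, which the sketch below reproduces; note that the adult population base is back-calculated from the reported 59,000 figure rather than taken from an official source.

```python
# Prevalence figures from the abstract; the adult population base is
# back-calculated from the reported 59,000, not an official statistic.
severe = 0.033                  # 12-month prevalence, severe mental illness
persistent = severe / 3         # one-third also persistent -> ~1.1%
complex_needs = persistent / 3  # one-third of those need multi-agency support
print(round(100 * persistent, 1), round(100 * complex_needs, 1))  # 1.1 0.4

implied_adults = 59_000 / complex_needs
print(round(implied_adults / 1e6, 1))  # ~16.1 million adults implied
```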
Jay M. Ver Hoef; Hailemariam Temesgen; Sergio Gómez
2013-01-01
Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of areal totals or averages are required. In this paper, two approaches for mapping and estimating totals, the spatial linear model (SLM) and k-NN (k-Nearest Neighbor), are compared, theoretically,...
NASA Astrophysics Data System (ADS)
Meshgi, Ali; Schmitter, Petra; Babovic, Vladan; Chui, Ting Fong May
2014-11-01
Developing reliable methods to estimate stream baseflow has been a subject of interest due to its importance in catchment response and sustainable watershed management. However, to date, in the absence of complex numerical models, baseflow is most commonly estimated using statistically derived empirical approaches that do not directly incorporate physically-meaningful information. On the other hand, Artificial Intelligence (AI) tools such as Genetic Programming (GP) offer unique capabilities to reduce the complexities of hydrological systems without losing relevant physical information. This study presents a simple-to-use empirical equation to estimate baseflow time series using GP so that minimal data is required and physical information is preserved. A groundwater numerical model was first adopted to simulate baseflow for a small semi-urban catchment (0.043 km2) located in Singapore. GP was then used to derive an empirical equation relating baseflow time series to time series of groundwater table fluctuations, which are relatively easily measured and are physically related to baseflow generation. The equation was then generalized for approximating baseflow in other catchments and validated for a larger vegetation-dominated basin located in the US (24 km2). Overall, this study used GP to propose a simple-to-use equation to predict baseflow time series based on only three parameters: minimum daily baseflow of the entire period, area of the catchment and groundwater table fluctuations. It serves as an alternative approach for baseflow estimation in un-gauged systems when only groundwater table and soil information is available, and is thus complementary to other methods that require discharge measurements.
78 FR 7762 - Information Collection; Submission for OMB Review, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-04
... verify their eligibility, and by both parties to satisfy certain legal requirements. Type of Review... per Response: 10 minutes. Estimated Total Burden Hours: 633 hours. Total Burden Cost (capital/startup...
77 FR 67026 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-08
.... Estimated Time per Response: 30-45 minutes. Frequency: On occasion. Total Burden Cost (capital/startup): $3996. Total Burden Costs (operation/maintenance): $54,732. Dated: October 31, 2012. Mary Ziegler...
Vehicle States Observer Using Adaptive Tire-Road Friction Estimator
NASA Astrophysics Data System (ADS)
Kwak, Byunghak; Park, Youngjin
The vehicle stability control system is a new idea that can enhance vehicle stability and handling in emergency situations. This system requires information on the yaw rate, sideslip angle and road friction in order to control the traction and braking forces at the individual wheels. This paper proposes an observer for the vehicle stability control system. The observer consists of a state observer for vehicle motion estimation and a road condition estimator for identification of the coefficient of road friction. The state observer uses a two-degree-of-freedom bicycle model and estimates the system variables based on the Kalman filter. The road condition estimator uses the same vehicle model and identifies the coefficient of tire-road friction based on the recursive least squares method. Both estimators make use of each other's information. We show the effectiveness and feasibility of the proposed scheme under various road conditions through computer simulations of a fifteen-degree-of-freedom non-linear vehicle model.
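The road condition estimator's recursive least squares idea can be sketched for the simplest possible model, a single friction coefficient mu in F = mu*Fz, with invented loads and noise (the paper's estimator works on a full bicycle model):

```python
import random

def rls_update(theta, p, phi, y, lam=0.98):
    """One recursive least squares step for the scalar model y = theta*phi,
    with forgetting factor lam so slowly changing roads can be tracked."""
    k = p * phi / (lam + phi * p * phi)    # gain
    theta = theta + k * (y - theta * phi)  # correct the estimate
    p = (1.0 - k * phi) * p / lam          # update covariance
    return theta, p

random.seed(1)
mu_true = 0.85                 # dry-asphalt-like friction (illustrative)
theta, p = 0.0, 100.0          # initial guess and large initial covariance
for _ in range(300):
    fz = random.uniform(2000.0, 6000.0)         # wheel normal load (N)
    f = mu_true * fz + random.gauss(0.0, 50.0)  # noisy friction force (N)
    theta, p = rls_update(theta, p, fz, f)
print(round(theta, 2))  # converges near 0.85
```

The forgetting factor below 1 keeps the covariance from collapsing to zero, so the estimator stays responsive when the road surface changes.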
Autonomous frequency domain identification: Theory and experiment
NASA Technical Reports Server (NTRS)
Yam, Yeung; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.; Scheid, R. E.
1989-01-01
The analysis, design, and on-orbit tuning of robust controllers require more information about the plant than simply a nominal estimate of the plant transfer function. Information is also required concerning the uncertainty in the nominal estimate, or more generally, the identification of a model set within which the true plant is known to lie. The identification methodology that was developed and experimentally demonstrated makes use of a simple but useful characterization of the model uncertainty based on the output error. This is a characterization of the additive uncertainty in the plant model, which has found considerable use in many robust control analysis and synthesis techniques. The identification process is initiated by a stochastic input u which is applied to the plant p, giving rise to the output y. The spectral estimate ĥ = P_uy/P_uu is used as an estimate of p, and the model order is estimated using the product moment matrix (PMM) method. A parametric model p̂ is then determined by curve fitting the spectral estimate to a rational transfer function. The additive uncertainty δ_m = p − p̂ is then estimated by the cross-spectral estimate δ̂ = P_ue/P_uu, where e = y − ŷ is the output error and ŷ = p̂u is the computed output of the parametric model subjected to the actual input u. The experimental results demonstrate that the curve-fitting algorithm produces the reduced-order plant model that minimizes the additive uncertainty. The nominal transfer function estimate p̂ and the estimate δ̂ of the additive uncertainty δ_m are subsequently available for optimization of robust controller performance and stability.
Estimating nest detection probabilities for white-winged dove nest transects in Tamaulipas, Mexico
Nichols, J.D.; Tomlinson, R.E.; Waggerman, G.
1986-01-01
Nest transects in nesting colonies provide one source of information on White-winged Dove (Zenaida asiatica asiatica) population status and reproduction. Nests are counted along transects using standardized field methods each year in Texas and northeastern Mexico by personnel associated with Mexico's Office of Flora and Fauna, the Texas Parks and Wildlife Department, and the U.S. Fish and Wildlife Service. Nest counts on transects are combined with information on the size of nesting colonies to estimate total numbers of nests in sampled colonies. Historically, these estimates have been based on the actual nest counts on transects and thus have required the assumption that all nests lying within transect boundaries are detected (seen) with a probability of one. Our objectives were to test the hypothesis that nest detection probability is one and, if rejected, to estimate this probability.
Assessment of Beddown Alternatives for the F-35
2013-01-01
A Novel Estimator for the Rate of Information Transfer by Continuous Signals
Takalo, Jouni; Ignatova, Irina; Weckström, Matti; Vähäsöyrinki, Mikko
2011-01-01
The information transfer rate provides an objective and rigorous way to quantify how much information is being transmitted through a communications channel whose input and output consist of time-varying signals. However, current estimators of information content in continuous signals are typically based on assumptions about the system's linearity and signal statistics, or they require prohibitive amounts of data. Here we present a novel information rate estimator without these limitations that is also optimized for computational efficiency. We validate the method with a simulated Gaussian information channel and demonstrate its performance with two example applications. Information transfer between the input and output signals of a nonlinear system is analyzed using a sensory receptor neuron as the model system. Then, a climate data set is analyzed to demonstrate that the method can be applied to a system based on two outputs generated by interrelated random processes. These analyses also demonstrate that the new method offers consistent performance in situations where classical methods fail. In addition to these examples, the method is applicable to a wide range of continuous time series commonly observed in the natural sciences, economics and engineering. PMID:21494562
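For the Gaussian-channel validation case mentioned above, a classical baseline any new estimator can be compared against is Shannon's coherence-based lower bound on the information rate. A minimal sketch; the channel, sampling rate, and noise level are invented for illustration (with noise standard deviation 0.5 the per-bin coherence is about 0.8, giving roughly 2.3 bits/s per Hz over the 500 Hz band):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs, n = 1000.0, 1 << 16
x = rng.standard_normal(n)                   # channel input
y = x + 0.5 * rng.standard_normal(n)         # additive Gaussian noise, SNR = 4

f, gamma2 = coherence(x, y, fs=fs, nperseg=1024)
gamma2 = np.clip(gamma2, 0.0, 1.0 - 1e-12)   # guard the logarithm
# Shannon lower bound for a Gaussian channel: R = -integral log2(1 - coh) df
df = f[1] - f[0]
rate = float(-(np.log2(1.0 - gamma2)).sum() * df)   # bits per second
```

This bound is exact only for jointly Gaussian, linearly related signals, which is precisely the limitation the abstract's estimator is designed to avoid.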
Mist net effort required to inventory a forest bat species assemblage.
Theodore J. Weller; Danny C. Lee
2007-01-01
Little quantitative information exists about the survey effort necessary to inventory temperate bat species assemblages. We used a bootstrap resampling algorithm to estimate the number of mist net surveys required to capture individuals from 9 species at both study area and site levels using data collected in a forested watershed in northwestern California, USA, during...
ERIC Educational Resources Information Center
Almond, Russell G.
2007-01-01
Over the course of instruction, instructors generally collect a great deal of information about each student. Integrating that information intelligently requires models for how a student's proficiency changes over time. Armed with such models, instructors can "filter" the data--more accurately estimate the student's current proficiency…
Informed spectral analysis: audio signal parameter estimation using side information
NASA Astrophysics Data System (ADS)
Fourer, Dominique; Marchand, Sylvain
2013-12-01
Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach, which consists of directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by the estimation from the signal, may require a larger bitrate, and loses compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling, which is a well-known model with practical applications and for which theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
Evapotranspiration and remote sensing
NASA Technical Reports Server (NTRS)
Schmugge, T. J.; Gurney, R.
1982-01-01
There are three things required for evapotranspiration to occur: (1) energy (580 cal/gm) for the change of phase of the water; (2) a source of the water, i.e., adequate soil moisture in the surface layer or in the root zone of the plant; and (3) a sink for the water, i.e., a moisture deficit in the air above the ground. Remote sensing can contribute information to the first two of these conditions by providing estimates of solar insolation, surface albedo, surface temperature, vegetation cover, and soil moisture content. In addition there have been attempts to estimate precipitation and shelter air temperature from remotely sensed data. The problem remains to develop methods for effectively using these sources of information to make large area estimates of evapotranspiration.
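The energy requirement in point (1) implies a simple energy-limited conversion from remotely sensed net radiation to an evapotranspiration depth. A minimal sketch, assuming evaporation is purely energy-limited and taking the latent heat of vaporization as about 2.45 MJ/kg (roughly 585 cal/g, consistent with the 580 cal/g figure above):

```python
LAMBDA = 2.45e6   # J/kg, latent heat of vaporization of water (~585 cal/g)

def et_mm_per_day(net_radiation_w_m2):
    # W/m^2 -> kg of water evaporated per m^2 per second -> mm/day
    # (1 kg of water over 1 m^2 is a 1 mm depth)
    kg_per_m2_s = net_radiation_w_m2 / LAMBDA
    return kg_per_m2_s * 86400.0
```

For example, 150 W/m^2 of net radiation can evaporate about 5.3 mm of water per day, which is why satellite estimates of insolation and albedo are the first ingredient of large-area ET estimates.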
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael
2014-01-01
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. In particular, the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data.
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
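The pooling idea behind the ensemble method can be illustrated with a minimal plug-in estimator for binary processes: triplets (y_{t+1}, y_t, x_t) are collected across trials rather than across time, so stationarity in time is not needed. This is a toy discrete estimator with one step of history, not the nearest-neighbour estimator the paper builds on:

```python
import math
from collections import Counter

def transfer_entropy(x_trials, y_trials):
    # Plug-in TE(X -> Y), pooling samples over an ensemble of trials.
    joint = Counter()                            # counts of (y_next, y_now, x_now)
    for x, y in zip(x_trials, y_trials):
        for t in range(len(y) - 1):
            joint[(y[t + 1], y[t], x[t])] += 1
    n = sum(joint.values())
    pair_yx, pair_ny, marg_y = Counter(), Counter(), Counter()
    for (yn, yc, xc), c in joint.items():
        pair_yx[(yc, xc)] += c                   # (y_now, x_now)
        pair_ny[(yn, yc)] += c                   # (y_next, y_now)
        marg_y[yc] += c
    te = 0.0
    for (yn, yc, xc), c in joint.items():
        p_full = c / pair_yx[(yc, xc)]           # p(y_next | y_now, x_now)
        p_hist = pair_ny[(yn, yc)] / marg_y[yc]  # p(y_next | y_now)
        te += (c / n) * math.log2(p_full / p_hist)
    return te
```

When each trial's Y simply copies X with a one-step lag, the estimator returns close to 1 bit per time step; for independent processes it returns nearly zero.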
NASA Astrophysics Data System (ADS)
Sobue, Shinichi; Yamazaki, Junichi; Matsumoto, Shuichi; Konishi, Hisahiro; Maejima, Hironori; Sasaki, Susumu; Kato, Manabu; Mitsuhashi, Seiji; Tachino, Junichi
The lunar explorer SELENE (also called KAGUYA) carried thirteen scientific mission instruments to reveal the origin and evolution of the Moon and to investigate its possible future utilization. In addition to the scientific instruments, a high-definition TV (HDTV) camera provided by the Japan Broadcasting Corporation (NHK) was carried on KAGUYA to promote public outreach. We usually use housekeeping telemetry data to derive the satellite attitude along with orbit determination and propagated information. However, it takes time to derive this information, since orbit determination and propagation require the use of the orbital model. When a malfunction of the KAGUYA reaction wheel occurred, we could not obtain correct attitude information, which meant we did not have a correct orbit determination in a timely fashion. However, when we checked the HDTV movies, we found that horizon information on the lunar surface derived from HDTV moving images, used as a horizon sensor, was very useful for detecting the attitude of KAGUYA. We then compared this information with the attitude information derived from orbital telemetry to validate the accuracy of the HDTV-derived estimation. This comparison showed good agreement for the pitch attitude, and we could estimate the pitch angle change during KAGUYA mission operations simply and quickly. In this study, we show the usefulness of the HDTV camera as a horizon sensor.
Algae Biofuels Co-Location Assessment Tool for Canada
DOE Office of Scientific and Technical Information (OSTI.GOV)
2011-11-29
The Algae Biofuels Co-Location Assessment Tool for Canada uses chemical stoichiometry to estimate nitrogen, phosphorus, and carbon atom availability from waste water and carbon dioxide emissions streams, and the requirements for those same elements to produce a unit of algae. This information is then combined to identify the limiting nutrient and estimate the potential productivity associated with waste water and carbon dioxide sources. Output is visualized in terms of distributions or spatial locations. Distances between points of interest in the model are calculated using the great circle distance equation, and the smallest distances are found by an exhaustive search-and-sort algorithm.
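The distance step is standard; a minimal sketch using the haversine form of the great-circle equation (the tool's exact formulation is not given in the abstract, so this form is an assumption):

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # haversine form of the great-circle distance: numerically
    # well-behaved even for closely spaced points
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2.0 * radius_km * math.asin(math.sqrt(a))
```

A quarter of the equator (0°N 0°E to 0°N 90°E) comes out to radius × π/2, about 10,008 km, which is a convenient sanity check.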
77 FR 63923 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-17
... Wellstone and Pete Domenici Mental Health Parity and Addiction Equity Act of 2008, which requires parity... forms of information technology; and (e) estimates of capital or start-up costs and costs of operation...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
...; exotic pet industry; hunters; taxidermists; commercial importers/exporters of wildlife and plants...: Required to obtain or retain a benefit. Frequency of Collection: On occasion. Estimated Number of Annual...
McKeown, Robert E; Holbrook, Joseph R; Danielson, Melissa L; Cuffe, Steven P; Wolraich, Mark L; Visser, Susanna N
2015-01-01
To determine the impact of varying attention-deficit/hyperactivity disorder (ADHD) diagnostic criteria, including new DSM-5 criteria, on prevalence estimates. Parent and teacher reports identified high- and low-screen children with ADHD from elementary schools in 2 states that produced a diverse overall sample. The parent interview stage included the Diagnostic Interview Schedule for Children-IV (DISC-IV), and up to 4 additional follow-up interviews. Weighted prevalence estimates, accounting for complex sampling, quantified the impact of varying ADHD criteria using baseline and the final follow-up interview data. At baseline 1,060 caregivers were interviewed; 656 had at least 1 follow-up interview. Teachers and parents reported 6 or more ADHD symptoms for 20.5% (95% CI = 18.1%-23.2%) and 29.8% (CI = 24.5%-35.6%) of children respectively, with criteria for impairment and onset by age 7 years (DSM-IV) reducing these proportions to 16.3% (CI = 14.7%-18.0%) and 17.5% (CI = 13.3%-22.8%); requiring at least 4 teacher-reported symptoms reduced the parent-reported prevalence to 8.9% (CI = 7.4%-10.6%). Revising age of onset to 12 years per DSM-5 increased the 8.9% estimate to 11.3% (CI = 9.5%-13.3%), with a similar increase seen at follow-up: 8.2% with age 7 onset (CI = 5.9%-11.2%) versus 13.0% (CI = 7.6%-21.4%) with onset by age 12. Reducing the number of symptoms required for those aged 17 and older increased the overall estimate to 13.1% (CI = 7.7%-21.5%). These findings quantify the impact on prevalence estimates of varying case definition criteria for ADHD. Further research of impairment ratings and data from multiple informants is required to better inform clinicians conducting diagnostic assessments. DSM-5 changes in age of onset and number of symptoms required for older adolescents appear to increase prevalence estimates, although the full impact is uncertain due to the age of our sample. Published by Elsevier Inc.
Electromagnetic Characterization of Inhomogeneous Media
2012-03-22
Engineering and Management, Air Force Institute of Technology, Air University, Air Education and Training Command, In Partial Fulfillment of the Requirements...found in the laboratory data; fun is the code that contains the theoretical formulation of S11, and beta0 is the initial constitutive parameter estimate...
Bianca N. I. Eskelson; Hailemariam Temesgen; Tara M. Barrett
2008-01-01
Many growth and yield simulators require a stand table or tree-list to set the initial condition for projections in time. Most similar neighbour (MSN) approaches can be used for estimating stand tables from information commonly available on forest cover maps (e.g. height, volume, canopy cover, and species composition). Simulations were used to compare MSN (using an...
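The imputation step behind MSN can be sketched as a nearest-neighbour lookup on standardized map attributes; note that full MSN weights the distance using canonical correlations between map and ground attributes, which this toy version omits:

```python
import numpy as np

def most_similar_neighbour(target, reference_attrs):
    # Index of the reference stand closest to `target` in standardized
    # attribute space (e.g. columns: height, volume, canopy cover).
    mu = reference_attrs.mean(axis=0)
    sd = reference_attrs.std(axis=0)
    z_ref = (reference_attrs - mu) / sd
    z_tgt = (np.asarray(target, dtype=float) - mu) / sd
    return int(np.argmin(np.linalg.norm(z_ref - z_tgt, axis=1)))

# The measured tree-list of the selected reference stand is then imputed
# to the target stand as the initial condition for growth projection.
```

Standardizing each attribute keeps large-magnitude variables such as volume from dominating the distance.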
Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I
2016-01-01
Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. These results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.
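The entropy-based adaptive step and the role of the prior can be sketched on a one-parameter toy problem. Everything here (the psychometric function, grids, prior centre and width) is invented for illustration; HADO itself constructs its informative prior hierarchically from previous observers' data rather than from a hand-picked Gaussian:

```python
import numpy as np

rng = np.random.default_rng(7)
grid = np.linspace(0.2, 3.0, 50)       # candidate threshold parameters
true_theta = 1.2                       # simulated observer's true threshold

def p_yes(stim, theta):
    # hypothetical psychometric function: P("detected") vs. stimulus level
    return 1.0 / (1.0 + np.exp(-4.0 * (stim - theta)))

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def adaptive_run(prior, stims, n_trials=20):
    post = prior / prior.sum()
    for _ in range(n_trials):
        # pick the stimulus minimising the expected posterior entropy
        best_s, best_h = stims[0], np.inf
        for s in stims:
            like = p_yes(s, grid)
            h = 0.0
            for lk in (like, 1.0 - like):
                marg = float((post * lk).sum())
                if marg > 1e-12:
                    h += marg * entropy(post * lk / marg)
            if h < best_h:
                best_s, best_h = s, h
        # simulate the observer's response and do the Bayesian update
        resp = rng.random() < p_yes(best_s, true_theta)
        like = p_yes(best_s, grid) if resp else 1.0 - p_yes(best_s, grid)
        post = post * like
        post /= post.sum()
    return post

stims = np.linspace(0.3, 2.5, 15)
diffuse = np.ones_like(grid)                          # noninformative prior
informed = np.exp(-0.5 * ((grid - 1.1) / 0.3) ** 2)   # informative prior
post_d = adaptive_run(diffuse, stims)
post_i = adaptive_run(informed, stims)
```

Starting from the informative prior, the posterior concentrates on the true threshold with fewer trials, which is the efficiency gain the study quantifies for the contrast sensitivity function.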
NASA Astrophysics Data System (ADS)
Lika, Konstadia; Kearney, Michael R.; Kooijman, Sebastiaan A. L. M.
2011-11-01
The covariation method for estimating the parameters of the standard Dynamic Energy Budget (DEB) model provides a single-step method of accessing all the core DEB parameters from commonly available empirical data. In this study, we assess the robustness of this parameter estimation procedure and analyse the role of pseudo-data using elasticity coefficients. In particular, we compare the performance of Maximum Likelihood (ML) vs. Weighted Least Squares (WLS) approaches and find that the two approaches tend to converge in performance as the number of uni-variate data sets increases, but that WLS is more robust when data sets comprise single points (zero-variate data). The efficiency of the approach is shown to be high, and the prior parameter estimates (pseudo-data) have very little influence if the real data contain information about the parameter values. For instance, the effects of the pseudo-value for the allocation fraction κ are reduced when there is information for both growth and reproduction, that for the energy conductance is reduced when information on age at birth and puberty is given, and the effects of the pseudo-value for the maturity maintenance rate coefficient are insignificant. The estimation of some parameters (e.g., the zoom factor and the shape coefficient) requires little information, while that of others (e.g., maturity maintenance rate, puberty threshold and reproduction efficiency) requires data at several food levels. The generality of the standard DEB model, in combination with the estimation of all of its parameters, allows comparison of species on the basis of parameter values. We discuss a number of preliminary patterns emerging from the present collection of parameter estimates across a wide variety of taxa. We make the observation that the estimated value of the fraction κ of mobilised reserve that is allocated to soma is far away from the value that maximises reproduction.
We recognise this as the reason why two very different parameter sets must exist that fit most data sets reasonably well, and give arguments why, in most cases, the set with the large value of κ should be preferred. The continued development of a parameter database through the estimation procedures described here will provide a strong basis for understanding evolutionary patterns in metabolic organisation across the diversity of life.
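The elasticity coefficients used above to quantify pseudo-data influence have a generic definition: the relative change in an estimated parameter per relative change in a pseudo-value. A minimal numerical sketch; the estimation function here is a stand-in, not the DEB covariation procedure:

```python
def elasticity(estimate, psi, h=1e-3):
    # e = (d theta / theta) / (d psi / psi), by forward finite difference;
    # values near zero mean the pseudo-value barely influences the estimate
    theta0 = estimate(psi)
    theta1 = estimate(psi * (1.0 + h))
    return ((theta1 - theta0) / theta0) / h

# A stand-in "estimator" proportional to psi**0.5 has elasticity 0.5,
# while an estimator that ignores psi entirely has elasticity 0.
```

In the study's terms, small elasticities for a pseudo-value indicate that the real data, not the prior guess, are driving that parameter's estimate.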
Exploration Planetary Surface Structural Systems: Design Requirements and Compliance
NASA Technical Reports Server (NTRS)
Dorsey, John T.
2011-01-01
The Lunar Surface Systems Project developed system concepts that would be necessary to establish and maintain a permanent human presence on the Lunar surface. A variety of specific system implementations were generated as a part of the scenarios, some level of system definition was completed, and masses estimated for each system. Because the architecture studies generally spawned a large number of system concepts and the studies were executed in a short amount of time, the resulting system definitions had very low design fidelity. This paper describes the development sequence required to field a particular structural system: 1) Define Requirements, 2) Develop the Design and 3) Demonstrate Compliance of the Design to all Requirements. This paper also outlines and describes in detail the information and data that are required to establish structural design requirements and outlines the information that would comprise a planetary surface system Structures Requirements document.
The fossilized birth–death process for coherent calibration of divergence-time estimates
Heath, Tracy A.; Huelsenbeck, John P.; Stadler, Tanja
2014-01-01
Time-calibrated species phylogenies are critical for addressing a wide range of questions in evolutionary biology, such as those that elucidate historical biogeography or uncover patterns of coevolution and diversification. Because molecular sequence data are not informative on absolute time, external data—most commonly, fossil age estimates—are required to calibrate estimates of species divergence dates. For Bayesian divergence time methods, the common practice for calibration using fossil information involves placing arbitrarily chosen parametric distributions on internal nodes, often disregarding most of the information in the fossil record. We introduce the “fossilized birth–death” (FBD) process—a model for calibrating divergence time estimates in a Bayesian framework, explicitly acknowledging that extant species and fossils are part of the same macroevolutionary process. Under this model, absolute node age estimates are calibrated by a single diversification model and arbitrary calibration densities are not necessary. Moreover, the FBD model allows for inclusion of all available fossils. We performed analyses of simulated data and show that node age estimation under the FBD model results in robust and accurate estimates of species divergence times with realistic measures of statistical uncertainty, overcoming major limitations of standard divergence time estimation methods. We used this model to estimate the speciation times for a dataset composed of all living bears, indicating that the genus Ursus diversified in the Late Miocene to Middle Pliocene. PMID:25009181
Integrating TITAN2D Geophysical Mass Flow Model with GIS
NASA Astrophysics Data System (ADS)
Namikawa, L. M.; Renschler, C.
2005-12-01
TITAN2D simulates geophysical mass flows over natural terrain using depth-averaged granular flow models and requires spatially distributed parameter values to solve differential equations. Since a Geographical Information System's (GIS) main task is the integration and manipulation of data covering a geographic region, using a GIS to implement simulations of complex, physically-based models such as TITAN2D seems a natural choice. However, simulation of geophysical flows requires computationally intensive operations that need unique optimizations, such as adaptive grids and parallel processing. Thus a GIS developed for general use cannot provide an effective environment for complex simulations, and the solution is to develop a linkage between the GIS and the simulation model. The present work presents the solution used for TITAN2D, where the data structure of a GIS is accessed by the simulation code through an Application Program Interface (API). GRASS is an open-source GIS with published data formats, so the GRASS data structure was selected. TITAN2D requires elevation, slope, curvature, and base material information at every cell to be computed. Results from the simulation are visualized by a system developed to handle the large amount of output data and to support a realistic dynamic 3-D display of flow dynamics, which requires elevation and texture, usually from a remote sensor image. Data required by the simulation are in raster format, using regular rectangular grids. The GRASS format for regular grids is based on a data file (a binary file storing data either uncompressed or compressed by grid row), a header file (a text file with information about georeferencing, data extents, and grid cell resolution), and support files (text files with information about the color table and category names). The implemented API provides access to the original data (elevation, base material, and texture from imagery) and to slope and curvature derived from the elevation data.
Of several existing methods for estimating slope and curvature from elevation, the selected one is a third-order finite difference method, which has been shown to perform better than, or with minimal difference from, more computationally expensive methods. Derivatives are estimated using a weighted sum of the 8 grid neighbor values. The method was implemented, and simulation results were compared to derivatives estimated by a simplified version of the method (using only 4 neighbor cells); the full method was shown to perform better. TITAN2D uses an adaptive mesh, where resolution (grid cell size) is not constant, and the visualization tools also use textures with varying resolutions for efficient display. The API supports different resolutions, applying bilinear interpolation when elevation, slope, and curvature are required at a resolution higher (smaller cell size) than the original, and using a nearest-cell approach for elevations at a resolution lower (larger cell size) than the original. For material information the nearest neighbor method is used, since interpolation on categorical data has no meaning. The low-fidelity character of visualization allows use of the nearest neighbor method for texture. Bilinear interpolation estimates the value at a point as the distance-weighted average of values at the closest four cell centers, and its performance is only slightly inferior to more computationally expensive methods such as bicubic interpolation and kriging.
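The third-order finite difference on 8 neighbours described above corresponds to Horn's (1981) scheme, a common choice in raster GIS. A minimal sketch for a single 3x3 elevation window (row 0 at the top, `cell` the grid resolution); the window layout is an assumption about orientation, not taken from the paper:

```python
import math

def slope_degrees(z, cell):
    # z is a 3x3 window of elevations; a weighted sum of the 8 neighbours
    # (third-order finite difference) gives the partial derivatives
    dzdx = ((z[0][2] + 2 * z[1][2] + z[2][2])
            - (z[0][0] + 2 * z[1][0] + z[2][0])) / (8.0 * cell)
    dzdy = ((z[2][0] + 2 * z[2][1] + z[2][2])
            - (z[0][0] + 2 * z[0][1] + z[0][2])) / (8.0 * cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))
```

On a plane rising one unit of elevation per cell in x, this returns exactly 45 degrees, and the centre cell never enters the computation.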
Impact of uncertainty in soil, climatic, and chemical information in a pesticide leaching assessment
NASA Astrophysics Data System (ADS)
Loague, Keith; Green, Richard E.; Giambelluca, Thomas W.; Liang, Tony C.; Yost, Russell S.
1990-01-01
A simple mobility index, when combined with a geographic information system, can be used to generate rating maps which indicate qualitatively the potential for various organic chemicals to leach to groundwater. In this paper we investigate the magnitude of uncertainty associated with pesticide mobility estimates as a result of data uncertainties. Our example is for the Pearl Harbor Basin, Oahu, Hawaii. The two pesticides included in our analysis are atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and diuron [3-(3,4-dichlorophenyl)-1,1-dimethylurea]. The mobility index used here is known as the Attenuation Factor (AF); it requires soil, hydrogeologic, climatic, and chemical information as input data. We employ first-order uncertainty analysis to characterize the uncertainty in estimates of AF resulting from uncertainties in the various input data. Soils in the Pearl Harbor Basin are delineated at the order taxonomic category for this study. Our results show that there can be a significant amount of uncertainty in estimates of pesticide mobility for the Pearl Harbor Basin. This information needs to be considered if future decisions concerning chemical regulation are to be based on estimates of pesticide mobility determined from simple indices.
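The AF index itself is simple enough to compute directly. A sketch of one common form of the index (after Rao and colleagues); treat the exact formula and the parameter values as assumptions, since the abstract does not reproduce them:

```python
import math

def attenuation_factor(depth_m, recharge_m_day, theta_fc, bulk_density,
                       f_oc, k_oc, half_life_days):
    # Retardation factor: sorption to soil organic carbon slows the
    # pesticide relative to the percolating water.
    rf = 1.0 + (bulk_density * f_oc * k_oc) / theta_fc
    # Travel time to the given depth, then first-order decay en route.
    travel_days = depth_m * rf * theta_fc / recharge_m_day
    return math.exp(-0.693 * travel_days / half_life_days)
```

With no sorption (k_oc = 0) and a travel time equal to one half-life, AF is 0.5: half the applied mass is predicted to reach the given depth. The first-order uncertainty analysis in the paper propagates the variances of exactly these inputs through such an expression.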
Arsenault, Joanne E; Brown, Kenneth H
2017-05-01
Background: Previous research indicates that young children in low-income countries (LICs) generally consume greater amounts of protein than published estimates of protein requirements, but this research did not account for protein quality based on the mix of amino acids and the digestibility of ingested protein. Objective: Our objective was to estimate the prevalence of inadequate protein and amino acid intake by young children in LICs, accounting for protein quality. Methods: Seven data sets with information on dietary intake for children (6-35 mo of age) from 6 LICs (Peru, Guatemala, Ecuador, Bangladesh, Uganda, and Zambia) were reanalyzed to estimate protein and amino acid intake and assess adequacy. The protein digestibility-corrected amino acid score of each child's diet was calculated and multiplied by the original (crude) protein intake to obtain an estimate of available protein intake. Distributions of usual intake were obtained to estimate the prevalence of inadequate protein and amino acid intake for each cohort according to Estimated Average Requirements. Results: The prevalence of inadequate protein intake was highest in breastfeeding children aged 6-8 mo: 24% of Bangladeshi and 16% of Peruvian children. With the exception of Bangladesh, the prevalence of inadequate available protein intake decreased by age 9-12 mo and was very low in all sites (0-2%) after 12 mo of age. Inadequate protein intake in children <12 mo of age was due primarily to low energy intake from complementary foods, not inadequate protein density. Conclusions: Overall, most children consumed protein amounts greater than requirements, except for the younger breastfeeding children, who were consuming low amounts of complementary foods. These findings reinforce previous evidence that dietary protein is not generally limiting for children in LICs compared with estimated requirements for healthy children, even after accounting for protein quality. 
However, unmeasured effects of infection and intestinal dysfunction on the children's protein requirements could modify this conclusion.
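The core calculation in this abstract (scaling crude protein by the protein digestibility-corrected amino acid score, then applying the EAR cut-point) can be sketched as follows. The PDCAAS values, intakes, and EAR below are illustrative numbers, not the study's data:

```python
def available_protein(crude_g, pdcaas):
    # PDCAAS in [0, 1]: fraction of crude protein usable after accounting
    # for the diet's amino acid score and digestibility
    return crude_g * pdcaas

def prevalence_inadequate(intakes_g, ear_g):
    # EAR cut-point method: share of children whose usual available
    # intake falls below the Estimated Average Requirement
    return sum(1 for x in intakes_g if x < ear_g) / len(intakes_g)

# hypothetical (crude protein g/d, diet PDCAAS) pairs for four children
diets = [(12.0, 0.85), (8.0, 0.70), (15.0, 0.90), (5.0, 0.65)]
intakes = [available_protein(c, q) for c, q in diets]
share = prevalence_inadequate(intakes, ear_g=9.0)  # 2 of 4 fall below 9 g/d
```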
NWS Operational Requirements for Ensemble-Based Hydrologic Forecasts
NASA Astrophysics Data System (ADS)
Hartman, R. K.
2008-12-01
Ensemble-based hydrologic forecasts have been developed and issued by National Weather Service (NWS) staff at River Forecast Centers (RFCs) for many years. Used principally for long-range water supply forecasts, they have traditionally considered only the uncertainty associated with weather and climate. As technology and the societal expectations of resource managers increase, the use of and desire for risk-based decision support tools has also increased. These tools require forecast information that includes reliable uncertainty estimates across all time and space domains. The development of reliable uncertainty estimates associated with hydrologic forecasts is being actively pursued within the United States and internationally. This presentation will describe the challenges, components, and requirements for operational hydrologic ensemble-based forecasts from the perspective of a NOAA/NWS River Forecast Center.
Estimating Local Chlamydia Incidence and Prevalence Using Surveillance Data
White, Peter J.
2017-01-01
Background: Understanding patterns of chlamydia prevalence is important for addressing inequalities and planning cost-effective control programs. Population-based surveys are costly; the best data for England come from the Natsal national surveys, which are only available once per decade, and are nationally representative but not powered to compare prevalence in different localities. Prevalence estimates at finer spatial and temporal scales are required. Methods: We present a method for estimating local prevalence by modeling the infection, testing, and treatment processes. Prior probability distributions for parameters describing natural history and treatment-seeking behavior are informed by the literature or calibrated using national prevalence estimates. By combining them with surveillance data on numbers of chlamydia tests and diagnoses, we obtain estimates of local screening rates, incidence, and prevalence. We illustrate the method by application to data from England. Results: Our estimates of national prevalence by age group agree with the Natsal-3 survey. They could be improved by additional information on the number of diagnosed cases that were asymptomatic. There is substantial local-level variation in prevalence, with more infection in deprived areas. Incidence in each sex is strongly correlated with prevalence in the other. Importantly, we find that positivity (the proportion of tests which were positive) does not provide a reliable proxy for prevalence. Conclusion: This approach provides local chlamydia prevalence estimates from surveillance data, which could inform analyses to identify and understand local prevalence patterns and assess local programs. Estimates could be more accurate if surveillance systems recorded additional information, including on symptoms. See video abstract at http://links.lww.com/EDE/B211. PMID:28306613
77 FR 70493 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... of certain Canadian issuers under the Securities Act of 1933 (15 U.S.C. 77a et seq.) that will be... securities law requirements and assures the public availability of such information. We estimate that Form F...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-09
... traffic behaviors and design interventions to reduce speeding and other hazardous traffic actions. Some of... would be voluntary and anonymous. Estimated Total Annual Burden: 2,005 hours (15 pretest interviews...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-02
... traffic behaviors and design interventions to reduce speeding and other hazardous traffic actions. Some of... voluntary and anonymous. Estimated Total Annual Burden: 2,005 hours (15 pretest interviews averaging 20...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... requirements for the interstate movement of regulated articles, such as nursery stock and certain trees, from... hours per response. Respondents: Nurseries in California, Oregon, and Washington. Estimated annual...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... importation of plants from Israel, except bulbs, dormant perennials, and seeds. These requirements involve the... plant protection organizations of Spain and Israel. Estimated annual number of respondents: 60...
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
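The rollup step the abstract describes (combining peptide-level intensities into one protein-level profile) can be illustrated with a simple median-based sketch. This is not the paper's censoring-aware statistical model, only the basic idea of removing peptide-specific scale before aggregating:

```python
from statistics import median

def rollup(peptides):
    # peptides: rows = peptides, columns = samples (log-intensities).
    # Center each peptide at its own median across samples, which removes
    # peptide-specific ionization efficiency, then take the median over
    # peptides within each sample as the protein-level profile.
    n_samples = len(peptides[0])
    centered = [[x - median(p) for x in p] for p in peptides]
    return [median(row[j] for row in centered) for j in range(n_samples)]

# two peptides of the same protein, offset in overall intensity
profile = rollup([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
```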
Quantum metrology and estimation of Unruh effect
Wang, Jieci; Tian, Zehua; Jing, Jiliang; Fan, Heng
2014-01-01
We study quantum metrology for a pair of entangled Unruh-DeWitt detectors when one of them is accelerated and coupled to a massless scalar field. Compared with previous schemes, our model requires only local interaction and avoids the use of cavities in the probe-state preparation process. We show that the probe-state preparation and the interaction between the accelerated detector and the external field have significant effects on the value of the quantum Fisher information and, correspondingly, set the ultimate limit of precision in the estimation of the Unruh effect. We find that the precision of the estimation can be improved by a larger effective coupling strength and a longer interaction time. In addition, there is a range of detector energy gaps that provides better precision, so these parameters can be adjusted to attain higher precision in the estimation. We also find that an extremely high acceleration is not required in the quantum metrology process. PMID:25424772
Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle
NASA Technical Reports Server (NTRS)
Thienel, Julie K.; Sanner, Robert M.
2006-01-01
Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.
NASA Astrophysics Data System (ADS)
Cho, N.; Lee, M. J.; Maeng, J. H.
2017-12-01
Environmental impact assessment estimates the impact of a development at the level of an individual project and establishes a mitigation plan. Once a development is completed, its economic effects can spread to nearby areas, so further developments may follow at different times. The impacts of new developments combine with existing environmental impacts and can produce a larger cumulative impact. Cumulative impact assessment is therefore needed to account for the environmental capacity of the surrounding area, and it requires policy tools such as environmental impact assessment information and cumulative impact estimation models. In Korea, environmental information (water quality, air quality, etc.) at a development site is measured for environmental impact assessment and monitored for a certain period (generally 5 years) after the project. In addition, by organizing this environmental information into a spatial database, environmental impacts can be expressed spatially on a regional basis and used intuitively for development site selection. A composite model of environmental impact assessment information and remote sensing data for cumulative impact estimation can serve as a decision support tool that provides quantitative information for managing development areas, such as time-series effects and urban sprawl.
Chaudhuri, Shomesh E; Merfeld, Daniel M
2013-03-01
Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
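The fitting problem described above (a cumulative Gaussian psychometric function fit by maximum likelihood to adaptive-staircase data) can be sketched with a plain binomial likelihood. This is the standard ML fit, not the paper's bias-reduced variant, and the grid search stands in for a proper optimizer such as Nelder-Mead; the data points are illustrative:

```python
import math

def cgauss(x, mu, sigma):
    # cumulative Gaussian psychometric function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def nll(data, mu, sigma):
    # binomial negative log-likelihood; data = (stimulus, n_correct, n_trials)
    total = 0.0
    for x, k, n in data:
        p = min(max(cgauss(x, mu, sigma), 1e-9), 1.0 - 1e-9)
        total -= k * math.log(p) + (n - k) * math.log(1.0 - p)
    return total

def fit(data):
    # crude grid search over (mu, sigma) in place of a numeric optimizer
    best = (float("inf"), None, None)
    for mu in (i * 0.05 for i in range(-40, 41)):
        for sigma in (i * 0.05 for i in range(1, 81)):
            v = nll(data, mu, sigma)
            if v < best[0]:
                best = (v, mu, sigma)
    return best[1], best[2]

# response proportions consistent with mu = 0, sigma = 1
data = [(-1.0, 16, 100), (0.0, 50, 100), (1.0, 84, 100)]
mu_hat, sigma_hat = fit(data)
```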
NASA Astrophysics Data System (ADS)
Swenson, S. C.; Lawrence, D. M.
2017-12-01
Partitioning the vertically integrated water storage variations estimated from GRACE satellite data into the components of which it is comprised requires independent information. Land surface models, which simulate the transfer and storage of moisture and energy at the land surface, are often used to estimate water storage variability of snow, surface water, and soil moisture. To obtain an estimate of changes in groundwater, the estimates of these storage components are removed from GRACE data. Biases in the modeled water storage components are therefore present in the residual groundwater estimate. In this study, we examine how soil moisture variability, estimated using the Community Land Model (CLM), depends on the vertical structure of the model. We then explore the implications of this uncertainty in the context of estimating groundwater variations using GRACE data.
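The residual approach described above reduces to simple arithmetic per grid cell and month; the point of the study is that biases in the modeled terms land in the residual. A minimal sketch with illustrative numbers (cm of equivalent water height, not GRACE or CLM output):

```python
def groundwater_anomaly(tws, snow, soil_moisture, surface_water):
    # residual approach: subtract modeled storage components (e.g. from CLM)
    # from the GRACE total water storage anomaly; any model bias in the
    # subtracted terms propagates directly into this residual
    return tws - snow - soil_moisture - surface_water

# illustrative monthly anomalies: (GRACE TWS, snow, soil moisture, surface water)
months = [(5.0, 1.0, 2.5, 0.5), (3.0, 0.5, 2.0, 0.5)]
gw = [groundwater_anomaly(*m) for m in months]
```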
Information flow in the DAMA project beyond database managers: information flow managers
NASA Astrophysics Data System (ADS)
Russell, Lucian; Wolfson, Ouri; Yu, Clement
1996-12-01
To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
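The bill-of-materials explosion mentioned above propagates finished-goods demand down the supply chain level by level. A minimal sketch, with a hypothetical sewn-products chain (item names and quantity factors are invented, not DAMA data):

```python
def explode(demand, bom):
    # bom maps item -> list of (component, quantity per parent unit);
    # walk the tree, accumulating gross requirements at every level
    req = {}
    def walk(item, qty):
        req[item] = req.get(item, 0) + qty
        for comp, per in bom.get(item, []):
            walk(comp, qty * per)
    for item, qty in demand.items():
        walk(item, qty)
    return req

# hypothetical 4-level chain: shirt -> panel -> fabric -> yarn
bom = {"shirt": [("panel", 4)],
       "panel": [("fabric_m2", 0.5)],
       "fabric_m2": [("yarn_kg", 0.2)]}
req = explode({"shirt": 100}, bom)
```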
NASA Technical Reports Server (NTRS)
Khorram, S.
1977-01-01
Results are presented of a study intended to develop a general location-specific remote-sensing procedure for watershed-wide estimation of water loss to the atmosphere by evaporation and transpiration. The general approach involves a stepwise sequence of required information definition (input data), appropriate sample design, mathematical modeling, and evaluation of results. More specifically, the remote sensing-aided system developed to evaluate evapotranspiration employs a basic two-stage two-phase sample of three information resolution levels. Based on the discussed design, documentation, and feasibility analysis to yield timely, relatively accurate, and cost-effective evapotranspiration estimates on a watershed or subwatershed basis, work is now proceeding to implement this remote sensing-aided system.
Limitations and opportunities for the social cost of carbon (Invited)
NASA Astrophysics Data System (ADS)
Rose, S. K.
2010-12-01
Estimates of the marginal value of carbon dioxide-the social cost of carbon (SCC)-were recently adopted by the U.S. Government in order to satisfy requirements to value estimated GHG changes of new federal regulations. However, the development and use of SCC estimates of avoided climate change impacts comes with significant challenges and controversial decisions. Fortunately, economics can provide some guidance for conceptually appropriate estimates. At the same time, economics defaults to a benefit-cost decision framework to identify socially optimal policies. However, not all current policy decisions are benefit-cost based, nor depend on monetized information, or even have the same threshold for information. While a conceptually appropriate SCC is a useful metric, how far can we take it? This talk discusses potential applications of the SCC, limitations based on the state of research and methods, as well as opportunities for among other things consistency with climate risk management and research and decision-making tools.
The value of volume and growth measurements in timber sales management of the National Forests
NASA Technical Reports Server (NTRS)
Lietzke, K. R.
1977-01-01
This paper summarizes work performed in the estimation of gross social value of timber volume and growth rate information used in making regional harvest decisions in the National Forest System. A model was developed to permit parametric analysis. The problem is formulated as one of finding optimal inventory holding patterns. Public timber management differs from other inventory holding problems in that the inventory, itself, generates value over time in providing recreational, aesthetic and environmental goods. 'Nontimber' demand estimates are inferred from past Forest Service harvest and sales levels. The solution requires a description of the harvest rates which maintain the optimum inventory level. Gross benefits of the Landsat systems are estimated by comparison with Forest Service information gathering models. Gross annual benefits are estimated to be $5.9 million for the MSS system and $7.2 million for the TM system.
Estimation of uncertainty in tracer gas measurement of air change rates.
Iizuka, Atsushi; Okuizumi, Yumiko; Yanagisawa, Yukio
2010-12-01
Simple and economical measurement of air change rates can be achieved with a passive-type tracer gas doser and sampler. However, the measurement is complicated by the fact that many buildings are not a single, fully mixed zone, which means many measurements are required to obtain information on ventilation conditions. In this study, we evaluated the uncertainty of tracer gas measurement of the air change rate in n completely mixed zones. A single measurement with one tracer gas could be used to simply estimate the air change rate when n = 2, but accurate air change rates could not be obtained for n ≥ 2 due to a lack of information. However, the proposed method can be used to estimate an air change rate with an accuracy of <33%, and overestimation of the air change rate can thereby be avoided. The proposed estimation method will be useful in practical ventilation measurements.
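For the single fully mixed zone that the passive doser-and-sampler technique assumes, the underlying steady-state mass balance is straightforward. A minimal sketch with illustrative values (not the study's measurements):

```python
def air_change_rate(dose_rate_mg_h, concentration_mg_m3, volume_m3):
    # steady-state single-zone mass balance: emitted tracer q is removed
    # by ventilation, q = C * Q, so airflow Q = q / C and ACH = Q / V
    airflow_m3_h = dose_rate_mg_h / concentration_mg_m3
    return airflow_m3_h / volume_m3

# illustrative: 50 mg/h dosed, 0.5 mg/m3 measured, 50 m3 room -> 2 ACH
ach = air_change_rate(dose_rate_mg_h=50.0, concentration_mg_m3=0.5, volume_m3=50.0)
```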
Data Sources for Estimating Environment-Related Diseases
Walker, Bailus
1984-01-01
Relating current morbidity and mortality to environmental and occupational factors requires information on parameters of environmental exposure for practitioners of medicine and other health scientists. A fundamental source of that information is the exposure history recorded in hospitals, clinics, and other points of entry to the health care system. The qualitative and quantitative aspects of this issue are reviewed. PMID:6716500
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
... BOEMRE decision. $7,500 processing fee. 106(b), 109 Request waiver or reduction 1 4. of fee. 104(b), 107... BOEMRE decision. 110 Submit required information for BOEMRE to make a decision. 114, 115(a) Submit appeal on BOEMRE final decision. Estimated Annual Reporting and Recordkeeping Non-Hour Cost Burden: We have...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
... per day (MGD) or more. Section 316(b) of the Clean Water Act (CWA) requires that any standard... of the rule is estimated to average 1,101 hours per respondent (i.e., an annual average of 46,228 hours of burden divided among an anticipated 42 States on average per year). Burden means the total time...
Production and use of estimates for monitoring progress in the health sector: the case of Bangladesh
Ahsan, Karar Zunaid; Tahsina, Tazeen; Iqbal, Afrin; Ali, Nazia Binte; Chowdhury, Suman Kanti; Huda, Tanvir M.; Arifeen, Shams El
2017-01-01
ABSTRACT Background: In order to support the progress towards the post-2015 development agenda for the health sector, the importance of high-quality and timely estimates has become evident both globally and at the country level. Objective and Methods: Based on desk review, key informant interviews and expert panel discussions, the paper critically reviews health estimates from both the local (i.e. nationally generated information by the government and other agencies) and the global sources (which are mostly modeled or interpolated estimates developed by international organizations based on different sources of information), and assesses the country capacity and monitoring strategies to meet the increasing data demand in the coming years. Primarily, this paper provides a situation analysis of Bangladesh in terms of production and use of health estimates for monitoring progress towards the post-2015 development goals for the health sector. Results: The analysis reveals that Bangladesh is data rich, particularly from household surveys and health facility assessments. Practices of data utilization also exist, with wide acceptability of survey results for informing policy, programme review and course corrections. Despite high data availability from multiple sources, the country capacity for providing regular updates of major global health estimates/indicators remains low. Major challenges also include limited human resources, capacity to generate quality data and multiplicity of data sources, where discrepancy and lack of linkages among different data sources (local sources and between local and global estimates) present emerging challenges for interpretation of the resulting estimates. Conclusion: To fulfill the increased data requirement for the post-2015 era, Bangladesh needs to invest more in electronic data capture and routine health information systems. 
Streamlining of data sources, integration of parallel information systems into a common platform, and capacity building for data generation and analysis are recommended as priority actions for Bangladesh in the coming years. In addition to automation of routine health information systems, establishing an Indicator Reference Group for Bangladesh to analyze data; building country capacity in data quality assessment and triangulation; and feeding into global, inter-agency estimates for better reporting would address a number of mentioned challenges in the short- and long-run. PMID:28532305
Estimating the Burden of Osteoarthritis to Plan for the Future.
Marshall, Deborah A; Vanderby, Sonia; Barnabe, Cheryl; MacDonald, Karen V; Maxwell, Colleen; Mosher, Dianne; Wasylak, Tracy; Lix, Lisa; Enns, Ed; Frank, Cy; Noseworthy, Tom
2015-10-01
With aging and obesity trends, the incidence and prevalence of osteoarthritis (OA) is expected to rise in Canada, increasing the demand for health resources. Resource planning to meet this increasing need requires estimates of the anticipated number of OA patients. Using administrative data from Alberta, we estimated OA incidence and prevalence rates and examined their sensitivity to alternative case definitions. We identified cases in a linked data set spanning 1993 to 2010 (population registry, Discharge Abstract Database, physician claims, Ambulatory Care Classification System, and prescription drug data) using diagnostic codes and drug identification numbers. In the base case, incident cases were captured for patients with an OA diagnostic code for at least 2 physician visits within 2 years or any hospital admission. Seven alternative case definitions were applied and compared. Age- and sex-standardized incidence and prevalence rates were estimated to be 8.6 and 80.3 cases per 1,000 population, respectively, in the base case. Physician claims data alone captured 88% of OA cases. Prevalence rate estimates required 15 years of longitudinal data to plateau. Compared to the base case, estimates are sensitive to alternative case definitions. Administrative databases are a key source for estimating the burden and epidemiologic trends of chronic diseases such as OA in Canada. Despite their limitations, these data provide valuable information for estimating disease burden and planning health services. Estimates of OA are mostly defined through physician claims data and require a long period of longitudinal data. © 2015, American College of Rheumatology.
Albin, Thomas J
2017-07-01
Occasionally practitioners must work with single dimensions defined as combinations (sums or differences) of percentile values, but lack information (e.g. variances) to estimate the accommodation achieved. This paper describes methods to predict accommodation proportions for such combinations of percentile values, e.g. two 90th percentile values. Kreifeldt and Nah z-score multipliers were used to estimate the proportions accommodated by combinations of percentile values of 2-15 variables; two simplified versions required less information about variance and/or correlation. The estimates were compared to actual observed proportions; for combinations of 2-15 percentile values the average absolute differences ranged between 0.5 and 1.5 percentage points. The multipliers were also used to estimate adjusted percentile values, that, when combined, estimate a desired proportion of the combined measurements. For combinations of two and three adjusted variables, the average absolute difference between predicted and observed proportions ranged between 0.5 and 3.0 percentage points. Copyright © 2017 Elsevier Ltd. All rights reserved.
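The key fact behind such multipliers is that a sum of two percentile values does not accommodate the percentile one might expect, because the sum of two normal variables has a smaller relative spread. The sketch below shows this underlying normal-sum calculation, not the Kreifeldt and Nah multipliers themselves; the variances and correlation are illustrative:

```python
import math

def phi(z):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def accommodated(z_values, sigmas, rho=0.0):
    # proportion of a bivariate-normal population whose SUM of two
    # dimensions falls at or below the sum of the chosen percentile
    # values; z_values are the percentiles' z-scores (1.2816 for P90)
    num = sum(z * s for z, s in zip(z_values, sigmas))
    s1, s2 = sigmas
    den = math.sqrt(s1 ** 2 + s2 ** 2 + 2.0 * rho * s1 * s2)
    return phi(num / den)

# two independent 90th-percentile values with equal variances:
# their sum accommodates about 96.5% of the population, not 90%
p = accommodated([1.2816, 1.2816], [1.0, 1.0], rho=0.0)
```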
An estimation framework for building information modeling (BIM)-based demolition waste by type.
Kim, Young-Chan; Hong, Won-Hwa; Park, Jae-Woo; Cha, Gi-Wook
2017-12-01
Most existing studies on demolition waste (DW) quantification do not have an official standard to estimate the amount and type of DW. Therefore, there are limitations in the existing literature for estimating DW with a consistent classification system. Building information modeling (BIM) is a technology that can generate and manage all the information required during the life cycle of a building, from design to demolition. Nevertheless, there has been a lack of research regarding its application to the demolition stage of a building. For an effective waste management plan, the estimation of the type and volume of DW should begin from the building design stage. However, the lack of tools hinders an early estimation. This study proposes a BIM-based framework that estimates DW in the early design stages, to achieve effective and streamlined planning, processing, and management. Specifically, the input of construction materials in the Korean construction classification system and those in the BIM library were matched. Based on this matching integration, the estimates of DW by type were calculated by applying the weight/unit volume factors and the rates of DW volume change. To verify the framework, its operation was demonstrated by means of an actual BIM model and by comparing its results with those available in the literature. This study is expected to contribute not only to the estimation of DW at the building level, but also to the automated estimation of DW at the district level.
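The per-material calculation in this framework (BIM quantity take-off volume, a weight/unit-volume factor, and a volume-change rate) can be sketched as below. The factors shown are illustrative, not the Korean classification system's values:

```python
def demolition_waste(volumes_m3, unit_weight_t_m3, swell_factor):
    # per material: take-off volume x unit weight gives waste mass;
    # the volume-change (swell) rate gives bulked volume after demolition
    out = {}
    for mat, vol in volumes_m3.items():
        out[mat] = {"mass_t": vol * unit_weight_t_m3[mat],
                    "bulked_volume_m3": vol * swell_factor[mat]}
    return out

# illustrative: 100 m3 of concrete at 2.4 t/m3, 1.4 swell after demolition
dw = demolition_waste({"concrete": 100.0}, {"concrete": 2.4}, {"concrete": 1.4})
```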
NASA Astrophysics Data System (ADS)
Pham, T. D.
2016-12-01
Recurrence plots display binary texture of time series from dynamical systems with single dots and line structures. Using fuzzy recurrence plots, recurrences of the phase-space states can be visualized as grayscale texture, which is more informative for pattern analysis. The proposed method replaces the crucial similarity threshold required by symmetrical recurrence plots with the number of cluster centers, where the estimate of the latter parameter is less critical than the estimate of the former.
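A minimal sketch of the grayscale idea: assign each state a fuzzy c-means membership to a set of cluster centers, then form the recurrence matrix by max-min composition through the clusters. For brevity this uses scalar states (no delay embedding) and fixed centers rather than iterating full fuzzy c-means; note how the number of centers replaces the similarity threshold:

```python
def membership(x, centers, m=2.0):
    # fuzzy c-means membership of state x to each center (fuzzifier m)
    d = [abs(x - c) + 1e-12 for c in centers]
    return [1.0 / sum((d[k] / d[j]) ** (2.0 / (m - 1.0)) for j in range(len(centers)))
            for k in range(len(centers))]

def fuzzy_recurrence(states, centers):
    # grayscale "recurrence" between states i and j: max-min fuzzy
    # composition of their memberships through the cluster centers
    mu = [membership(x, centers) for x in states]
    n = len(states)
    return [[max(min(mu[i][k], mu[j][k]) for k in range(len(centers)))
             for j in range(n)] for i in range(n)]

series = [0.1, 0.9, 0.2, 0.8, 0.15]
R = fuzzy_recurrence(series, centers=[0.15, 0.85])
```

States near the same center (0.1 and 0.15) get a value near 1; states near different centers (0.1 and 0.9) get a value near 0, giving grayscale texture instead of binary dots.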
Egomotion Estimation with Optic Flow and Air Velocity Sensors
2012-09-17
This report is published in the interest of scientific and technical information exchange, and its publication...flight height is known. Franz et al. (2004) have developed a method of distance and groundspeed estimation using an omnidirectional camera, but knowledge ...method we have described works in both constant and varying wind and even over sloped terrain. Our method also does not require any prior knowledge of
NASA Astrophysics Data System (ADS)
Debski, Wojciech
2015-06-01
The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology and oceanology. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. Although estimating the location of earthquake foci is relatively simple, quantitatively estimating the location accuracy is a challenging task even if the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this paper, we address this task for the case in which the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we propose an approach based on an analysis of the Shannon entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
The issues of current rainfall estimation techniques in mountain natural multi-hazard investigation
NASA Astrophysics Data System (ADS)
Zhuo, Lu; Han, Dawei; Chen, Ningsheng; Wang, Tao
2017-04-01
Mountain hazards (e.g., landslides, debris flows, and floods) induced by rainfall are complex phenomena that require good knowledge of rainfall representation at different spatiotemporal scales. This study reveals that rainfall estimation from gauges is rather unrepresentative over a large spatial area in mountain regions. As a result, the conventional practice of adopting a triggering threshold for hazard early warning purposes is insufficient, mainly because of the huge orographic influence on rainfall distribution. Modern rainfall estimation methods, such as numerical weather prediction modelling and remote sensing using radar from space or on land, can provide spatially more representative rainfall information in mountain areas. But unlike rain gauges, they provide rainfall measurements only indirectly. Remote sensing suffers from many sources of errors, such as weather conditions, attenuation, and sampling methods, while numerical weather prediction models suffer from spatiotemporal and amplitude errors depending on the model physics, dynamics, and model configuration. A case study based on Sichuan, China is used to illustrate the significant difference among the three aforementioned rainfall estimation methods. We argue that none of these methods can be relied on individually, and the challenge is how to use the three methods conjunctively, because each provides only partial information. We propose that a data fusion approach should be adopted based on the Bayesian inference method. However, such an approach requires uncertainty information from all of the estimation techniques, which still needs extensive research. We hope this study will raise awareness of this important issue and highlight the knowledge gap that must be filled so that such a challenging problem can be tackled collectively by the community.
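The Bayesian fusion the authors propose can be illustrated in its simplest Gaussian form: independent estimates with known variances combine by precision weighting, so the better-characterized source dominates. The numbers below are illustrative, not Sichuan data:

```python
def fuse(estimates):
    # precision-weighted fusion of independent Gaussian (mean, variance)
    # estimates, e.g. from gauge, radar, and NWP rainfall sources;
    # the fused variance is smaller than any single source's variance
    weights = [1.0 / var for _, var in estimates]
    mean = sum(m * w for (m, _), w in zip(estimates, weights)) / sum(weights)
    return mean, 1.0 / sum(weights)

# illustrative: a trusted gauge (variance 1) and a noisier radar (variance 4)
mean, var = fuse([(10.0, 1.0), (14.0, 4.0)])
```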
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... vehicles and items of replacement equipment to conduct a notification and remedy campaign (recall) when... equipment. Estimated Total Annual Burden: 150 hours. ADDRESSES: Send comments, within 30 days, to the Office...
DOT National Transportation Integrated Search
1997-01-01
The success of Advanced Traveler Information Systems (ATIS) and Advanced Traffic Management Systems (ATMS) depends on the availability and dissemination of timely and accurate estimates of current and emerging traffic network conditions. Real-time Dy...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-06
....S. citizens who engage in a specified activity (other than commercial fishing), within a specified.... Estimated Total Annual Cost to Public: $360 in recordkeeping/ reporting costs and $0 in capital costs (if...
33 CFR Appendix B to Part 273 - Information Requirements for Aquatic Plant Control Program Reports
Code of Federal Regulations, 2011 CFR
2011-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE AQUATIC PLANT CONTROL Pt. 273, App. B Appendix B to... source of reinfestation; extent of infestation including estimated surface area, depth or density; nature...
33 CFR Appendix B to Part 273 - Information Requirements for Aquatic Plant Control Program Reports
Code of Federal Regulations, 2010 CFR
2010-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE AQUATIC PLANT CONTROL Pt. 273, App. B Appendix B to... source of reinfestation; extent of infestation including estimated surface area, depth or density; nature...
NASA Astrophysics Data System (ADS)
Luce, C.; Tonina, D.; Gariglio, F. P.; Applebee, R.
2012-12-01
Differences in the diurnal variations of temperature at different depths in streambed sediments are commonly used for estimating vertical fluxes of water in the streambed. We applied spatial and temporal rescaling of the advection-diffusion equation to derive two new relationships that greatly extend the kinds of information that can be derived from streambed temperature measurements. The first equation provides a direct estimate of the Peclet number from the amplitude decay and phase delay information. The analytical equation is explicit (i.e., no numerical root-finding is necessary) and invertible. The thermal front velocity can be estimated from the Peclet number when the thermal diffusivity is known. The second equation allows an independent estimate of the thermal diffusivity directly from the amplitude decay and phase delay information. Several improvements follow from the new information. The first equation uses a ratio of the amplitude decay and phase delay information; thus Peclet number calculations are independent of depth. The explicit form also makes it faster and easier to calculate estimates from a large number of sensors or multiple positions along one sensor. Where current practice requires a priori estimation of streambed thermal diffusivity, the new approach allows an independent calculation, improving the precision of estimates. Furthermore, when many measurements are made over space and time, expectations of the spatial correlation and temporal invariance of thermal diffusivity are valuable for validating measurements. Finally, the closed-form explicit solution allows direct calculation of how measurement errors and parameter uncertainties propagate, providing insight into the error expectations for sensors placed at different depths in different environments as a function of surface temperature variation amplitudes.
The improvements are expected to increase the utility of temperature measurement methods for studying groundwater-surface water interactions across space and time scales. We discuss the theoretical implications of the new solutions supported by examples with data for illustration and validation.
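The paper's explicit formulas are not reproduced in this abstract, but their inputs — the amplitude decay and phase delay of the diurnal signal between two depths — can be extracted from temperature records by projecting each series onto sinusoids at the diurnal frequency. This sketch uses synthetic series with known damping and delay; converting the resulting ratio and lag to a Peclet number would follow the equations described above.

```python
import numpy as np

def diurnal_amp_phase(temps, dt_hours, period_hours=24.0):
    """Amplitude and phase of the diurnal harmonic, by least-squares
    projection of the (demeaned) series onto cos/sin at that frequency."""
    t = np.arange(len(temps)) * dt_hours
    w = 2.0 * np.pi / period_hours
    x = np.asarray(temps, float) - np.mean(temps)
    a = 2.0 * np.mean(x * np.cos(w * t))
    b = 2.0 * np.mean(x * np.sin(w * t))
    return np.hypot(a, b), np.arctan2(b, a)

# Synthetic two-depth records: 2 days at 30-min sampling, the deeper
# signal damped (5 K -> 2 K) and delayed by 3 h.
t = np.arange(96) * 0.5
w = 2.0 * np.pi / 24.0
shallow = 10.0 + 5.0 * np.cos(w * t)
deep = 10.0 + 2.0 * np.cos(w * (t - 3.0))
amp_s, ph_s = diurnal_amp_phase(shallow, 0.5)
amp_d, ph_d = diurnal_amp_phase(deep, 0.5)
amp_ratio = amp_d / amp_s       # amplitude decay between depths
lag_hours = (ph_d - ph_s) / w   # phase delay between depths
```

With clean synthetic input, the known damping (0.4) and delay (3 h) are recovered exactly; field data would add noise that this projection averages over whole days.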
An Improved Aerial Target Localization Method with a Single Vector Sensor
Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin
2017-01-01
This paper focuses on problems encountered when processing real data with existing aerial target localization methods, analyzes their causes, and proposes an improved algorithm. Processing of sea-experiment data shows that the existing algorithms place high demands on the accuracy of the angle estimation. The improved algorithm relaxes these accuracy requirements and yields robust estimates. A closest-distance matching estimation algorithm and a horizontal-distance estimation compensation algorithm are proposed. Post-processing with a forward-and-backward two-direction double-filtering method improves smoothing and allows the initial-stage data to be filtered, so that the filtering results retain more useful information. The paper also studies height-measurement methods for aerial targets and presents estimation results, realizing three-dimensional localization of the aerial target and improving the underwater platform's awareness of it, which gives the platform better mobility and concealment.
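The forward-and-backward double-filtering idea can be sketched with a simple exponential smoother (the paper's actual filter is not specified in this abstract): running the filter a second time over the reversed output smooths the initial samples as well and cancels the phase lag of a single forward pass.

```python
def ema(xs, alpha):
    """Single-pass exponential moving average (introduces a phase lag)."""
    out, s = [], xs[0]
    for x in xs:
        s = alpha * x + (1.0 - alpha) * s
        out.append(s)
    return out

def forward_backward(xs, alpha=0.3):
    """Two-direction double filtering: forward pass, then a second pass
    over the reversed result, so early samples are smoothed too and the
    two passes' lags cancel."""
    return ema(ema(xs, alpha)[::-1], alpha)[::-1]

smoothed = forward_backward([1.0, 4.0, 2.0, 5.0, 3.0, 6.0])
```

A constant series passes through unchanged, while oscillating input is attenuated from both ends, which is the property the abstract highlights for the initial-stage data.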
Real-Time Parameter Estimation in the Frequency Domain
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2000-01-01
A method for real-time estimation of parameters in a linear dynamic state-space model was developed and studied. The application is aircraft dynamic model parameter estimation from measured data in flight. Equation error in the frequency domain was used with a recursive Fourier transform for the real-time data analysis. Linear and nonlinear simulation examples and flight test data from the F-18 High Alpha Research Vehicle were used to demonstrate that the technique produces accurate model parameter estimates with appropriate error bounds. Parameter estimates converged in less than one cycle of the dominant dynamic mode, using no a priori information, with control surface inputs measured in flight during ordinary piloted maneuvers. The real-time parameter estimation method has low computational requirements and could be implemented
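The recursive Fourier transform at the heart of the method can be sketched as a running discrete Fourier sum at a fixed set of analysis frequencies: each new measurement updates the transform in O(number of frequencies), which is what keeps the computational requirements low. The frequencies, sampling, and rectangle-rule scaling here are illustrative, not Morelli's exact formulation.

```python
import numpy as np

class RecursiveFourier:
    """Running finite Fourier transform at fixed analysis frequencies,
    updated one sample at a time for real-time use."""
    def __init__(self, freqs_hz, dt):
        self.w = 2.0j * np.pi * np.asarray(freqs_hz, float)
        self.dt = dt
        self.n = 0
        self.X = np.zeros(len(freqs_hz), complex)

    def update(self, sample):
        t = self.n * self.dt
        self.X += sample * np.exp(-self.w * t) * self.dt  # rectangle rule
        self.n += 1
        return self.X

# One second of a 1 Hz sine sampled at 100 Hz: |X(1 Hz)| -> T/2 = 0.5
rf = RecursiveFourier([1.0], dt=0.01)
for k in range(100):
    X = rf.update(np.sin(2.0 * np.pi * 1.0 * k * 0.01))
```

In the equation-error approach, transforms like this of the measured states and inputs feed a least-squares fit of the model parameters at each time step.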
NASA Astrophysics Data System (ADS)
Sadeghipour, N.; Davis, S. C.; Tichauer, K. M.
2017-01-01
New precision medicine drugs often act through binding to specific cell-surface cancer receptors, and thus their efficacy is highly dependent on the availability of those receptors and the receptor concentration per cell. Paired-agent molecular imaging can provide quantitative information on receptor status in vivo, especially in tumor tissue; however, to date, published approaches to paired-agent quantitative imaging require that only ‘trace’ levels of imaging agent exist compared to receptor concentration. This strict requirement may limit applicability, particularly in drug binding studies, which seek to report on a biological effect in response to saturating receptors with a drug moiety. To extend the regime over which paired-agent imaging may be used, this work presents a generalized simplified reference tissue model (GSRTM) for paired-agent imaging developed to approximate receptor concentration in both non-receptor-saturated and receptor-saturated conditions. Extensive simulation studies show that tumor receptor concentration estimates recovered using the GSRTM are more accurate in receptor-saturation conditions than the standard simple reference tissue model (SRTM) (% error (mean ± sd): GSRTM 0 ± 1 and SRTM 50 ± 1) and match the SRTM accuracy in non-saturated conditions (% error (mean ± sd): GSRTM 5 ± 5 and SRTM 0 ± 5). To further test the approach, GSRTM-estimated receptor concentration was compared to SRTM-estimated values extracted from tumor xenograft in vivo mouse model data. The GSRTM estimates were observed to deviate from the SRTM in tumors with low receptor expression (which are likely to be in a saturated regime). Finally, a general ‘rule-of-thumb’ algorithm is presented to estimate the expected level of receptor saturation that would be achieved in a given tissue, provided dose and pharmacokinetic information about the drug or imaging agent being used, and physiological information about the tissue.
These studies suggest that the GSRTM is necessary when receptor saturation exceeds 20% and highlight the potential for GSRTM to accurately measure receptor concentrations under saturation conditions, such as might be required during high dose drug studies, or for imaging applications where high concentrations of imaging agent are required to optimize signal-to-noise conditions. This model can also be applied to PET and SPECT imaging studies that tend to suffer from noisier data, but require one less parameter to fit if images are converted to imaging agent concentration (quantitative PET/SPECT).
Racimo, Allison R; Talathi, Nakul S; Zelenski, Nicole A; Wells, Lawrence; Shah, Apurva S
2018-05-02
Price transparency allows patients to make value-based health care decisions and is particularly important for individuals who are uninsured or enrolled in high-deductible health care plans. The availability of consumer prices for children undergoing orthopaedic surgery has not been previously investigated. We aimed to determine the availability of price estimates from hospitals in the United States for an archetypal pediatric orthopaedic surgical procedure (closed reduction and percutaneous pinning of a distal radius fracture) and identify variations in price estimates across hospitals. This prospective investigation utilized a scripted telephone call to obtain price estimates from 50 "top-ranked hospitals" for pediatric orthopaedics and 1 "non-top-ranked hospital" from each state and the District of Columbia. Price estimates were requested using a standardized script, in which an investigator posed as the mother of a child with a displaced distal radius fracture that needed closed reduction and pinning. Price estimates (complete or partial) were recorded for each hospital. The number of calls and the time required to obtain the pricing information were also recorded. Variation was assessed, and hospitals were compared on the basis of ranking, teaching status, and region. Less than half (44%) of the 101 hospitals provided a complete price estimate. The mean price estimate for top-ranked hospitals ($17,813; range, $2742 to $49,063) was 50% higher than the price estimate for non-top-ranked hospitals ($11,866; range, $3623 to $22,967) (P=0.020). Differences in price estimates were attributable to differences in hospital fees (P=0.003), not surgeon fees. Top-ranked hospitals required more calls than non-top-ranked hospitals (4.4±2.9 vs. 2.8±2.3 calls, P=0.003). A longer duration of time was required to obtain price estimates from top-ranked hospitals than from non-top-ranked hospitals (8.2±9.4 vs. 4.1±5.1 d, P=0.024).
Price estimates for pediatric orthopaedic procedures are difficult to obtain. Top-ranked hospitals are more expensive and less likely to provide price information than non-top-ranked hospitals, with price differences primarily caused by variation in hospital fees, not surgeon fees. Level II-economic and decision analyses.
Information-Driven Active Audio-Visual Source Localization
Schult, Niclas; Reineking, Thomas; Kluss, Thorsten; Zetzsche, Christoph
2015-01-01
We present a system for sensorimotor audio-visual source localization on a mobile robot. We utilize a particle filter for the combination of audio-visual information and for the temporal integration of consecutive measurements. Although the system only measures the current direction of the source, the position of the source can be estimated because the robot is able to move and can therefore obtain measurements from different directions. These actions by the robot successively reduce uncertainty about the source’s position. An information gain mechanism is used for selecting the most informative actions in order to minimize the number of actions required to achieve accurate and precise position estimates in azimuth and distance. We show that this mechanism is an efficient solution to the action selection problem for source localization, and that it is able to produce precise position estimates despite simplified unisensory preprocessing. Because of the robot’s mobility, this approach is suitable for use in complex and cluttered environments. We present qualitative and quantitative results of the system’s performance and discuss possible areas of application.
Time-to-contact estimation of accelerated stimuli is based on first-order information.
Benguigui, Nicolas; Ripoll, Hubert; Broderick, Michael P
2003-12-01
The goal of this study was to test whether 1st-order information, which does not account for acceleration, is used (a) to estimate the time to contact (TTC) of an accelerated stimulus after the occlusion of a final part of its trajectory and (b) to indirectly intercept an accelerated stimulus with a thrown projectile. Both tasks require the production of an action on the basis of predictive information acquired before the arrival of the stimulus at the target and allow the experimenter to make quantitative predictions about the participants' use (or nonuse) of 1st-order information. The results show that participants do not use information about acceleration and that they commit errors that rely quantitatively on 1st-order information even when acceleration is psychophysically detectable. In the indirect interceptive task, action is planned about 200 ms before the initiation of the movement, at which time the 1st-order TTC attains a critical value.
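The quantitative predictions about 1st-order errors follow from simple kinematics, as in this sketch with illustrative numbers: the 1st-order estimate d/v drops the acceleration term of d = v·t + a·t²/2 and therefore overestimates TTC for an accelerating stimulus.

```python
def first_order_ttc(d, v):
    """1st-order TTC: distance over current speed, acceleration ignored."""
    return d / v

def true_ttc(d, v, a):
    """Actual arrival time under constant acceleration: positive root of
    d = v*t + a*t**2/2."""
    if a == 0:
        return d / v
    return (-v + (v * v + 2.0 * a * d) ** 0.5) / a

# A stimulus 20 m away at 10 m/s, accelerating at 2 m/s^2:
tau1 = first_order_ttc(20.0, 10.0)   # 2.0 s
tau = true_ttc(20.0, 10.0, 2.0)      # ~1.71 s
```

The 1st-order estimate (2.0 s) exceeds the actual arrival time (~1.71 s), so an observer relying on it responds late — the direction of error the study reports.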
Acquisition of 3D Information for a Vanished Structure by Using Only an Ancient Picture
NASA Astrophysics Data System (ADS)
Kunii, Y.; Sakamoto, R.
2016-06-01
In order to acquire 3D information for the reconstruction of a vanished historical structure, its 3D shape was estimated from a single ancient picture. Generally, 3D information about a structure is acquired by photogrammetry, which requires two or more pictures. This paper shows that geometrical information about a structure can be obtained from a single ancient picture, and that 3D information can be acquired from it. The method was applied to an ancient picture of the Old Imperial Theatre, which appears in the picture in two-point perspective. Estimates of the camera's focal length, the distance from the camera to the Old Imperial Theatre, and other parameters were therefore calculated by estimating the field angle, using body height as a length reference together with other geometrical information. Consequently, the 3D coordinates of 120 measurement points on the surface of the Old Imperial Theatre were calculated, and a 3D CG model of the Old Imperial Theatre was realized.
Job Horizons for College Women.
ERIC Educational Resources Information Center
Barsky, Lillian; Terlin, Rose
Detailed information is provided on a variety of professions for women. Educational requirements, job opportunities and responsibilities, estimated salaries, and opportunities for advancement are discussed for such occupations as accountant, home economist, engineer, occupational therapist, nurse, scientist, real estate agent and broker,…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... uranium enrichment facility in accordance with 10 CFR Parts 40 and 70. 7. An estimate of the number of... amended, and (b) the liability insurance required of uranium enrichment facility licensees pursuant to...
20171015 - Predicting Exposure Pathways with Machine Learning (ISES)
Prioritizing the risk posed to human health from the thousands of chemicals in the environment requires tools that can estimate exposure rates from limited information. High throughput models exist to make predictions of exposure via specific, important pathways such as residenti...
Exposure-Based Prioritization of Chemicals for Risk Assessment
Manufactured chemicals are used extensively to produce a wide variety of consumer goods and are required by important industrial sectors. Presently, information is insufficient to estimate risks posed to human health and the environment from the over ten thousand chemical substan...
Sokhey, Taegh; Gaebler-Spira, Deborah; Kording, Konrad P.
2017-01-01
Background: It is important to understand the motor deficits of children with Cerebral Palsy (CP). Our understanding of this motor disorder can be enriched by computational models of motor control. One crucial stage in generating movement involves combining uncertain information from different sources, and deficits in this process could contribute to reduced motor function in children with CP. Healthy adults can integrate previously-learned information (prior) with incoming sensory information (likelihood) in a close-to-optimal way when estimating object location, consistent with the use of Bayesian statistics. However, there are few studies investigating how children with CP perform sensorimotor integration. We compare sensorimotor estimation in children with CP and age-matched controls using a model-based analysis to understand the process. Methods and findings: We examined Bayesian sensorimotor integration in children with CP, aged between 5 and 12 years old, with Gross Motor Function Classification System (GMFCS) levels 1–3 and compared their estimation behavior with age-matched typically-developing (TD) children. We used a simple sensorimotor estimation task which requires participants to combine probabilistic information from different sources: a likelihood distribution (current sensory information) with a prior distribution (learned target information). In order to examine sensorimotor integration, we quantified how participants weighed statistical information from the two sources (prior and likelihood) and compared this to the statistically optimal weighting. We found that the weighing of statistical information in children with CP was as statistically efficient as that of TD children. Conclusions: We conclude that Bayesian sensorimotor integration is not impaired in children with CP and therefore does not contribute to their motor deficits.
Future research has the potential to enrich our understanding of motor disorders by investigating the stages of motor processing set out by computational models. Therapeutic interventions should exploit the ability of children with CP to use statistical information.
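The close-to-optimal integration tested here has a standard closed form when prior and likelihood are both Gaussian: the estimate weights each source by its reliability (inverse variance). A minimal sketch with illustrative numbers:

```python
def integrate(prior_mean, prior_var, like_mean, like_var):
    """Bayes-optimal combination of a Gaussian prior with a Gaussian
    likelihood: precision-weighted mean, reduced posterior variance."""
    w = (1.0 / like_var) / (1.0 / like_var + 1.0 / prior_var)
    post_mean = w * like_mean + (1.0 - w) * prior_mean
    post_var = 1.0 / (1.0 / like_var + 1.0 / prior_var)
    return post_mean, post_var

# Equally reliable sources -> the estimate falls halfway between them:
m, v = integrate(prior_mean=0.0, prior_var=1.0, like_mean=2.0, like_var=1.0)
```

Comparing participants' actual weight on the likelihood against the optimal w is exactly the model-based comparison the study performs.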
Estimating prefledging survival: Allowing for brood mixing and dependence among brood mates
Flint, Paul L.; Pollock, Kenneth H.; Thomas, Dana; Sedinger, James S.
1995-01-01
Estimates of juvenile survival from hatch to fledging provide important information on waterfowl productivity. We develop a model for estimating survival of young waterfowl from hatch to fledging. Our model enables interchange of individuals among broods and relaxes the assumption that individuals within broods have independent survival probabilities. The model requires repeated observations of individually identifiable adults and their offspring that are not individually identifiable. A modified Kaplan-Meier procedure (Pollock et al. 1989a,b) and a modified Mayfield procedure (Mayfield 1961, 1975; Johnson 1979) can be used under this general modeling framework, and survival rates and corresponding variances of the point estimators can be determined.
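The modified Mayfield procedure builds on the classical estimator, sketched here with hypothetical numbers: daily survival is one minus deaths per exposure-day, and survival over the prefledging period is that daily rate raised to the period length. The paper's modifications for brood mixing and within-brood dependence are not reproduced in this sketch.

```python
def mayfield_daily_survival(deaths, exposure_days):
    """Mayfield estimator of the daily survival rate."""
    return 1.0 - deaths / exposure_days

def period_survival(daily_rate, n_days):
    """Survival over an n-day period, assuming a constant daily rate."""
    return daily_rate ** n_days

# Hypothetical brood data: 6 deaths over 300 exposure-days,
# 30-day hatch-to-fledging period.
dsr = mayfield_daily_survival(6.0, 300.0)   # 0.98
s30 = period_survival(dsr, 30)              # ~0.55
```

Ignoring the dependence among brood mates would understate the variance of such an estimate, which is the assumption the paper's model relaxes.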
NASA Technical Reports Server (NTRS)
Lee, Taesik; Jeziorek, Peter
2004-01-01
Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).
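A bottom-up (activity-based) estimate of the kind classified here multiplies installed power, engine load, operating time, and an emission factor, with AIS data supplying the activity terms. A sketch with illustrative numbers (not IMO values):

```python
def ship_emissions_kg(power_kw, load_factor, hours, ef_g_per_kwh):
    """Bottom-up emission estimate: energy delivered (kWh) times an
    emission factor (g/kWh), converted to kg."""
    energy_kwh = power_kw * load_factor * hours
    return energy_kwh * ef_g_per_kwh / 1000.0

# Illustrative vessel: 10 MW main engine at 70% load for 24 h,
# with an assumed NOx emission factor of 12 g/kWh.
nox_kg = ship_emissions_kg(10_000.0, 0.7, 24.0, 12.0)
```

Each input (load factor, hours at sea, emission factor) carries its own uncertainty, which is why the paper argues for combining sources rather than trusting any single inventory.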
Long-billed curlews on the Yakima Training Center: Information for base realignment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hand, K.D.; Cadwell, L.L.; Eberhardt, L.E.
1994-02-01
This report summarizes and discusses the results obtained during 1992 from the study of long-billed curlews on the Yakima Training Center (YTC), which Pacific Northwest Laboratory conducted for the US Department of the Army. This study was initiated to provide basic ecological information on YTC long-billed curlews (Numenius americanus). The long-billed curlew is a relatively common spring and summer resident on the YTC. However, other than casual observations, very little is known about the distribution, density, reproductive success, and habitat requirements for this species on the YTC. Until recently the long-billed curlew was a US Fish and Wildlife Service candidate for listing as threatened or endangered; however, on November 21, 1991 it was down-listed to Class IIIc. The Washington Department of Wildlife lists the long-billed curlew as a "species of special concern." Specific objectives of this study were to (1) locate nesting areas, (2) locate brood-rearing areas, (3) evaluate habitat requirements, (4) determine diet, (5) evaluate response to troop activities, (6) evaluate the impact of livestock grazing, (7) estimate the population size, and (8) estimate recruitment rates. Six curlews (four females and two males) were captured and fitted with radio transmitters. These birds were relocated to obtain nesting, habitat use, and feeding information. Road surveys conducted over most of the YTC provided information on the bird's general distribution, habitat requirements, and nesting and brood-rearing areas.
NASA Astrophysics Data System (ADS)
Wong, T. E.; Noone, D. C.; Kleiber, W.
2014-12-01
The single largest uncertainty in climate model energy balance is the surface latent heating over tropical land. Furthermore, the partitioning of the total latent heat flux into contributions from surface evaporation and plant transpiration is of great importance, but notoriously poorly constrained. Resolving these issues will require better exploiting information which lies at the interface between observations and advanced modeling tools, both of which are imperfect. There are remarkably few observations which can constrain these fluxes, placing strict requirements on developing statistical methods to maximize the use of limited information to best improve models. Previous work has demonstrated the power of incorporating stable water isotopes into land surface models for further constraining ecosystem processes. We present results from a stable water isotopically-enabled land surface model (iCLM4), including model experiments partitioning the latent heat flux into contributions from plant transpiration and surface evaporation. It is shown that the partitioning results are sensitive to the parameterization of kinetic fractionation used. We discuss and demonstrate an approach to calibrating select model parameters to observational data in a Bayesian estimation framework, requiring Markov Chain Monte Carlo sampling of the posterior distribution, which is shown to constrain uncertain parameters as well as inform relevant values for operational use. Finally, we discuss the application of the estimation scheme to iCLM4, including entropy as a measure of information content and specific challenges which arise in calibrating models with a large number of parameters.
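The MCMC sampling of the posterior can be illustrated with a random-walk Metropolis sampler on a toy one-parameter problem; the actual calibration targets iCLM4 parameters and observational likelihoods not reproduced here.

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis: samples a posterior known only up to a
    normalising constant, accepting proposals with prob min(1, p'/p)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: standard normal (log-density up to a constant),
# discarding the first 1000 draws as burn-in.
draws = metropolis(lambda x: -0.5 * x * x, 0.0)[1000:]
```

The chain's draws approximate the posterior, so the sample mean and spread stand in for the calibrated parameter estimate and its uncertainty.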
MapSentinel: Can the Knowledge of Space Use Improve Indoor Tracking Further?
Jia, Ruoxi; Jin, Ming; Zou, Han; Yesilata, Yigitcan; Xie, Lihua; Spanos, Costas
2016-01-01
Estimating an occupant’s location is arguably the most fundamental sensing task in smart buildings. The applications for fine-grained, responsive building operations require the location sensing systems to provide location estimates in real time, also known as indoor tracking. Existing indoor tracking systems require occupants to carry specialized devices or install programs on their smartphone to collect inertial sensing data. In this paper, we propose MapSentinel, which performs non-intrusive location sensing based on WiFi access points and ultrasonic sensors. MapSentinel combines the noisy sensor readings with the floormap information to estimate locations. One key observation supporting our work is that occupants exhibit distinctive motion characteristics at different locations on the floormap, e.g., constrained motion along the corridor or in the cubicle zones, and free movement in the open space. While extensive research has been performed on using a floormap as a tool to obtain correct walking trajectories without wall-crossings, there have been few attempts to incorporate the knowledge of space use available from the floormap into the location estimation. This paper argues that the knowledge of space use as an additional information source presents new opportunities for indoor tracking. The fusion of heterogeneous information is theoretically formulated within the Factor Graph framework, and the Context-Augmented Particle Filtering algorithm is developed to efficiently solve real-time walking trajectories. Our evaluation in a large office space shows that the MapSentinel can achieve accuracy improvement of 31.3% compared with the purely WiFi-based tracking system.
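The fusion machinery MapSentinel builds on can be sketched as a plain particle filter cycle; this toy version is 1-D and map-free for brevity, whereas the paper's Context-Augmented variant adds floormap-derived motion-mode factors at the prediction step.

```python
import math
import random

def pf_step(particles, weights, move, likelihood, rng):
    """One predict-update-resample cycle of a particle filter."""
    particles = [move(p, rng) for p in particles]                     # predict
    weights = [w * likelihood(p) for p, w in zip(particles, weights)]
    total = sum(weights)
    weights = [w / total for w in weights]                            # update
    particles = rng.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)                 # resample
    return particles, weights

rng = random.Random(0)
parts = [rng.uniform(0.0, 10.0) for _ in range(500)]
wts = [1.0 / 500] * 500
move = lambda p, r: p + r.gauss(0.0, 0.1)                  # near-static occupant
like = lambda p: math.exp(-0.5 * ((p - 5.0) / 0.5) ** 2)   # noisy sensor at 5.0
for _ in range(10):
    parts, wts = pf_step(parts, wts, move, like, rng)
estimate = sum(parts) / len(parts)
```

Space-use knowledge enters such a filter by making `move` depend on where a particle sits (corridor, cubicle, open space), which is the paper's central idea.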
NASA Astrophysics Data System (ADS)
Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.
2014-11-01
Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation; they represent different stages of crop growth with empirical crop coefficients that adapt evapotranspiration throughout the vegetation period. We investigate the importance of model structural vs. model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty is far more important than parametric uncertainty for estimating irrigation water requirements. Using the Reliability Ensemble Averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a given threshold, e.g. a 400 mm irrigation water limit imposed by water rights, would be exceeded less frequently under the REA ensemble average (45%) than under the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
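The ensemble machinery can be sketched simply: REA replaces the equal weights of a plain ensemble mean with reliability-based weights, and the exceedance probability of a threshold is read off the ensemble members. Weights and values here are illustrative, not those of the study.

```python
def weighted_mean(values, weights):
    """Weighted ensemble average (REA derives the weights from each
    model's reliability; here they are simply given)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def exceedance_probability(values, threshold):
    """Fraction of ensemble members exceeding a threshold,
    e.g. a 400 mm irrigation water limit."""
    return sum(v > threshold for v in values) / len(values)

# Four illustrative irrigation-requirement estimates (mm):
irr = [380.0, 420.0, 410.0, 390.0]
equal = weighted_mean(irr, [1.0, 1.0, 1.0, 1.0])   # 400.0
rea = weighted_mean(irr, [3.0, 1.0, 1.0, 3.0])     # 392.5
p_exceed = exceedance_probability(irr, 400.0)      # 0.5
```

Down-weighting the less reliable (here, higher) members pulls the ensemble average below the threshold, mirroring the reduced exceedance frequency the study reports for REA.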
Estimation of biophysical properties of upland Sitka spruce (Picea sitchensis) plantations
NASA Technical Reports Server (NTRS)
Green, Robert M.
1993-01-01
It is widely accepted that estimates of forest above-ground biomass are required as inputs to forest ecosystem models, and that SAR data have the potential to provide such information. This study describes relationships between polarimetric radar backscatter and key biophysical properties of a coniferous plantation in upland central Wales, U.K. Over the test site, topography was relatively complex and was expected to influence the amount of radar backscatter.
A Tracker for Broken and Closely-Spaced Lines
1997-10-01
to combine the current level flow estimate and the previous level flow estimate. However, the result is still not good enough for some reasons. First...geometric attributes are not good enough to discriminate line segments, when they are crowded, parallel and closely-spaced to each other. On the other...level information [10]. Still, it is not good at dealing with closely-spaced line segments. Because it requires a proper size of square neighborhood to
Gracia-Lor, Emma; Castiglioni, Sara; Bade, Richard; Been, Frederic; Castrignanò, Erika; Covaci, Adrian; González-Mariño, Iria; Hapeshi, Evroula; Kasprzyk-Hordern, Barbara; Kinyua, Juliet; Lai, Foon Yin; Letzel, Thomas; Lopardo, Luigi; Meyer, Markus R; O'Brien, Jake; Ramin, Pedram; Rousis, Nikolaos I; Rydevik, Axel; Ryu, Yeonsuk; Santos, Miguel M; Senta, Ivan; Thomaidis, Nikolaos S; Veloutsou, Sofia; Yang, Zhugen; Zuccato, Ettore; Bijlsma, Lubertus
2017-02-01
The information obtained from the chemical analysis of specific human excretion products (biomarkers) in urban wastewater can be used to estimate the exposure or consumption of the population under investigation to a defined substance. A proper biomarker can provide relevant information about lifestyle habits, health and wellbeing, but its selection is not an easy task as it should fulfil several specific requirements in order to be successfully employed. This paper aims to summarize the current knowledge related to the most relevant biomarkers used so far. In addition, some potential wastewater biomarkers that could be used for future applications were evaluated. For this purpose, representative chemical classes have been chosen and grouped in four main categories: (i) those that provide estimates of lifestyle factors and substance use, (ii) those used to estimate the exposure to toxicants present in the environment and food, (iii) those that have the potential to provide information about public health and illness and (iv) those used to estimate the population size. To facilitate the evaluation of the eligibility of a compound as a biomarker, information, when available, on stability in urine and wastewater and pharmacokinetic data (i.e. metabolism and urinary excretion profile) has been reviewed. Finally, several needs and recommendations for future research are proposed.
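For biomarkers whose urinary excretion fraction is known, the standard back-calculation is a one-line mass balance, sketched here with illustrative figures: scale the daily biomarker load in wastewater by the excretion fraction (and a molar-mass correction when the biomarker is a metabolite), then normalise by the population served.

```python
def per_capita_consumption_mg(conc_ng_l, flow_l_day, excretion_fraction,
                              mw_ratio=1.0, population=1):
    """Back-calculate per-capita daily consumption (mg/person/day) from a
    biomarker concentration in raw wastewater."""
    load_mg_day = conc_ng_l * flow_l_day / 1e6   # ng/L * L/day -> mg/day
    parent_mg_day = load_mg_day / excretion_fraction * mw_ratio
    return parent_mg_day / population

# Illustrative: 500 ng/L biomarker, 2e7 L/day flow, 30% urinary excretion,
# parent-to-metabolite molar-mass ratio 1.05, 100,000 inhabitants.
dose = per_capita_consumption_mg(500.0, 2e7, 0.30, 1.05, 100_000)  # 0.35
```

The biomarker-selection criteria the paper reviews (in-sewer stability, well-characterised excretion profile) are exactly the inputs this calculation depends on.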
Empirical Bayes approach to the estimation of "unsafety": the multivariate regression method.
Hauer, E
1992-10-01
There are two kinds of clues to the unsafety of an entity: its traits (such as traffic, geometry, age, or gender) and its historical accident record. The Empirical Bayes approach to unsafety estimation makes use of both kinds of clues. It requires information about the mean and the variance of the unsafety in a "reference population" of similar entities. The method now in use for this purpose suffers from several shortcomings. First, a very large reference population is required. Second, the choice of reference population is to some extent arbitrary. Third, entities in the reference population usually cannot match the traits of the entity the unsafety of which is estimated. To alleviate these shortcomings the multivariate regression method for estimating the mean and variance of unsafety in reference populations is offered. Its logical foundations are described and its soundness is demonstrated. The use of the multivariate method makes the Empirical Bayes approach to unsafety estimation applicable to a wider range of circumstances and yields better estimates of unsafety. The application of the method to the tasks of identifying deviant entities and of estimating the effect of interventions on unsafety are discussed and illustrated by numerical examples.
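The weighting Hauer describes can be made concrete with a minimal sketch, assuming the common gamma-Poisson setup (an assumption here; in the paper, the reference mean and variance come from the multivariate regression method):

```python
def eb_unsafety(m_ref, v_ref, accident_count):
    """Empirical Bayes estimate of an entity's expected accident frequency.

    Blends the reference-population clue (mean m_ref, variance v_ref of
    unsafety among similar entities) with the entity's own accident record,
    assuming a gamma prior on a Poisson accident rate.
    """
    w = m_ref / (m_ref + v_ref)              # weight on the reference mean
    return w * m_ref + (1.0 - w) * accident_count

# A site whose reference population averages 3 accidents/yr (variance 2)
# but which recorded 7 accidents is pulled partway toward its own record:
estimate = eb_unsafety(3.0, 2.0, 7)          # 0.6*3 + 0.4*7 = 4.6
```

The weight shrinks noisy accident counts toward the reference mean; the larger the reference-population variance, the less informative the traits are and the more the historical record dominates.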
Tank characterization report for double-shell tank 241-AW-105
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasaki, L.M.
1997-06-05
One of the major functions of the Tank Waste Remediation System (TWRS) is to characterize wastes in support of waste management and disposal activities at the Hanford Site. Analytical data from sampling and analysis, along with other available information about a tank, are compiled and maintained in a tank characterization report (TCR). This report and its appendices serve as the TCR for double-shell tank 241-AW-105. The objectives of this report are to use characterization data in response to technical issues associated with tank 241-AW-105 waste and to provide a standard characterization of this waste in terms of a best-basis inventory estimate. The response to technical issues is summarized in Section 2.0, and the best-basis inventory estimate is presented in Section 3.0. Recommendations regarding safety status and additional sampling needs are provided in Section 4.0. Supporting data and information are contained in the appendices. This report supports the requirements of the Hanford Federal Facility Agreement and Consent Order characterization milestone. Information presented in this report originated from sample analyses and known historical sources. While only the results of a recent sampling event will be used to fulfill the requirements of the data quality objectives (DQOs), other information can be used to support or question conclusions derived from these results. Historical information for tank 241-AW-105 is provided in Appendix A, including surveillance information, records pertaining to waste transfers and tank operations, and expected tank contents derived from a process knowledge model. The recent sampling event listed, as well as pertinent sample data obtained before 1996, are summarized in Appendix B along with the sampling results. The results of the 1996 grab sampling event satisfied the data requirements specified in the sampling and analysis plan (SAP) for this tank.
In addition, the tank headspace flammability was measured, which addresses one of the requirements specified in the safety screening DQO. The statistical analysis and numerical manipulation of data used in issue resolution are reported in Appendix C. Appendix D contains the evaluation to establish the best basis for the inventory estimate and the statistical analysis performed for this evaluation. A bibliography that resulted from an in-depth literature search of all known information sources applicable to tank 241-AW-105 and its respective waste types is contained in Appendix E. A majority of the documents listed in Appendix E may be found in the Tank Characterization and Safety Resource Center.
Cultural Consensus Theory: Aggregating Continuous Responses in a Finite Interval
NASA Astrophysics Data System (ADS)
Batchelder, William H.; Strashny, Alex; Romney, A. Kimball
Cultural consensus theory (CCT) consists of cognitive models for aggregating responses of "informants" to test items about some domain of their shared cultural knowledge. This paper develops a CCT model for items requiring bounded numerical responses, e.g. probability estimates, confidence judgments, or similarity judgments. The model assumes that each item generates a latent random representation in each informant, with mean equal to the consensus answer and variance depending jointly on the informant and the location of the consensus answer. The manifest responses may reflect biases of the informants. Markov Chain Monte Carlo (MCMC) methods were used to estimate the model, and simulation studies validated the approach. The model was applied to an existing cross-cultural dataset involving native Japanese and English speakers judging the similarity of emotion terms. The results sharpened earlier studies that showed that both cultures appear to have very similar cognitive representations of emotion terms.
Loague, Keith; Green, Richard E; Giambelluca, Thomas W; Liang, Tony C; Yost, Russell S
2016-11-01
A simple mobility index, when combined with a geographic information system, can be used to generate rating maps which indicate qualitatively the potential for various organic chemicals to leach to groundwater. In this paper we investigate the magnitude of uncertainty associated with pesticide mobility estimates as a result of data uncertainties. Our example is for the Pearl Harbor Basin, Oahu, Hawaii. The two pesticides included in our analysis are atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and diuron [3-(3,4-dichlorophenyl)-1,1-dimethylurea]. The mobility index used here is known as the Attenuation Factor (AF); it requires soil, hydrogeologic, climatic, and chemical information as input data. We employ first-order uncertainty analysis to characterize the uncertainty in estimates of AF resulting from uncertainties in the various input data. Soils in the Pearl Harbor Basin are delineated at the order taxonomic category for this study. Our results show that there can be a significant amount of uncertainty in estimates of pesticide mobility for the Pearl Harbor Basin. This information needs to be considered if future decisions concerning chemical regulation are to be based on estimates of pesticide mobility determined from simple indices. Copyright © 2016. Published by Elsevier B.V.
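For concreteness, one widely used formulation of the Attenuation Factor (following Rao and colleagues) can be sketched as below; the input values and symbols are illustrative assumptions, not taken from this paper:

```python
import math

def attenuation_factor(d, q, theta_fc, rho_b, foc, koc, t_half):
    """Attenuation Factor (AF) pesticide mobility index, in one common form.

    d        depth to groundwater (m)
    q        net recharge rate (m/day)
    theta_fc volumetric water content at field capacity (-)
    rho_b    soil bulk density (kg/L)
    foc      fraction of soil organic carbon (-)
    koc      organic-carbon sorption coefficient (L/kg)
    t_half   pesticide half-life in soil (days)
    """
    rf = 1.0 + rho_b * foc * koc / theta_fc    # retardation factor
    travel_time = d * rf * theta_fc / q        # days to reach groundwater
    return math.exp(-math.log(2.0) * travel_time / t_half)

# AF lies in (0, 1]; values nearer 1 indicate greater leaching potential.
af = attenuation_factor(10.0, 0.001, 0.3, 1.3, 0.02, 100.0, 60.0)
```

A longer half-life or shallower water table raises AF, which is what makes the index sensitive to uncertainty in exactly the soil, hydrogeologic, climatic, and chemical inputs the paper analyzes.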
Small area estimation of proportions with different levels of auxiliary data.
Chandra, Hukum; Kumar, Sushil; Aditya, Kaustav
2018-03-01
Binary data are often of interest in many small area applications. The use of standard small area estimation methods based on linear mixed models becomes problematic for such data. An empirical plug-in predictor (EPP) under a unit-level generalized linear mixed model with a logit link function is often used for the estimation of a small area proportion. However, this EPP requires the availability of unit-level population information for the auxiliary data, which may not always be accessible. As a consequence, in many practical situations this EPP approach cannot be applied. Based on the level of auxiliary information available, different small area predictors for the estimation of proportions are proposed. Analytic and bootstrap approaches to estimating the mean squared error of the proposed small area predictors are also developed. Monte Carlo simulations based on both simulated and real data show that the proposed small area predictors work well for generating small area estimates of proportions and represent a practical alternative to the above approach. The developed predictor is applied to generate estimates of the proportions of indebted farm households at the district level using debt investment survey data from India. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
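The unit-level EPP that the proposed predictors replace can be sketched as follows; the fitted fixed effects `beta` and area random effect `u_area` are assumed to come from a GLMM fit (names and values are illustrative):

```python
import math

def expit(t):
    """Inverse of the logit link."""
    return 1.0 / (1.0 + math.exp(-t))

def epp_proportion(pop_x, beta, u_area):
    """Empirical plug-in predictor of a small-area proportion.

    Averages expit(x'beta + u_area) over every population unit's auxiliary
    vector x in the area -- which is exactly why unit-level population
    auxiliary data must be available for this predictor to be usable.
    """
    preds = [expit(sum(b * xj for b, xj in zip(beta, x)) + u_area)
             for x in pop_x]
    return sum(preds) / len(preds)

# Three population units with an intercept and one covariate:
p_hat = epp_proportion([[1, 0.0], [1, 1.0], [1, 2.0]], [0.0, 0.0], 0.0)
```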
Simultaneous Mean and Covariance Correction Filter for Orbit Estimation.
Wang, Xiaoxu; Pan, Quan; Ding, Zhengtao; Ma, Zhengya
2018-05-05
This paper proposes a novel filtering design, from a viewpoint of identification rather than the conventional nonlinear estimation schemes (NESs), to improve the performance of orbit state estimation for a space target. First, the nonlinear perturbation is modeled as an unknown input (UI) coupled with the orbit state, to avoid the intractable nonlinear perturbation integral (INPI) required by NESs. Second, a simultaneous mean and covariance correction filter (SMCCF), based on a two-stage expectation-maximization (EM) framework, is proposed to analytically fit, or identify, the first two moments (FTM) of the perturbation (viewed as a UI), instead of directly computing the INPI as NESs do. Orbit estimation performance is greatly improved by using the fitted UI FTM to simultaneously correct both the state estimate and its covariance. Third, when enough information is mined, SMCCF should outperform existing NESs and standard identification algorithms (which treat the UI as a constant independent of the state and use only the identified UI mean to correct the state estimate, disregarding its covariance), since it further incorporates the covariance information of the UI in addition to its mean. Finally, simulations demonstrate the superior performance of SMCCF on an orbit estimation example.
Geographic information system/watershed model interface
Fisher, Gary T.
1989-01-01
Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, that performs automated derivation of watershed parameters for hydrologic modeling.
Lawn, Joy E; Bianchi-Jassir, Fiorella; Russell, Neal J; Kohli-Lynch, Maya; Tann, Cally J; Hall, Jennifer; Madrid, Lola; Baker, Carol J; Bartlett, Linda; Cutland, Clare; Gravett, Michael G; Heath, Paul T; Ip, Margaret; Le Doare, Kirsty; Madhi, Shabir A; Rubens, Craig E; Saha, Samir K; Schrag, Stephanie; Sobanjo-ter Meulen, Ajoke; Vekemans, Johan; Seale, Anna C
2017-01-01
Improving maternal, newborn, and child health is central to Sustainable Development Goal targets for 2030, requiring acceleration especially to prevent 5.6 million deaths around the time of birth. Infections contribute to this burden, but etiological data are limited. Group B Streptococcus (GBS) is an important perinatal pathogen, although previously focus has been primarily on liveborn children, especially early-onset disease. In this first of an 11-article supplement, we discuss the following: (1) Why estimate the worldwide burden of GBS disease? (2) What outcomes of GBS in pregnancy should be included? (3) What data and epidemiological parameters are required? (4) What methods and models can be used to transparently estimate this burden of GBS? (5) What are the challenges with available data? and (6) How can estimates address data gaps to better inform GBS interventions including maternal immunization? We review all available GBS data worldwide, including maternal GBS colonization, risk of neonatal disease (with/without intrapartum antibiotic prophylaxis), maternal GBS disease, neonatal/infant GBS disease, and subsequent impairment, plus GBS-associated stillbirth, preterm birth, and neonatal encephalopathy. We summarize our methods for searches, meta-analyses, and modeling including a compartmental model. Our approach is consistent with the World Health Organization (WHO) Guidelines for Accurate and Transparent Health Estimates Reporting (GATHER), published in The Lancet and the Public Library of Science (PLoS). We aim to address priority epidemiological gaps highlighted by WHO to inform potential maternal vaccination. PMID:29117323
NASA Astrophysics Data System (ADS)
Harney, Robert C.
1997-03-01
A novel methodology offering the potential for resolving two of the significant problems of implementing multisensor target recognition systems, i.e., the rational selection of a specific sensor suite and optimal allocation of requirements among sensors, is presented. Based on a sequence of conjectures (and their supporting arguments) concerning the relationship of extractable information content to recognition performance of a sensor system, a set of heuristics (essentially a reformulation of Johnson's criteria applicable to all sensor and data types) is developed. An approach to quantifying the information content of sensor data is described. Coupling this approach with the widely accepted Johnson's criteria for target recognition capabilities results in a quantitative method for comparing the target recognition ability of diverse sensors (imagers, nonimagers, active, passive, electromagnetic, acoustic, etc.). Extension to describing the performance of multiple sensors is straightforward. The application of the technique to sensor selection and requirements allocation is discussed.
NASA Astrophysics Data System (ADS)
Huo, Ming-Xia; Li, Ying
2017-12-01
Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction; no adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g., the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on past error correction data. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced, by a factor increasing with the code distance.
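A generic sketch of the Gaussian-process step, smoothing noisy per-round error-rate estimates with an RBF kernel, is given below; the hyperparameter values are illustrative assumptions, not the paper's:

```python
import numpy as np

def gp_predict(t_train, y_train, t_new, length=5.0, sigma_f=0.05, sigma_n=0.01):
    """One-dimensional GP regression with an RBF (squared-exponential) kernel.

    Treats noisy error-rate estimates y_train observed at rounds t_train as a
    drifting signal and returns the posterior-mean prediction at rounds t_new.
    """
    def k(a, b):
        return sigma_f**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

    K = k(t_train, t_train) + sigma_n**2 * np.eye(len(t_train))  # noisy Gram matrix
    Ks = k(t_new, t_train)                                       # cross-covariance
    return Ks @ np.linalg.solve(K, y_train)                      # posterior mean

# Smooth ten noisy rate estimates and predict between rounds:
rounds = np.arange(10.0)
rates = np.full(10, 0.02)
pred = gp_predict(rounds, rates, np.array([4.5]))
```

Because prediction only requires a linear solve against past data, this kind of estimator can run alongside error correction without interrupting it, matching the protocol's real-time goal.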
The Health Insurance Portability and Accountability Act: security and privacy requirements.
Tribble, D A
2001-05-01
The security and privacy requirements of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and their implications for pharmacy are discussed. HIPAA was enacted to improve the portability of health care insurance for persons leaving jobs. A section of the act encourages the use of electronic communications for health care claims adjudication, mandates the use of new standard code sets and transaction sets, and establishes the need for regulations to protect the security and privacy of individually identifiable health care information. Creating these regulations became the task of the Department of Health and Human Services. Regulations on security have been published for comment. Regulations on privacy and the definition of standard transaction sets and code sets are complete. National identifiers for patients, providers, and payers have not yet been established. The HIPAA regulations on security and privacy will require that pharmacies adopt policies and procedures that limit access to health care information. Existing pharmacy information systems may require upgrading or replacement. Costs of implementation nationwide are estimated to exceed $8 billion. The health care community has two years from the finalization of each regulation to comply with that regulation. The security and privacy requirements of HIPAA will require pharmacies to review their practices regarding the storage, use, and disclosure of protected health care information.
Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.
2016-01-01
Information from various public and private data sources of extremely large sample sizes are now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323
Physiological Capacities: Estimating an Athlete's Potential.
ERIC Educational Resources Information Center
Lemon, Peter W. R.
1982-01-01
Several simple performance tests are described for assessing an athlete's major energy-producing capabilities. The tests are suitable for mass screening because they are easy to administer, require no sophisticated equipment, and can be done quickly. Information for evaluating test results is included. (PP)
45 CFR 602.20 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Section 602.20 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE... subgrant. Financial information must be related to performance or productivity data, including the... agreement. If unit cost data are required, estimates based on available documentation will be accepted...
Estimating upper-stem and limb-wood volume in northeastern hardwoods
Wayne G. Banks; Frederick E. Hampf
1955-01-01
In the nationwide forest survey being made by the U.S. Forest Service, one of the items required is the cubic-foot volume in limbs of hardwood trees. Pulp companies and others have shown interest in this kind of information.
45 CFR 602.20 - Standards for financial management systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Section 602.20 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE... subgrant. Financial information must be related to performance or productivity data, including the... agreement. If unit cost data are required, estimates based on available documentation will be accepted...
Rattray, Megan; Desbrow, Ben; Roberts, Shelley
Nutrition is an important part of recovery for hospitalized patients. The aim of this study was to assess the nutritional adequacy of meals provided to and consumed by patients prescribed a therapeutic diet. Patients (N = 110) prescribed a therapeutic diet (texture-modified, low-fiber, oral fluid, or food allergy or intolerance diets) for medical or nutritional reasons were recruited from six wards of a tertiary hospital. Complete (24-h) dietary provisions and intakes were directly observed and analyzed for energy (kJ) and protein (g) content. A chart audit gathered demographic, clinical, and nutrition-related information to calculate each patient's disease-specific estimated energy and protein requirements. Provisions and intake were considered adequate if they met ≥75% of the patient's estimated requirements. Mean energy and protein provided to patients (5844 ± 2319 kJ, 53 ± 30 g) were significantly lower than their mean estimated requirements (8786 ± 1641 kJ, 86 ± 18 g). Consequently, mean nutrition intake (4088 ± 2423 kJ, 37 ± 28 g) was significantly lower than estimated requirements. Only 37% (41) of patients were provided with, and 18% (20) consumed, adequate nutrition to meet their estimated requirements. No therapeutic diet provided adequate food to meet the energy and protein requirements of all recipients. Patients on oral fluid diets had the highest estimated requirements (9497 ± 1455 kJ, 93 ± 16 g) and the lowest nutrient provision (3497 ± 1388 kJ, 25 ± 19 g) and intake (2156 ± 1394 kJ, 14 ± 14 g). Hospitalized patients prescribed therapeutic diets (particularly fluid-only diets) are at risk for malnutrition. Further research is required to determine the most effective strategies to improve nutritional provision and intake among patients prescribed therapeutic diets. Copyright © 2017 Elsevier Inc. All rights reserved.
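The study's 75% adequacy rule reduces to a one-line check; it is assumed here (not stated explicitly in the abstract) that both energy and protein must meet the threshold:

```python
def is_adequate(provided_kj, provided_protein_g, req_kj, req_protein_g,
                threshold=0.75):
    """Adequacy rule: provision (or intake) is adequate when it meets at
    least 75% of the patient's estimated energy AND protein requirements."""
    return (provided_kj >= threshold * req_kj and
            provided_protein_g >= threshold * req_protein_g)

# Using the study's mean provision vs. mean estimated requirement:
# 5844/8786 = 67% of energy, below the 75% cutoff, so inadequate.
mean_provision_adequate = is_adequate(5844, 53, 8786, 86)
```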
The Impact of Advanced Greenhouse Gas Measurement Science on Policy Goals and Research Strategies
NASA Astrophysics Data System (ADS)
Abrahams, L.; Clavin, C.; McKittrick, A.
2016-12-01
In support of the Paris agreement, accurate characterization of U.S. greenhouse gas (GHG) emissions estimates has been an area of increased scientific focus. Over the last several years, the scientific community has placed significant emphasis on understanding, quantifying, and reconciling measurement and modeling methods that characterize methane emissions from petroleum and natural gas sources. This work has prompted national policy discussions and led to the improvement of regional and national methane emissions estimates. Research campaigns focusing on reconciling atmospheric measurements ("top-down") and process-based emissions estimates ("bottom-up") have sought to identify where measurement technology advances could inform policy objectives. A clear next step is the development and deployment of advanced detection capabilities that could aid U.S. emissions mitigation and verification goals. The breadth of policy-relevant outcomes associated with advances in GHG measurement science is demonstrated by recent improvements in the petroleum and natural gas sector emission estimates in the EPA Greenhouse Gas Inventory, ambitious efforts to apply inverse modeling results to inform or validate the national GHG inventory, and outcomes from federal GHG measurement science technology development programs. In this work, we explore the variety of policy-relevant outcomes impacted by advances in GHG measurement science, with an emphasis on improving GHG inventory estimates, identifying emissions mitigation strategies, and informing technology development requirements.
Using satellite image data to estimate soil moisture
NASA Astrophysics Data System (ADS)
Chuang, Chi-Hung; Yu, Hwa-Lung
2017-04-01
Soil moisture is considered an important parameter in various fields of study, such as hydrology, phenology, and agriculture. In hydrology, soil moisture is a significant parameter for deciding how much rainfall will infiltrate into the permeable layer and become a groundwater resource. Although soil moisture plays a critical role in many environmental studies, it is so far measured mostly with ground instruments such as electromagnetic soil moisture sensors. Ground instrumentation yields direct observations, but the instruments require maintenance and manpower to operate, so they are poorly suited to covering wide regions. Measuring soil moisture over a wide region therefore calls for a different approach. Satellite remote sensing, which captures imagery of the Earth's surface, offers a way to overcome the spatial restrictions of in situ measurement. In this study, we used MODIS data to retrieve daily soil moisture pattern estimates, i.e., the crop water stress index (CWSI), over the year 2015. The estimates are compared with observations at the soil moisture stations of the Taiwan Bureau of Soil and Water Conservation. Results show that satellite remote sensing data can be helpful for soil moisture estimation. Further analysis is required to obtain the optimal parameters for soil moisture estimation in Taiwan.
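In its simplified form, the CWSI is just the position of the observed surface temperature between a well-watered (wet) and a non-transpiring (dry) reference; how those references are derived from MODIS is study-specific and not reproduced here:

```python
def cwsi(ts, ts_wet, ts_dry):
    """Simplified crop water stress index.

    ts      observed land-surface/canopy temperature (K)
    ts_wet  well-watered (lower-bound) reference temperature (K)
    ts_dry  non-transpiring (upper-bound) reference temperature (K)

    Returns 0 for no water stress and 1 for maximum stress.
    """
    return (ts - ts_wet) / (ts_dry - ts_wet)

# A pixel at 305 K between 300 K (wet) and 310 K (dry) references:
stress = cwsi(305.0, 300.0, 310.0)   # 0.5, moderate stress
```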
Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.
Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi
2015-04-22
Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single-marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set-specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene-set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
Environmental Assessment: T-10 Hush House Tinker Air Force Base, Oklahoma
2008-07-01
...require modification of Tinker AFB’s current permits. PUBLIC COMMENTS: A Notice of Availability for public review of the Draft EA was published in the
Perspectives of UV nowcasting to monitor personal pro-health outdoor activities.
Krzyścin, Janusz W; Lesiak, Aleksandra; Narbutt, Joanna; Sobolewski, Piotr; Guzikowski, Jakub
2018-07-01
A nowcasting model for online monitoring of personal outdoor behaviour is proposed; it is envisaged as an effective e-tool for smartphone users. The model estimates the maximum duration of safe (without erythema risk) outdoor activity. Moreover, there are options to estimate the duration of sunbathing needed to obtain an adequate amount of vitamin D3 and the doses necessary for antipsoriatic heliotherapy. The application requires information on the starting time of sunbathing and the user's phototype. At the beginning, the user is informed of the approximate duration of sunbathing required to receive the minimum erythemal dose, an adequate amount of vitamin D3, and the dose necessary for antipsoriatic heliotherapy. After every 20 min, the application recalculates the remaining duration of sunbathing based on the UVI measured in the preceding 20 min. If the estimate of the remaining duration is <20 min, the user is informed that the end of sunbathing is approaching. Finally, a warning signal is sent to stop sunbathing when the measured dose reaches the required dose. The proposed model is verified using data collected at two measuring sites during the warm period of 2017 (1 April-30 September) in two large Polish cities (Warsaw and Lodz). The first instrument represents the UVI monitoring station. The information concerning sunbathing duration, which is sent to a remote user, is evaluated on the basis of UVI measurements collected by the second measuring unit at a distance of ~7 km and ~10 km for Warsaw and Lodz, respectively. Statistical analysis of the differences between sunbathing durations from the nowcasting model and observation shows that the model provides reliable doses received by users during outdoor activities in proximity (~10 km) to the UVI source site. A standard 24-h UVI forecast based on prognostic values of total ozone and cloudiness appears to be valid only for sunny days. Copyright © 2018 Elsevier B.V. All rights reserved.
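The core dose arithmetic behind such a tool can be sketched using the standard relation UVI = 40 × erythemally weighted irradiance (W/m²); the MED value is phototype-dependent, and the 250 J/m² figure below is a textbook value for skin type II, not taken from the paper:

```python
def minutes_to_med(uvi, med_j_per_m2):
    """Minutes of exposure needed to reach one minimal erythemal dose (MED)
    at a constant UV index, via the standard UVI/irradiance relation."""
    irradiance = uvi / 40.0                 # erythemally weighted W/m^2
    return med_j_per_m2 / irradiance / 60.0

# At UVI 6, a skin-type-II MED of ~250 J/m^2 is reached in under half an hour:
safe_minutes = minutes_to_med(6.0, 250.0)
```

A nowcasting application would repeat this calculation with each fresh 20-min UVI measurement and subtract the dose already accumulated, which is exactly the update loop the abstract describes.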
Einav, Liran; Finkelstein, Amy; Schrimpf, Paul
2009-01-01
Much of the extensive empirical literature on insurance markets has focused on whether adverse selection can be detected. Once detected, however, there has been little attempt to quantify its welfare cost, or to assess whether and what potential government interventions may reduce these costs. To do so, we develop a model of annuity contract choice and estimate it using data from the U.K. annuity market. The model allows for private information about mortality risk as well as heterogeneity in preferences over different contract options. We focus on the choice of length of guarantee among individuals who are required to buy annuities. The results suggest that asymmetric information along the guarantee margin reduces welfare relative to a first best symmetric information benchmark by about £127 million per year, or about 2 percent of annuitized wealth. We also find that by requiring that individuals choose the longest guarantee period allowed, mandates could achieve the first-best allocation. However, we estimate that other mandated guarantee lengths would have detrimental effects on welfare. Since determining the optimal mandate is empirically difficult, our findings suggest that achieving welfare gains through mandatory social insurance may be harder in practice than simple theory may suggest. PMID:20592943
Zhang, Han; Wheeler, William; Song, Lei; Yu, Kai
2017-07-07
As meta-analysis results published by consortia of genome-wide association studies (GWASs) become increasingly available, many association summary statistics-based multi-locus tests have been developed to jointly evaluate multiple single-nucleotide polymorphisms (SNPs) to reveal novel genetic architectures of various complex traits. The validity of these approaches relies on the accurate estimate of z-score correlations at considered SNPs, which in turn requires knowledge on the set of SNPs assessed by each study participating in the meta-analysis. However, this exact SNP coverage information is usually unavailable from the meta-analysis results published by GWAS consortia. In the absence of the coverage information, researchers typically estimate the z-score correlations by making oversimplified coverage assumptions. We show through real studies that such a practice can generate highly inflated type I errors, and we demonstrate the proper way to incorporate correct coverage information into multi-locus analyses. We advocate that consortia should make SNP coverage information available when posting their meta-analysis results, and that investigators who develop analytic tools for joint analyses based on summary data should pay attention to the variation in SNP coverage and adjust for it appropriately. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.
Robust estimation of the proportion of treatment effect explained by surrogate marker information.
Parast, Layla; McDermott, Mary M; Tian, Lu
2016-05-10
In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive obtainment procedures, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to make conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
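The quantity being estimated can be illustrated with a deliberately crude, model-free sketch: the residual treatment effect is the within-surrogate-stratum difference in outcomes, weighted by the control group's surrogate distribution, and the proportion explained is one minus residual over total. The binning scheme here is a simplification standing in for the paper's nonparametric (kernel-based) estimator:

```python
def pte_binned(y1, s1, y0, s0, edges):
    """Crude proportion-of-treatment-effect-explained estimate.

    y1, s1 : outcomes and surrogate values in the treated group
    y0, s0 : outcomes and surrogate values in the control group
    edges  : surrogate bin edges
    """
    delta = sum(y1) / len(y1) - sum(y0) / len(y0)       # total treatment effect
    residual = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        g1 = [y for y, s in zip(y1, s1) if lo <= s < hi]
        g0 = [y for y, s in zip(y0, s0) if lo <= s < hi]
        if g1 and g0:
            w0 = len(g0) / len(y0)                      # control bin weight
            residual += w0 * (sum(g1) / len(g1) - sum(g0) / len(g0))
    return 1.0 - residual / delta

# Toy data where the outcome equals the surrogate, so the surrogate
# captures (nearly) all of the treatment effect.
s0 = [0.1, 0.3, 0.5, 0.7]; y0 = list(s0)
s1 = [0.6, 0.8, 1.0, 1.2]; y1 = list(s1)
pte = pte_binned(y1, s1, y0, s0, edges=[0.0, 0.5, 1.0, 1.5])
```

With these numbers the total effect is 0.5 and the residual within-bin effect is 0.05, so the estimate is 0.9: most of the treatment effect is "explained" by the surrogate, as expected when the outcome tracks it directly.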
Arsenault, Joanne E; Brown, Kenneth H
2017-01-01
Background: Previous research indicates that young children in low-income countries (LICs) generally consume greater amounts of protein than published estimates of protein requirements, but this research did not account for protein quality based on the mix of amino acids and the digestibility of ingested protein. Objective: Our objective was to estimate the prevalence of inadequate protein and amino acid intake by young children in LICs, accounting for protein quality. Methods: Seven data sets with information on dietary intake for children (6–35 mo of age) from 6 LICs (Peru, Guatemala, Ecuador, Bangladesh, Uganda, and Zambia) were reanalyzed to estimate protein and amino acid intake and assess adequacy. The protein digestibility–corrected amino acid score of each child’s diet was calculated and multiplied by the original (crude) protein intake to obtain an estimate of available protein intake. Distributions of usual intake were obtained to estimate the prevalence of inadequate protein and amino acid intake for each cohort according to Estimated Average Requirements. Results: The prevalence of inadequate protein intake was highest in breastfeeding children aged 6–8 mo: 24% of Bangladeshi and 16% of Peruvian children. With the exception of Bangladesh, the prevalence of inadequate available protein intake decreased by age 9–12 mo and was very low in all sites (0–2%) after 12 mo of age. Inadequate protein intake in children <12 mo of age was due primarily to low energy intake from complementary foods, not inadequate protein density. Conclusions: Overall, most children consumed protein amounts greater than requirements, except for the younger breastfeeding children, who were consuming low amounts of complementary foods. These findings reinforce previous evidence that dietary protein is not generally limiting for children in LICs compared with estimated requirements for healthy children, even after accounting for protein quality. 
However, unmeasured effects of infection and intestinal dysfunction on the children’s protein requirements could modify this conclusion. PMID:28202639
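The protein-quality adjustment the authors applied — multiplying crude protein intake by the diet's protein digestibility–corrected amino acid score (PDCAAS) — can be sketched as follows. The amino acid contents, reference pattern, and digestibility below are illustrative placeholders, not values from the study:

```python
def pdcaas(mg_per_g_protein, reference_mg_per_g, digestibility):
    """PDCAAS = digestibility x (most limiting amino acid ratio), capped at 1."""
    limiting = min(mg_per_g_protein[aa] / reference_mg_per_g[aa]
                   for aa in reference_mg_per_g)
    return min(1.0, digestibility * limiting)

# Illustrative values only (mg amino acid per g protein).
diet_aa = {"lysine": 45.0, "tryptophan": 8.0, "threonine": 28.0}
reference = {"lysine": 57.0, "tryptophan": 8.5, "threonine": 31.0}

score = pdcaas(diet_aa, reference, digestibility=0.85)
crude_protein_g = 20.0                        # crude protein intake, g/d
available_protein_g = crude_protein_g * score # what is compared to the EAR
```

Here lysine is the limiting amino acid, so the available protein credited to the child is well below the crude intake — the correction that makes the adequacy assessment stricter than a crude-protein comparison.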
Hauschild, L; Lovatto, P A; Pomar, J; Pomar, C
2012-07-01
The objective of this study was to develop and evaluate a mathematical model used to estimate the daily amino acid requirements of individual growing-finishing pigs. The model includes empirical and mechanistic model components. The empirical component estimates daily feed intake (DFI), BW, and daily gain (DG) based on individual pig information collected in real time. Based on DFI, BW, and DG estimates, the mechanistic component uses classic factorial equations to estimate the optimal concentration of amino acids that must be offered to each pig to meet its requirements. The model was evaluated with data from a study that investigated the effect of feeding pigs with a 3-phase or daily multiphase system. The DFI and BW values measured in this study were compared with those estimated by the empirical component of the model. The coherence of the values estimated by the mechanistic component was evaluated by analyzing whether it followed a normal pattern of requirements. Lastly, the proposed model was evaluated by comparing its estimates with those generated by an existing growth model (InraPorc). The precision of the proposed model and InraPorc in estimating DFI and BW was evaluated through the mean absolute error. The empirical component results indicated that the DFI and BW trajectories of individual pigs fed ad libitum could be predicted 1 d (DFI) or 7 d (BW) ahead with an average mean absolute error of 12.45 and 1.85%, respectively. The average mean absolute error obtained with InraPorc for the average individual of the population was 14.72% for DFI and 5.38% for BW. Major differences were observed when estimates from InraPorc were compared with individual observations. The proposed model, however, was effective in tracking the change in DFI and BW for each individual pig. The mechanistic model component estimated the optimal standardized ileal digestible Lys to NE ratio with reasonable between-animal (average CV = 7%) and over-time (average CV = 14%) variation.
Thus, the amino acid requirements estimated by the model are animal- and time-dependent and follow, in real time, the individual DFI and BW growth patterns. The proposed model can follow the average feed intake and weight trajectory of each individual pig in real time with good accuracy. Based on these trajectories and using classical factorial equations, the model makes it possible to estimate dynamically the AA requirements of each animal, taking into account the intake and growth changes of the animal.
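The factorial logic of the mechanistic component — a maintenance term plus a growth term derived from the real-time BW and DG estimates — can be sketched roughly as below. All coefficients are illustrative placeholders I am assuming for the sketch, not the published model's values:

```python
def sid_lys_requirement(bw_kg, gain_g_d,
                        maint_coef=0.036,        # g SID Lys per kg BW^0.75 (assumed)
                        protein_frac_gain=0.16,  # g body protein per g gain (assumed)
                        lys_per_g_protein=0.07,  # g Lys per g body protein (assumed)
                        efficiency=0.72):        # marginal efficiency of Lys use (assumed)
    """Factorial estimate of a pig's daily SID lysine requirement (g/d):
    maintenance (scaled to metabolic weight) plus deposition divided by efficiency."""
    maintenance = maint_coef * bw_kg ** 0.75
    protein_deposition = protein_frac_gain * gain_g_d   # g protein deposited per day
    growth = lys_per_g_protein * protein_deposition / efficiency
    return maintenance + growth

req_small = sid_lys_requirement(bw_kg=30.0, gain_g_d=700.0)
req_large = sid_lys_requirement(bw_kg=90.0, gain_g_d=900.0)
```

Dividing such a requirement by the pig's estimated NE intake would give the animal- and time-dependent SID Lys:NE ratio the abstract refers to.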
Nonlinear models for estimating GSFC travel requirements
NASA Technical Reports Server (NTRS)
Buffalano, C.; Hagan, F. J.
1974-01-01
A methodology is presented for estimating travel requirements for a particular period of time. Travel models were generated using nonlinear regression analysis techniques on a data base of FY-72 and FY-73 information from 79 GSFC projects. Although the subject matter relates to GSFC activities, the type of analysis used and the manner of selecting the relevant variables would be of interest to other NASA centers, government agencies, private corporations and, in general, any organization with a significant travel budget. Models were developed for each type of activity: flight projects (in-house and out-of-house), experiments on non-GSFC projects, international projects, ART/SRT, data analysis, advanced studies, tracking and data, and indirects.
McCullagh, Laura; Schmitz, Susanne; Barry, Michael; Walsh, Cathal
2017-11-01
In Ireland, all new drugs for which reimbursement by the healthcare payer is sought undergo a health technology assessment by the National Centre for Pharmacoeconomics. The National Centre for Pharmacoeconomics estimates expected value of perfect information but not partial expected value of perfect information (owing to the computational expense associated with typical methodologies). The objective of this study was to examine the feasibility and utility of estimating partial expected value of perfect information via a computationally efficient, non-parametric regression approach. This was a retrospective analysis of evaluations of drugs for cancer that had been submitted to the National Centre for Pharmacoeconomics (January 2010 to December 2014 inclusive). Drugs were excluded if cost effective at the submitted price. Drugs were also excluded if concerns existed regarding the validity of the applicants' submissions or if cost-effectiveness model functionality did not allow the required modifications to be made. For each included drug (n = 14), value of information was estimated at the final reimbursement price, at a threshold equivalent to the incremental cost-effectiveness ratio at that price. The expected value of perfect information was estimated from probabilistic analysis. Partial expected value of perfect information was estimated via a non-parametric approach. Input parameters with a population value of at least €1 million were identified as potential targets for research. All partial estimates were determined within minutes. Thirty parameters (across nine models) each had a value of at least €1 million. These were categorised. Collectively, survival analysis parameters were valued at €19.32 million, health state utility parameters at €15.81 million and parameters associated with the cost of treating adverse effects at €6.64 million. Those associated with drug acquisition costs and with the cost of care were valued at €6.51 million and €5.71 million, respectively.
This research demonstrates that the estimation of partial expected value of perfect information via this computationally inexpensive approach could be considered feasible as part of the health technology assessment process for reimbursement purposes within the Irish healthcare system. It might be a useful tool in prioritising future research to decrease decision uncertainty.
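The computational trick — replacing nested Monte Carlo with a regression of simulated net benefit on the parameter of interest — can be sketched with a simple binned-mean smoother standing in for the paper's non-parametric regression, on invented probabilistic-sensitivity-analysis draws (not data from any submission):

```python
import random

def evppi_regression(theta, nb, n_bins=20):
    """Partial EVPI for parameter theta via regression on PSA draws.

    theta : list of PSA draws of the parameter of interest
    nb    : dict option -> list of simulated net benefits (same order as theta)
    """
    n = len(theta)
    order = sorted(range(n), key=lambda i: theta[i])
    fitted = {d: [0.0] * n for d in nb}
    size = max(1, n // n_bins)
    for start in range(0, n, size):
        idx = order[start:start + size]
        for d in nb:
            m = sum(nb[d][i] for i in idx) / len(idx)  # smoothed E[NB_d | theta]
            for i in idx:
                fitted[d][i] = m
    # value of learning theta = E[max_d E(NB_d | theta)] - max_d E(NB_d)
    perfect = sum(max(fitted[d][i] for d in nb) for i in range(n)) / n
    current = max(sum(v) / n for v in nb.values())
    return perfect - current

random.seed(1)
theta = [random.gauss(0, 1) for _ in range(2000)]
nb = {"drug": [1000.0 * t for t in theta],     # drug favoured only when theta > 0
      "comparator": [0.0] * 2000}
evppi = evppi_regression(theta, nb)
```

A single probabilistic analysis is reused for every parameter of interest, which is why such estimates take minutes rather than the hours a two-level Monte Carlo loop would need.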
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
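The core of such a change-point analysis — closed-form maximum-likelihood estimates for each candidate split, compared against an information-based threshold — can be sketched for the simplest case, a single mean shift in Gaussian noise. The penalty used here is a generic BIC-style stand-in, not the paper's frequentist information criterion:

```python
import math

def best_changepoint(x, penalty=None):
    """Single change-point in the mean of Gaussian data, by maximum likelihood.

    Returns (index, accepted): the split minimising the residual sum of
    squares, accepted only if the likelihood gain beats the penalty.
    """
    n = len(x)
    if penalty is None:
        penalty = 2.0 * math.log(n)           # BIC-style stand-in penalty

    def rss(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)

    full = rss(x)
    best_k, best_cost = None, float("inf")
    for k in range(1, n):                     # candidate split points
        cost = rss(x[:k]) + rss(x[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    sigma2 = full / n
    gain = (full - best_cost) / sigma2        # log-likelihood gain, up to constants
    return best_k, gain > penalty

steps = [0.0] * 40 + [4.0] * 40                         # step of height 4 at index 40
wobble = [0.1 * ((i * 7) % 5 - 2) for i in range(80)]   # small deterministic "noise"
k, accepted = best_changepoint([a + b for a, b in zip(steps, wobble)])
```

The per-segment estimators are closed-form (segment means and residual sums), which is what makes exhaustive scans over candidate change points cheap, as the abstract emphasises.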
1984-05-23
Because the cost accounting reports provide the historical cost information for the cost estimating reports, we also tested the reasonableness of... accounting and cost estimating reports must be based on timely and accurate information. The reports, therefore, require the continual attention of... accounting system reported less than half the value of site direct charges (labor, materials, equipment usage, and other costs) that should have been
Digital program for solving the linear stochastic optimal control and estimation problem
NASA Technical Reports Server (NTRS)
Geyser, L. C.; Lehtinen, B.
1975-01-01
A computer program is described which solves the linear stochastic optimal control and estimation (LSOCE) problem by using a time-domain formulation. The LSOCE problem is defined as that of designing controls for a linear time-invariant system which is disturbed by white noise in such a way as to minimize a performance index which is quadratic in state and control variables. The LSOCE problem and solution are outlined; brief descriptions are given of the solution algorithms, and complete descriptions of each subroutine, including usage information and digital listings, are provided. A test case is included, as well as information on the IBM 7090-7094 DCS time and storage requirements.
NASA Technical Reports Server (NTRS)
Balas, Mark J.; Thapa Magar, Kaman S.; Frost, Susan A.
2013-01-01
A theory called Adaptive Disturbance Tracking Control (ADTC) is introduced and used to track the Tip Speed Ratio (TSR) of a 5-MW Horizontal Axis Wind Turbine (HAWT). Since ADTC theory requires wind speed information, a wind disturbance generator model is combined with a lower-order plant model to estimate the wind speed as well as partial states of the wind turbine. In this paper, we present a proof of stability and convergence of ADTC theory with a lower-order estimator and show that the state feedback can be adaptive.
Non-invasive estimation of dissipation from non-equilibrium fluctuations in chemical reactions.
Muy, S; Kundu, A; Lacoste, D
2013-09-28
We show how to extract an estimate of the entropy production from a sufficiently long time series of stationary fluctuations of chemical reactions. This method, which is based on recent work on fluctuation theorems, is direct, non-invasive, does not require any knowledge about the underlying dynamics and is applicable even when only partial information is available. We apply it to simple stochastic models of chemical reactions involving a finite number of states, and for this case, we study how the estimate of dissipation is affected by the degree of coarse-graining present in the input data.
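A minimal version of such an estimator, for a trajectory on a finite state space, scores the asymmetry between forward and reverse transition counts. This is a coarse sketch of the fluctuation-theorem idea, not the paper's estimator, and it assumes every transition pair it scores has been observed in both directions:

```python
import math
from collections import Counter

def entropy_production_per_step(trajectory):
    """Estimate entropy production per step (units of k_B) from the
    asymmetry between forward and reverse transition counts."""
    counts = Counter(zip(trajectory[:-1], trajectory[1:]))
    total = 0.0
    for (i, j), n_ij in counts.items():
        if i < j:                              # score each state pair once
            n_ji = counts.get((j, i), 0)
            if n_ij and n_ji:                  # need both directions observed
                total += (n_ij - n_ji) * math.log(n_ij / n_ji)
    return total / (len(trajectory) - 1)

# A biased cycle 0 -> 1 -> 2 -> 0 with a rare reverse step is dissipative;
# a back-and-forth walk is consistent with equilibrium.
biased = [0, 1, 2, 0, 1, 2, 0, 1, 0, 1, 2, 0, 1, 2, 0]
reversible = [0, 1, 0, 1, 0, 1, 0]
ep_biased = entropy_production_per_step(biased)
ep_rev = entropy_production_per_step(reversible)
```

Coarse-graining the state space merges transition counts and can only hide asymmetry, so estimates of this kind bound the true dissipation from below — the sensitivity the authors study.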
NASA Astrophysics Data System (ADS)
Dafflon, B.; Barrash, W.; Cardiff, M.; Johnson, T. C.
2011-12-01
Reliable predictions of groundwater flow and solute transport require an estimation of the detailed distribution of the parameters (e.g., hydraulic conductivity, effective porosity) controlling these processes. However, such parameters are difficult to estimate because of the inaccessibility and complexity of the subsurface. In this regard, developments in parameter estimation techniques and investigations of field experiments are still challenging and necessary to improve our understanding and the prediction of hydrological processes. Here we analyze a conservative tracer test conducted at the Boise Hydrogeophysical Research Site in 2001 in a heterogeneous unconfined fluvial aquifer. Some relevant characteristics of this test include: variable-density (sinking) effects because of the injection concentration of the bromide tracer, the relatively small size of the experiment, and the availability of various sources of geophysical and hydrological information. The information contained in this experiment is evaluated through several parameter estimation approaches, including a grid-search-based strategy, stochastic simulation of hydrological property distributions, and deterministic inversion using regularization and pilot-point techniques. Doing this allows us to investigate hydraulic conductivity and effective porosity distributions and to compare the effects of assumptions from several methods and parameterizations. Our results provide new insights into the understanding of variable-density transport processes and the hydrological relevance of incorporating various sources of information in parameter estimation approaches. Among others, the variable-density effect and the effective porosity distribution, as well as their coupling with the hydraulic conductivity structure, are seen to be significant in the transport process. The results also show that assumed prior information can strongly influence the estimated distributions of hydrological properties.
Vision-guided gripping of a cylinder
NASA Technical Reports Server (NTRS)
Nicewarner, Keith E.; Kelley, Robert B.
1991-01-01
The motivation for vision-guided servoing is taken from tasks in automated or telerobotic space assembly and construction. Vision-guided servoing requires the ability to perform rapid pose estimates and provide predictive feature tracking. Monocular information from a gripper-mounted camera is used to servo the gripper to grasp a cylinder. The procedure is divided into recognition and servo phases. The recognition stage verifies the presence of a cylinder in the camera field of view. Then an initial pose estimate is computed and uncluttered scan regions are selected. The servo phase processes only the selected scan regions of the image. Given the knowledge, from the recognition phase, that there is a cylinder in the image and knowing the radius of the cylinder, 4 of the 6 pose parameters can be estimated with minimal computation. The relative motion of the cylinder is obtained by using the current pose and prior pose estimates. The motion information is then used to generate a predictive feature-based trajectory for the path of the gripper.
The Rights of Homeless Students.
ERIC Educational Resources Information Center
Strong, Penny
This booklet presents information concerning homelessness and the education of homeless children nationwide and in Illinois. Estimates of the number of homeless children vary widely. Reasons for homeless children's failure to attend school include school residency requirements, delays in transfer of documents, and lack of transportation. The…
77 FR 70165 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-23
... Quarterly Financial Report. OMB No.: 0970-0205. Description: This is a financial report submitted following... Programs Quarterly 62 4 20 4,960 Financial Report Estimated Total Annual Burden Hours: 4,960. In compliance... requirement to report program expenditures made in the
Estimating future flood frequency and magnitude in basins affected by glacier wastage.
DOT National Transportation Integrated Search
2014-10-01
Infrastructure, such as bridge crossings, requires informed structural designs in order to be effective and reliable for decades. A typical bridge is intended to operate for 75 years or more, a period of time anticipated to exhibit a warming clim...
NASA Astrophysics Data System (ADS)
Koshigai, Masaru; Marui, Atsunao
The water table provides important information for evaluating groundwater resources. Estimating the water table over wide areas is increasingly required for effective evaluation of groundwater resources, but the process faces technical and economic constraints. Regression analysis predicting groundwater levels from geomorphologic and geologic conditions is considered a reliable tool for estimating the water table over wide areas. Groundwater-level data were extracted from a public database of geotechnical information. Groundwater levels were observed to vary with climate conditions, and systematic variations according to geomorphologic and geologic conditions were also confirmed. The objective variable of the regression analysis was groundwater level; the explanatory variables were elevation and a dummy variable encoding group number. The constructed regression formula was significant according to the determination coefficients and analysis of variance. Therefore, by combining the regression formula with a mesh map, a statistical method to estimate the water table from geomorphologic and geologic conditions for the whole country could be established.
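The regression described — groundwater level on elevation plus a geomorphologic/geologic group dummy — can be sketched with ordinary least squares solved by hand. The borehole records below are invented and constructed to fit the model exactly:

```python
def fit_ols(rows):
    """Ordinary least squares for: level = b0 + b1*elevation + b2*group_dummy.

    rows: list of (elevation, group_dummy, groundwater_level).
    Solves the normal equations (X'X) b = X'y by Gauss-Jordan elimination.
    """
    X = [[1.0, e, g] for e, g, _ in rows]
    y = [lvl for _, _, lvl in rows]
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    a = [row[:] + [v] for row, v in zip(xtx, xty)]   # augmented matrix
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(a[r][c]))  # partial pivoting
        a[c], a[p] = a[p], a[c]
        for r in range(3):
            if r != c:
                f = a[r][c] / a[c][c]
                a[r] = [x - f * z for x, z in zip(a[r], a[c])]
    return [a[i][3] / a[i][i] for i in range(3)]

# Illustrative records: (elevation m, lowland dummy, groundwater level m).
data = [(10, 1, 8), (20, 1, 17), (30, 1, 26),
        (40, 0, 30), (50, 0, 39), (60, 0, 48)]
b0, b1, b2 = fit_ols(data)
```

Here the fitted surface is level = -6 + 0.9·elevation + 5·dummy: the dummy shifts the whole elevation-level relationship for the lowland group, which is the role the abstract's group variable plays.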
A critical look at national monitoring programs for birds and other wildlife species
Sauer, J.R.; O'Shea, T.J.; Bogon, M.A.
2003-01-01
Concerns about declines in numerous taxa have created a great deal of interest in survey development. Because birds have traditionally been monitored by a variety of methods, bird surveys form natural models for development of surveys for other taxa. Here I suggest that most bird surveys are not appropriate models for survey design. Most lack important design components associated with estimation of population parameters at sample sites or with sampling over space, leading to estimates that may be biased. I discuss the limitations of national bird monitoring programs designed to monitor population size. Although these surveys are often analyzed, careful consideration must be given to factors that may bias estimates but that cannot be evaluated within the survey. Bird surveys with appropriate designs have generally been developed as part of management programs that have specific information needs. Experiences gained from bird surveys provide important information for development of surveys for other taxa, and statistical developments in estimation of population sizes from counts provide new approaches to overcoming the limitations evident in many bird surveys. Design of surveys is a collaborative effort, requiring input from biologists, statisticians, and the managers who will use the information from the surveys.
Estimating demographic parameters using a combination of known-fate and open N-mixture models
Schmidt, Joshua H.; Johnson, Devin S.; Lindberg, Mark S.; Adams, Layne G.
2015-01-01
Accurate estimates of demographic parameters are required to infer appropriate ecological relationships and inform management actions. Known-fate data from marked individuals are commonly used to estimate survival rates, whereas N-mixture models use count data from unmarked individuals to estimate multiple demographic parameters. However, a joint approach combining the strengths of both analytical tools has not been developed. Here we develop an integrated model combining known-fate and open N-mixture models, allowing the estimation of detection probability, recruitment, and the joint estimation of survival. We demonstrate our approach through both simulations and an applied example using four years of known-fate and pack count data for wolves (Canis lupus). Simulation results indicated that the integrated model reliably recovered parameters with no evidence of bias, and survival estimates were more precise under the joint model. Results from the applied example indicated that the marked sample of wolves was biased toward individuals with higher apparent survival rates than the unmarked pack mates, suggesting that joint estimates may be more representative of the overall population. Our integrated model is a practical approach for reducing bias while increasing precision and the amount of information gained from mark–resight data sets. We provide implementations in both the BUGS language and an R package.
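The known-fate half of such an integrated model reduces, in its simplest form, to a binomial survival likelihood on the marked individuals; the sketch below (not the authors' BUGS/R implementation, and ignoring the detection probability their N-mixture component estimates) also shows how successive counts then imply recruitment:

```python
import math

def known_fate_survival(n_monitored, n_survived):
    """Binomial MLE of survival from marked, always-detected individuals,
    with the usual binomial standard error."""
    s = n_survived / n_monitored
    se = math.sqrt(s * (1 - s) / n_monitored)
    return s, se

def recruitment_from_counts(counts, survival):
    """Per-capita recruitment implied by successive population counts,
    under N[t+1] = survival * N[t] + recruits (counts assumed complete)."""
    return [(counts[t + 1] - survival * counts[t]) / counts[t]
            for t in range(len(counts) - 1)]

s_hat, se_hat = known_fate_survival(n_monitored=40, n_survived=30)
recruit = recruitment_from_counts([100, 95, 98], survival=s_hat)
```

The abstract's point is precisely that treating these two pieces separately wastes information and can mislead when marked animals are unrepresentative; the joint model shares the survival parameter between both likelihoods instead.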
Remontet, L; Bossard, N; Belot, A; Estève, J
2007-05-10
Relative survival provides a measure of the proportion of patients dying from the disease under study without requiring the knowledge of the cause of death. We propose an overall strategy based on regression models to estimate the relative survival and model the effects of potential prognostic factors. The baseline hazard was modelled until 10 years follow-up using parametric continuous functions. Six models including cubic regression splines were considered and the Akaike Information Criterion was used to select the final model. This approach yielded smooth and reliable estimates of mortality hazard and allowed us to deal with sparse data taking into account all the available information. Splines were also used to model simultaneously non-linear effects of continuous covariates and time-dependent hazard ratios. This led to a graphical representation of the hazard ratio that can be useful for clinical interpretation. Estimates of these models were obtained by likelihood maximization. We showed that these estimates could be also obtained using standard algorithms for Poisson regression. Copyright 2006 John Wiley & Sons, Ltd.
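The model-selection step — choosing among the six candidate baseline-hazard spline models by Akaike Information Criterion — follows the standard definition. A generic sketch, where the fitted log-likelihoods and parameter counts are made-up illustrations rather than values from the paper:

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: 2k - 2*logL, smaller is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits: (name, maximised log-likelihood, number of parameters).
candidates = [("linear baseline", -1050.0, 4),
              ("cubic spline, 2 knots", -1032.0, 8),
              ("cubic spline, 5 knots", -1030.5, 14)]

scored = [(name, aic(ll, k)) for name, ll, k in candidates]
best = min(scored, key=lambda t: t[1])
```

Note how the 5-knot spline fits slightly better but loses on AIC: the criterion penalises the extra parameters, which is what keeps the selected baseline hazard smooth when the data are sparse.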
Reconstruction of financial networks for robust estimation of systemic risk
NASA Astrophysics Data System (ADS)
Mastromatteo, Iacopo; Zarinelli, Elia; Marsili, Matteo
2012-03-01
In this paper we estimate the propagation of liquidity shocks through interbank markets when the information about the underlying credit network is incomplete. We show that techniques such as maximum entropy currently used to reconstruct credit networks severely underestimate the risk of contagion by assuming a trivial (fully connected) topology, a type of network structure which can be very different from the one empirically observed. We propose an efficient message-passing algorithm to explore the space of possible network structures and show that a correct estimation of the network degree of connectedness leads to more reliable estimates of systemic risk. Such an algorithm is also able to produce maximally fragile structures, providing a practical upper bound for the risk of contagion when the actual network structure is unknown. We test our algorithm on ensembles of synthetic data encoding some features of real financial networks (sparsity and heterogeneity), finding that more accurate estimates of risk can be achieved. Finally we find that this algorithm can be used to control the amount of information that regulators need to require from banks in order to sufficiently constrain the reconstruction of financial networks.
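The maximum-entropy baseline the authors criticise can be reproduced with iterative proportional fitting: each bank's total interbank lending (row sums) and borrowing (column sums) are spread as evenly as the margins allow, which yields exactly the dense, fully connected topology the paper argues understates contagion. A sketch with toy balance-sheet totals:

```python
def max_entropy_network(assets, liabilities, n_iter=500):
    """Reconstruct an interbank exposure matrix from row sums (assets)
    and column sums (liabilities) by iterative proportional fitting.
    Produces the dense maximum-entropy solution with a zero diagonal."""
    n = len(assets)
    # Start uniform, forbidding self-lending on the diagonal.
    w = [[0.0 if i == j else 1.0 for j in range(n)] for i in range(n)]
    for _ in range(n_iter):
        for i in range(n):                           # rescale to match row sums
            s = sum(w[i])
            if s:
                w[i] = [v * assets[i] / s for v in w[i]]
        for j in range(n):                           # rescale to match column sums
            s = sum(w[i][j] for i in range(n))
            if s:
                for i in range(n):
                    w[i][j] *= liabilities[j] / s
    return w

assets = [30.0, 20.0, 10.0]       # toy totals; both margins sum to 60
liabilities = [25.0, 20.0, 15.0]
w = max_entropy_network(assets, liabilities)
```

Every off-diagonal entry comes out strictly positive, so shocks are diluted across all counterparties at once; the paper's message-passing approach instead samples sparse networks consistent with the same margins.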
A comparison of approaches for estimating bottom-sediment mass in large reservoirs
Juracek, Kyle E.
2006-01-01
Estimates of sediment and sediment-associated constituent loads and yields from drainage basins are necessary for the management of reservoir-basin systems to address important issues such as reservoir sedimentation and eutrophication. One method for the estimation of loads and yields requires a determination of the total mass of sediment deposited in a reservoir. This method involves a sediment volume-to-mass conversion using bulk-density information. A comparison of four computational approaches (partition, mean, midpoint, strategic) for using bulk-density information to estimate total bottom-sediment mass in four large reservoirs indicated that the differences among the approaches were not statistically significant. However, the lack of statistical significance may be a result of the small sample size. Compared to the partition approach, which was presumed to provide the most accurate estimates of bottom-sediment mass, the results achieved using the strategic, mean, and midpoint approaches differed by as much as 4, 20, and 44 percent, respectively. It was concluded that the strategic approach may merit further investigation as a less time-consuming and less costly alternative to the partition approach.
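The volume-to-mass conversion behind the comparison can be sketched for two of the approaches; the zone volumes and bulk densities below are invented for illustration, not values from the four reservoirs:

```python
def mass_partition(volumes, bulk_densities):
    """Partition approach: convert each sediment zone separately, then sum."""
    return sum(v * d for v, d in zip(volumes, bulk_densities))

def mass_mean(volumes, bulk_densities):
    """Mean approach: total volume times the unweighted mean bulk density."""
    mean_d = sum(bulk_densities) / len(bulk_densities)
    return sum(volumes) * mean_d

volumes = [2.0e6, 0.5e6, 0.1e6]     # m^3 per zone (illustrative)
densities = [1.10, 0.75, 0.40]      # t/m^3, coarser near the inflow, finer near the dam

m_part = mass_partition(volumes, densities)
m_mean = mass_mean(volumes, densities)
pct_diff = 100.0 * (m_mean - m_part) / m_part
```

When the largest volumes coincide with the densest sediment, as here, the unweighted mean density understates total mass by roughly a quarter, illustrating how the simpler approaches can diverge from the partition approach by the magnitudes the abstract reports.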
Integrating Aggregate Exposure Pathway (AEP) and Adverse ...
High throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have none because of the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vitro with adverse outcomes in vivo, and requires knowledge of how estimated exposure to the chemicals compares with the in vitro concentrations that show an effect. This abstract discusses how Adverse Outcome Pathways (AOPs) can be used to link HTT with adverse outcomes of regulatory significance, and how Aggregate Exposure Pathways (AEPs) can connect concentrations of environmental stressors at a source with an expected target-site concentration, providing exposure estimates that are comparable to concentrations identified in HTT. Presentation at the ICCA-LRI and JRC Workshop: Fit-For-Purpose Exposure Assessment for Risk-Based Decision Making
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-21
...The proposed information collection requirement described below has been submitted to the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal. Estimates derived from the RHFS sample will help public and private stakeholders better understand the financing, operating costs, and property characteristics of the multifamily rental housing stock in the United States. Many of the questions are similar to those found on the 1995 Property Owners and Managers Survey and the rental housing portion of the 2001 Residential Finance Survey.
Heer, D M; Passel, J F
1987-01-01
This article compares 2 different methods for estimating the number of undocumented Mexican adults in Los Angeles County. The 1st method, the survey-based method, uses a combination of 1980 census data and the results of a survey conducted in Los Angeles County in 1980 and 1981. A sample was selected from babies born in Los Angeles County who had a mother or father of Mexican origin. The survey included questions about the legal status of the baby's parents and certain other relatives. The resulting estimates of undocumented Mexican immigrants are for males aged 18-44 and females aged 18-39. The 2nd method, the residual method, involves comparison of census figures for aliens counted with estimates of legally-resident aliens developed principally with data from the Immigration and Naturalization Service (INS). For this study, estimates by age, sex, and period of entry were produced for persons born in Mexico and living in Los Angeles County. The results of this research indicate that it is possible to measure undocumented immigration with different techniques, yet obtain results that are similar. Both techniques presented here are limited in that they represent estimates of undocumented aliens based on the 1980 census. The number of additional undocumented aliens not counted remains a subject of conjecture. The fact that the proportions undocumented shown in the survey (228,700) are quite similar to the residual estimates (317,800) suggests that the number of undocumented aliens not counted in the census may not be an extremely large fraction of the undocumented population. The survey-based estimates have some significant advantages over the residual estimates. The survey provides tabulations of the undocumented population by characteristics other than the limited demographic information provided by the residual technique. 
On the other hand, the survey-based estimates require that a survey be conducted and, if national or regional estimates are called for, they may require a number of surveys. The residual technique, however, also requires a data source other than the census. However, the INS discontinued the annual registration of aliens after 1981. Thus, estimates of undocumented aliens based on the residual technique will probably not be possible for subnational areas using the 1990 census unless the registration program is reinstituted. Perhaps the best information on the undocumented population in the 1990 census will come from an improved version of the survey-based technique described here applied in selected local areas.
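The residual method described above reduces to a simple subtraction within each demographic cell; a minimal sketch with hypothetical counts (the function name and figures are illustrative, not the study's):

```python
def residual_estimate(census_count, legal_residents):
    """Undocumented = counted foreign-born minus estimated legal residents."""
    return max(census_count - legal_residents, 0)

# Hypothetical figures for a single age/sex/period-of-entry cell:
print(residual_estimate(50_000, 38_000))  # 12000
```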
Value of Information Analysis for Time-lapse Seismic Data by Simulation-Regression
NASA Astrophysics Data System (ADS)
Dutta, G.; Mukerji, T.; Eidsvik, J.
2016-12-01
A novel method to estimate the Value of Information (VOI) of time-lapse seismic data in the context of reservoir development is proposed. VOI is a decision-analytic metric quantifying the incremental value that would be created by collecting information prior to making a decision under uncertainty. The VOI has to be computed before collecting the information and can be used to justify its collection. Previous work on estimating the VOI of geophysical data has involved explicitly approximating the posterior distribution of reservoir properties given the data and then evaluating the prospect values under that posterior distribution. Here, we propose to directly estimate the prospect values given the data by building a statistical relationship between them using regression. Various regression techniques such as Partial Least Squares Regression (PLSR), Multivariate Adaptive Regression Splines (MARS) and k-Nearest Neighbors (k-NN) are used to estimate the VOI, and the results are compared. For a univariate Gaussian case, the VOI obtained from simulation-regression has been shown to be close to the analytical solution. Estimating VOI by simulation-regression is much less computationally expensive, since the posterior distribution of reservoir properties given each possible dataset need not be modeled and the prospect values need not be evaluated for each such posterior. This method is also flexible, since it does not require a rigid model specification of the posterior but rather fits conditional expectations non-parametrically from samples of values and data.
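The simulation-regression idea can be sketched in a few lines for the univariate Gaussian case the abstract mentions. Everything below is a toy stand-in: a scalar prospect value with Gaussian noise, and a simple windowed k-NN regression in place of the paper's PLSR/MARS/k-NN implementations (the analytic VOI for this setup is sqrt(0.5)/sqrt(2*pi), about 0.282):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
value = rng.normal(0.0, 1.0, n)            # prospect value v (prior mean 0)
data = value + rng.normal(0.0, 1.0, n)     # noisy time-lapse observable y

# Without the data: develop only if the prior mean value is positive.
prior_value = max(value.mean(), 0.0)

# Regress v on y with a windowed k-NN on the sorted data (a stand-in for
# the paper's PLSR/MARS/k-NN regressions).
k = 100
order = np.argsort(data)
v_sorted = value[order]
cond_mean = np.array([v_sorted[max(0, i - k // 2): i + k // 2].mean()
                      for i in range(n)])

# With the data: for each simulated outcome, develop only if E[v | y] > 0.
posterior_value = np.maximum(cond_mean, 0.0).mean()
voi = posterior_value - prior_value        # analytic value here: ~0.282
print(round(voi, 3))
```

No posterior distribution is ever modeled explicitly; the regression of values on data is the whole machinery.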
Ferreira, Diogo C; van der Linden, Marx G; de Oliveira, Leandro C; Onuchic, José N; de Araújo, Antônio F Pereira
2016-04-01
Recent ab initio folding simulations for a limited number of small proteins have corroborated a previous suggestion that atomic burial information obtainable from sequence could be sufficient for tertiary structure determination when combined with sequence-independent geometrical constraints. Here, we use simulations parameterized by native burials to investigate the required amount of information in a diverse set of globular proteins comprising different structural classes and a wide size range. Burial information is provided by a potential term pushing each atom towards one among a small number L of equiprobable concentric layers. An upper bound for the required information is provided by the minimal number of layers L(min) still compatible with correct folding behavior. We obtain L(min) between 3 and 5 for seven small to medium proteins with 50 ≤ Nr ≤ 110 residues, while for a larger protein with Nr = 141 we find that L ≥ 6 is required to maintain native stability. We additionally estimate the usable redundancy for a given L ≥ L(min) from the burial entropy associated with the largest folding-compatible fraction of "superfluous" atoms, for which the burial term can be turned off or target layers can be chosen randomly. The estimated redundancy for small proteins with L = 4 is close to 0.8. Our results are consistent with the above-average quality of burial predictions used in previous simulations and indicate that the fraction of approachable proteins could increase significantly with even a mild, plausible improvement in sequence-dependent burial prediction or in sequence-independent constraints that augment the detectable redundancy during simulations. © 2016 Wiley Periodicals, Inc.
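For orientation, the information carried by a burial assignment is easy to bound: with L equiprobable layers, specifying one layer per atom carries at most log2(L) bits per atom. A small sketch (the atom count is a hypothetical figure, not from the paper):

```python
import math

# With L equiprobable layers, a burial assignment carries at most log2(L)
# bits of information per atom.
for L in (3, 4, 5, 6):
    print(L, round(math.log2(L), 3))

# A hypothetical small protein with ~800 heavy atoms and L = 4 layers:
bits = 800 * math.log2(4)
print(bits)  # 1600.0 bits, before discounting the ~0.8 usable redundancy
```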
Transfer Entropy as a Log-Likelihood Ratio
NASA Astrophysics Data System (ADS)
Barnett, Lionel; Bossomaier, Terry
2012-09-01
Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
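For finite-state Markov chains, where the abstract notes that no explicit model is required, the stated equivalence can be checked numerically with plug-in (maximum-likelihood) estimates: the log-likelihood ratio of the full versus restricted transition model equals 2N times the estimated transfer entropy (in nats). A sketch with a synthetic coupled binary pair (the coupling strengths are arbitrary):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
N = 5000
x = rng.integers(0, 2, N + 1)                 # source process X
y = np.zeros(N + 1, dtype=int)                # target process Y
for t in range(N):                            # y_{t+1} depends on (x_t, y_t)
    y[t + 1] = rng.random() < (0.8 if x[t] == y[t] else 0.3)

n_xyy = Counter(zip(x[:-1], y[:-1], y[1:]))   # n(x_t, y_t, y_{t+1})
n_yy = Counter(zip(y[:-1], y[1:]))            # n(y_t, y_{t+1})
n_xy = Counter(zip(x[:-1], y[:-1]))
n_y = Counter(y[:-1].tolist())

# Plug-in transfer entropy (nats): TE = H(Y'|Y) - H(Y'|X,Y).
H_y = -sum(n / N * np.log(n / n_y[b]) for (b, c), n in n_yy.items())
H_xy = -sum(n / N * np.log(n / n_xy[(a, b)]) for (a, b, c), n in n_xyy.items())
te = H_y - H_xy

# Log-likelihood ratio of the full vs. restricted Markov model.
ll_full = sum(n * np.log(n / n_xy[(a, b)]) for (a, b, c), n in n_xyy.items())
ll_restr = sum(n * np.log(n / n_y[b]) for (b, c), n in n_yy.items())
llr = 2.0 * (ll_full - ll_restr)

assert np.isclose(llr, 2 * N * te)            # the identity LLR = 2*N*TE
print(round(te, 4))
```

The identity is exact for maximum-likelihood transition estimates, which is what makes the asymptotic chi-squared distribution of the scaled transfer entropy estimator available for significance testing.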
Remote sensing as a tool for estimating soil erosion potential
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.
1979-01-01
The Universal Soil Loss Equation is a frequently used methodology for estimating soil erosion potential. The Universal Soil Loss Equation requires a variety of types of geographic information (e.g. topographic slope, soil erodibility, land use, crop type, and soil conservation practice) in order to function. This information is traditionally gathered from topographic maps, soil surveys, field surveys, and interviews with farmers. Remote sensing data sources and interpretation techniques provide an alternative method for collecting information regarding land use, crop type, and soil conservation practice. Airphoto interpretation techniques and medium altitude, multi-date color and color infrared positive transparencies (70mm) were utilized in this study to determine their effectiveness for gathering the desired land use/land cover data. Successful results were obtained within the test site, a 6136 hectare watershed in Dane County, Wisconsin.
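The Universal Soil Loss Equation itself is a product of the factors listed above, A = R · K · LS · C · P; a minimal sketch with hypothetical factor values:

```python
# A = R * K * LS * C * P (Universal Soil Loss Equation)
def usle(R, K, LS, C, P):
    """Average annual soil loss A (tons/acre/year in US customary units)."""
    return R * K * LS * C * P

# Hypothetical factor values for a cultivated field: rainfall erosivity R,
# soil erodibility K, slope length-steepness LS, cover C, practice P.
A = usle(R=125, K=0.30, LS=1.2, C=0.25, P=0.5)
print(round(A, 3))  # 5.625
```

Remote sensing feeds the C and P factors (land use, crop type, conservation practice), while topographic maps and soil surveys supply LS and K.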
Wiener, Renda Soylemez; Schwartz, Lisa M; Woloshin, Steven; Welch, H Gilbert
2011-08-02
Background: Because pulmonary nodules are found in up to 25% of patients undergoing computed tomography of the chest, the question of whether to perform biopsy is becoming increasingly common. Data on complications after transthoracic needle lung biopsy are limited to case series from selected institutions. Objective: To determine population-based estimates of risks for complications after transthoracic needle biopsy of a pulmonary nodule. Design: Cross-sectional analysis. Setting: The 2006 State Ambulatory Surgery Databases and State Inpatient Databases for California, Florida, Michigan, and New York from the Healthcare Cost and Utilization Project. Patients: 15,865 adults who had transthoracic needle biopsy of a pulmonary nodule. Measurements: Percentage of biopsies complicated by hemorrhage, any pneumothorax, or pneumothorax requiring a chest tube, and adjusted odds ratios for these complications associated with various biopsy characteristics, calculated by using multivariate, population-averaged generalized estimating equations. Results: Although hemorrhage was rare, complicating 1.0% (95% CI, 0.9% to 1.2%) of biopsies, 17.8% (CI, 11.8% to 23.8%) of patients with hemorrhage required a blood transfusion. In contrast, the risk for any pneumothorax was 15.0% (CI, 14.0% to 16.0%), and 6.6% (CI, 6.0% to 7.2%) of all biopsies resulted in pneumothorax requiring a chest tube. Compared with patients without complications, those who experienced hemorrhage or pneumothorax requiring a chest tube had longer lengths of stay (P < 0.001) and were more likely to develop respiratory failure requiring mechanical ventilation (P = 0.020). Patients aged 60 to 69 years (as opposed to younger or older patients), smokers, and those with chronic obstructive pulmonary disease had higher risk for complications. Limitations: Estimated risks may be inaccurate if coding of complications is incomplete. The analyzed databases contain little clinical detail (such as information on nodule characteristics or biopsy pathology) and cannot indicate whether performing the biopsy produced useful information. Conclusion: Whereas hemorrhage is an infrequent complication of transthoracic needle lung biopsy, pneumothorax is common and often necessitates chest tube placement. These population-based data should help patients and physicians make more informed choices about whether to perform biopsy of a pulmonary nodule. Primary Funding Source: Department of Veterans Affairs and National Cancer Institute.
Hydrologic Process-oriented Optimization of Electrical Resistivity Tomography
NASA Astrophysics Data System (ADS)
Hinnell, A.; Bechtold, M.; Ferre, T. A.; van der Kruk, J.
2010-12-01
Electrical resistivity tomography (ERT) is commonly used in hydrologic investigations. Advances in joint and coupled hydrogeophysical inversion have enhanced the quantitative use of ERT to construct and condition hydrologic models (i.e. identify hydrologic structure and estimate hydrologic parameters). However the selection of which electrical resistivity data to collect and use is often determined by a combination of data requirements for geophysical analysis, intuition on the part of the hydrogeophysicist and logistical constraints of the laboratory or field site. One of the advantages of coupled hydrogeophysical inversion is the direct link between the hydrologic model and the individual geophysical data used to condition the model. That is, there is no requirement to collect geophysical data suitable for independent geophysical inversion. The geophysical measurements collected can be optimized for estimation of hydrologic model parameters rather than to develop a geophysical model. Using a synthetic model of drip irrigation we evaluate the value of individual resistivity measurements to describe the soil hydraulic properties and then use this information to build a data set optimized for characterizing hydrologic processes. We then compare the information content in the optimized data set with the information content in a data set optimized using a Jacobian sensitivity analysis.
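The Jacobian-based comparison in the last sentence can be illustrated with a generic greedy D-optimal design heuristic: pick, one at a time, the measurements that most increase the determinant of the information matrix JᵀJ. The Jacobian below is random and purely illustrative of the selection mechanics, not of ERT physics:

```python
import numpy as np

rng = np.random.default_rng(2)
n_meas, n_par = 60, 4
J = rng.normal(size=(n_meas, n_par))   # sensitivity of each candidate
                                       # measurement to each parameter

def logdet_info(rows, eps=1e-6):
    """Log-determinant of the (regularized) information matrix J_S^T J_S."""
    Js = J[list(rows)]
    return np.linalg.slogdet(Js.T @ Js + eps * np.eye(n_par))[1]

# Greedily add the measurement that most increases the information.
chosen = []
for _ in range(8):
    best = max((i for i in range(n_meas) if i not in chosen),
               key=lambda i: logdet_info(chosen + [i]))
    chosen.append(best)

scores = [logdet_info(chosen[:k + 1]) for k in range(len(chosen))]
assert all(a <= b + 1e-9 for a, b in zip(scores, scores[1:]))  # info only grows
print(chosen)
```

In a coupled hydrogeophysical setting, the columns would be hydraulic parameters and each row the sensitivity of one electrode configuration, so the selected set is optimized for the hydrologic target rather than for geophysical imaging.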
Challenges of Developing Design Discharge Estimates with Uncertain Data and Information
NASA Astrophysics Data System (ADS)
Senarath, S. U. S.
2016-12-01
This study focuses on design discharge estimates obtained for gauged basins through flood flow frequency analysis. Bulletin 17B (B17B) guidelines are widely used in the USA for developing these design estimates, which are required for many water resources engineering design applications. The guidelines include options for the treatment of outliers, the use of historical data, and the selection of distribution parameters. These options are provided as a means of accounting for uncertain data and information, primarily in the flow record. The individual as well as the cumulative effects of each of these choices on design discharge estimates are evaluated in this study by using data from several gauges that are part of the United States Geological Survey's Hydro-Climatic Data Network. The results of this study show that despite the availability of rigorous and detailed guidelines for flood frequency analysis, design discharge estimates can still vary substantially from user to user, based on the data and model parameter selection options chosen by each user. Thus, the findings of this study have strong implications for water resources engineers and other professionals who use B17B-based design discharge estimates in their work.
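As context for how such user choices propagate, B17B-style estimates fit a log-Pearson Type III distribution to the logarithms of the annual peaks, with the quantile controlled by a skew-dependent frequency factor. The sketch below uses the Wilson-Hilferty approximation to that factor and hypothetical peak flows; it illustrates only that the chosen skew shifts the design discharge, not the full B17B procedure (weighted skew, outlier tests, historical adjustment):

```python
import math
import statistics

def freq_factor(z, g):
    """Pearson III frequency factor via the Wilson-Hilferty approximation."""
    if abs(g) < 1e-8:
        return z                      # zero skew reduces to the normal quantile
    return (2.0 / g) * ((1.0 + g * z / 6.0 - g * g / 36.0) ** 3 - 1.0)

def lp3_quantile(peaks, z, skew):
    """Log-Pearson III quantile (method of moments on log10 of the peaks)."""
    logs = [math.log10(q) for q in peaks]
    m, s = statistics.mean(logs), statistics.stdev(logs)
    return 10 ** (m + freq_factor(z, skew) * s)

z99 = 2.326   # standard normal quantile for 1% annual exceedance (100-year)
peaks = [820, 1150, 640, 2300, 990, 1500, 760, 1900, 1100, 870]
q_zero_skew = lp3_quantile(peaks, z99, skew=0.0)
q_pos_skew = lp3_quantile(peaks, z99, skew=0.3)
print(round(q_zero_skew), round(q_pos_skew))   # the skew choice moves the estimate
```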
Line-Constrained Camera Location Estimation in Multi-Image Stereomatching.
Donné, Simon; Goossens, Bart; Philips, Wilfried
2017-08-23
Stereomatching is an effective way of acquiring dense depth information from a scene when active measurements are not possible. So-called lightfield methods take a snapshot from many camera locations along a defined trajectory (usually uniformly spaced along a line or on a regular grid; we will assume a linear trajectory) and use this information to compute accurate depth estimates. However, they require the locations for each of the snapshots to be known: the disparity of an object between images is related to both the distance of the camera to the object and the distance between the camera positions for the two images. Existing solutions use sparse feature matching for camera location estimation. In this paper, we propose a novel method that uses dense correspondences to do the same, leveraging an existing depth estimation framework to also yield the camera locations along the line. We illustrate the effectiveness of the proposed technique for camera location estimation both visually, for the rectification of epipolar plane images, and quantitatively, through its effect on the resulting depth estimation. Our proposed approach yields a valid alternative to sparse techniques, while still executing in a reasonable time on a graphics card due to its highly parallelizable nature.
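The disparity relation the abstract leans on is the standard pinhole one, d = f·B/Z: disparity scales with the baseline B between camera positions and inversely with depth Z, which is why errors in the assumed camera locations corrupt the depth estimates. A sketch with hypothetical numbers:

```python
# d = f * B / Z: disparity (pixels) from focal length f (pixels),
# baseline B and depth Z (same length units).
def disparity(f_px, baseline, depth):
    return f_px * baseline / depth

f = 1000.0                         # hypothetical focal length in pixels
print(disparity(f, 0.10, 5.0))     # 20.0 px: 10 cm baseline, 5 m depth
print(disparity(f, 0.20, 5.0))     # 40.0 px: doubling the baseline doubles d
```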
NASA Technical Reports Server (NTRS)
Jones, D. W.
1971-01-01
The navigation and guidance process for the Jupiter, Saturn, and Uranus planetary encounter phases of the 1977 Grand Tour interior mission was simulated. Reference approach navigation accuracies were defined and the relative information content of the various observation types was evaluated. Reference encounter guidance requirements were defined, sensitivities to assumed simulation model parameters were determined, and the adequacy of the linear estimation theory was assessed. A linear sequential estimator was used to provide an estimate of the augmented state vector, consisting of the six state variables of position and velocity plus the three components of a planet position bias. The guidance process was simulated using a nonspherical model of the execution errors. Computation algorithms which simulate the navigation and guidance process were derived from theory and implemented in two research-oriented computer programs, written in FORTRAN.
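The linear sequential estimator referred to above is, in modern terms, a Kalman-style filter; its measurement update can be sketched minimally (a 3-state toy with hypothetical numbers, not the mission's 9-component augmented state):

```python
import numpy as np

# One measurement update of a linear sequential estimator: fold a scalar
# observation z = H x + v into the state estimate x with covariance P.
def update(x, P, z, H, R):
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

x = np.zeros(3)                          # e.g., position components only
P = np.diag([100.0, 100.0, 100.0])       # prior uncertainty
H = np.array([[1.0, 0.0, 0.0]])          # observe the first component
z = np.array([4.0])
R = np.array([[1.0]])

x1, P1 = update(x, P, z, H, R)
print(x1[0], P1[0, 0])                   # estimate moves toward z, variance shrinks
```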
Qomariyah, Siti Nurul; Braunholtz, David; Achadi, Endang L; Witten, Karen H; Pambudi, Eko Setyo; Anggondowati, Trisari; Latief, Kamaluddin; Graham, Wendy J
2010-11-17
The maternal mortality ratio (MMR) remains high in most developing countries. Local, recent estimates of MMR are needed to motivate policymakers and evaluate interventions, but estimating MMR in the absence of vital registration systems is difficult. This paper describes an efficient approach that uses village informant networks to capture maternal death cases (Maternal Deaths from Informants/Maternal Death Follow on Review, or MADE-IN/MADE-FOR), developed to address this gap, and examines its validity and efficiency. MADE-IN used two village informant networks: heads of neighbourhood units (RTs) and health volunteers (kaders). Informants were invited to attend separate network meetings, through the village head for the RTs and through the health centre for the kaders. Attached to the invitation letter was a form with written instructions requesting that informants list deaths of women of reproductive age (WRA) in the village during the previous two years. At a 'listing meeting' the informants' understanding of the form was checked, informants could correct their forms, and a consolidated list was then collectively agreed. MADE-FOR consisted of visits to relatives of likely pregnancy-related deaths (PRDs) identified from MADE-IN, to confirm the PRD status and gather information about the cause of death. Capture-recapture (CRC) analysis enabled estimation of the coverage rates of the two networks, and of total PRDs. The RT network identified a higher proportion of PRDs than the kaders (an estimated 0.85 vs. 0.71), but the latter was easier and cheaper to access. Assigned PRD status amongst identified WRA deaths was more accurate for the kader network, and seemingly for more recent deaths and for deaths from rural areas. Assuming information on live births from an existing source to calculate the MMR, MADE-IN/MADE-FOR cost only $0.1 (US) per woman-year at risk, substantially cheaper than alternatives.
This study shows that reliable local, recent estimates of MMR can be obtained relatively cheaply by using two independent informant networks to identify cases. Neither network captured all PRDs, but capture-recapture analysis allowed self-calibration. The approach does, however, require careful avoidance of false positives and careful matching of cases identified by both networks, which was achieved through the home visits.
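The two-list capture-recapture step can be sketched with the Chapman-corrected Lincoln-Petersen estimator; the counts below are hypothetical, chosen only so the implied capture rates echo the magnitudes reported above:

```python
# Two-list capture-recapture: n1 deaths listed by the RT network, n2 by the
# kaders, m matched on both lists.
def chapman(n1, n2, m):
    """Chapman-corrected Lincoln-Petersen estimate of the total count."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

n1, n2, m = 34, 28, 24          # hypothetical matched PRD case counts
N_hat = chapman(n1, n2, m)
print(round(N_hat, 1))          # estimated total pregnancy-related deaths
print(round(m / n2, 2))         # ~capture rate of the RT network
print(round(m / n1, 2))         # ~capture rate of the kader network
```

The self-calibration mentioned above is exactly this: neither list need be complete, because the overlap between them estimates each list's coverage.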
Simple expression for the quantum Fisher information matrix
NASA Astrophysics Data System (ADS)
Šafránek, Dominik
2018-04-01
Quantum Fisher information matrix (QFIM) is a cornerstone of modern quantum metrology and quantum information geometry. Apart from optimal estimation, it finds applications in description of quantum speed limits, quantum criticality, quantum phase transitions, coherence, entanglement, and irreversibility. We derive a surprisingly simple formula for this quantity, which, unlike previously known general expression, does not require diagonalization of the density matrix, and is provably at least as efficient. With a minor modification, this formula can be used to compute QFIM for any finite-dimensional density matrix. Because of its simplicity, it could also shed more light on the quantum information geometry in general.
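To see why avoiding diagonalization is attractive, note that the QFIM can be obtained from the symmetric logarithmic derivatives (SLDs), which solve the Sylvester equation ρL_j + L_jρ = 2∂_jρ; vectorizing turns this into a single linear solve. The sketch below takes that route (in the same spirit as, though not necessarily identical to, the paper's closed-form expression) and checks it against the known analytic QFIM of a one-qubit family:

```python
import numpy as np

def qfim(rho, drhos):
    """QFIM via the SLD Sylvester equation rho@L + L@rho = 2*drho,
    vectorized (row-stacking) into one linear solve; no diagonalization."""
    d = rho.shape[0]
    M = np.kron(rho, np.eye(d)) + np.kron(np.eye(d), rho.conj())
    vecs = [dr.reshape(-1) for dr in drhos]            # vec of each d_j rho
    Ls = [np.linalg.solve(M, 2.0 * v) for v in vecs]   # vec of each SLD
    return np.array([[np.real(np.vdot(vi, Lj)) for Lj in Ls] for vi in vecs])

# Check on rho = (I + r.sigma)/2 with |r| = a, where analytically
# F = diag(a^2, a^2 * sin^2(t1)) for the two Bloch angles (t1, t2).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
half = lambda v: 0.5 * (v[0] * sx + v[1] * sy + v[2] * sz)

a, t1, t2 = 0.8, 0.7, 0.3
r = a * np.array([np.sin(t1) * np.cos(t2), np.sin(t1) * np.sin(t2), np.cos(t1)])
rho = 0.5 * np.eye(2) + half(r)
d1 = half(a * np.array([np.cos(t1) * np.cos(t2), np.cos(t1) * np.sin(t2), -np.sin(t1)]))
d2 = half(a * np.array([-np.sin(t1) * np.sin(t2), np.sin(t1) * np.cos(t2), 0.0]))

F = qfim(rho, [d1, d2])
print(np.round(F, 6))
```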
An attempt to estimate students' workload.
Pogacnik, M; Juznic, P; Kosorok-Drobnic, M; Pogacnik, A; Cestnik, V; Kogovsek, J; Pestevsek, U; Fernandes, Tito
2004-01-01
Following the recent introduction of the European Credit Transfer System (ECTS) into several European university programs, a new interest has developed in determining students' workload. ECTS credits are numerical values describing the student workload required to complete course units; ECTS has the potential to facilitate comparison and create transparency between institutional curricula. ECTS credits are frequently listed alongside institutional credits in course outlines and module summaries. Measuring student workload has been difficult; to a large extent, estimates are based only upon anecdotal and casual information. To gather more systematic information, we asked students at the Veterinary Faculty, University of Ljubljana, to estimate the actual total workload they committed to fulfill their coursework obligations for specific subjects in the veterinary degree program by reporting their attendance at defined contact hours and their estimated time for outside study, including the time required for examinations and other activities. Students also reported the final grades they received for these subjects. The results show that certain courses require much more work than others, independent of credit unit assignment. Generally, the courses with more contact hours tend also to demand more independent work; the best predictor of both actual student workload and student success is the amount of contact time in which they participate. The data failed to show any strong connection between students' total workload and grades they received; rather, they showed some evidence that regular presence at contact hours was the most positive influence on grades. Less frequent presence at lectures tended to indicate less time spent on independent study. It was also found that pre-clinical and clinical courses tended to require more work from students than other, more general subjects. 
While the present study does not provide conclusive evidence, it does indicate the need for further inquiry into the nature of the relationship between teaching and learning in higher education and for evaluation of the benefits (or otherwise) of more "self-directed" study.
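For reference, the ECTS bookkeeping behind such workload comparisons is simple: one credit conventionally corresponds to roughly 25-30 hours of total workload. A sketch with hypothetical course figures:

```python
# ECTS convention: one credit corresponds to roughly 25-30 hours of total
# workload (contact hours plus independent study); 27.5 h is the midpoint.
def ects_credits(total_hours, hours_per_credit=27.5):
    return total_hours / hours_per_credit

contact, independent = 60, 105   # hypothetical course figures
print(round(ects_credits(contact + independent), 1))  # 6.0
```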
Department of Defense Base Structure Report for Fiscal Year 1991. Supplement
1990-08-01
replacement value is estimated at $600 billion. Defense installations and properties range from unmanned navigational aid stations of less than a half... REPORTING REQUIREMENT: The Base Structure Report is prepared by the Department of Defense to (a) provide information on military installations, (b) ... support costs and evaluate possible alternatives to reduce such costs. A written report on DoD base structure is required to be submitted annually
ERIC Educational Resources Information Center
Badger, Elizabeth
1992-01-01
Explains a set of processes that teachers might use to structure their evaluation of students' learning and understanding. Illustrates the processes of setting goals, deciding what to assess, gathering information, and using the results through a measurement task requiring students to estimate the number of popcorn kernels in a container. (MDH)
Assessing the Social Influence of Television: A Social Cognition Perspective on Cultivation Effects.
ERIC Educational Resources Information Center
Shrum, L. J.
1995-01-01
Uses an information-processing perspective to illustrate how television viewing may affect social judgments. Posits heuristic processing as a mechanism that can explain why heavier television viewing results in higher first-order cultivation judgments (those requiring estimates of set size). (SR)
40 CFR Appendix D to Part 60 - Required Emission Inventory Information
Code of Federal Regulations, 2010 CFR
2010-07-01
... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...
40 CFR Appendix D to Part 60 - Required Emission Inventory Information
Code of Federal Regulations, 2011 CFR
2011-07-01
... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...
40 CFR Appendix D to Part 60 - Required Emission Inventory Information
Code of Federal Regulations, 2012 CFR
2012-07-01
... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...
40 CFR Appendix D to Part 60 - Required Emission Inventory Information
Code of Federal Regulations, 2013 CFR
2013-07-01
... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...
40 CFR Appendix D to Part 60 - Required Emission Inventory Information
Code of Federal Regulations, 2014 CFR
2014-07-01
... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...
75 FR 47583 - Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... as soon as possible. The Desk Officer may be telephoned at 202-395-4650. ADDRESSES: Written comments... Program; and for measuring attainment of DOE's program goals as required by the Government Performance and... respondents are homeowners and former college students as described in the SUMMARY; (5) Annual Estimated...
78 FR 54729 - Reports, Forms, and Record Keeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... information: Title--NHTSA Distracted Driving Survey Project. Type of Request--Revision of previously approved... region, age, and gender. The National Survey on Distracted Driving Attitudes and Behaviors (NSDDAB) will... driving behaviors. The estimated average amount of time to complete the survey is 20 minutes. This...
75 FR 55629 - Reports, Forms, and Recordkeeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... technologies. The findings from this proposed collection of information will assist NHTSA in designing... interview. Prior to the administration of the survey, a total of 15 pretest interviews, averaging 20 minutes... result of the pretest, the Contractor would begin the main survey administration. Estimate of the Total...
75 FR 71423 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-23
... Review: Revision. Title of Collection: Federal Perkins Loan Program and General Provision Regulations... Estimated Number of Annual Burden Hours: 133,520. Abstract: Under the Federal Perkins Loan Program... provide for the making and servicing of Perkins Loans. If the Department did not require the collection...
76 FR 27671 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... limit their business to the sale and redemption of securities of registered investment companies and... required to register only because they effect transactions in securities futures products. The information... securities business or do not hold inventories of securities. For these reasons, the staff estimates that the...
40 CFR 63.53 - Application content for case-by-case MACT determinations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... identified emission point or group of affected emission points, an identification of control technology in... on the design, operation, size, estimated control efficiency and any other information deemed... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...
40 CFR 63.53 - Application content for case-by-case MACT determinations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... identified emission point or group of affected emission points, an identification of control technology in... on the design, operation, size, estimated control efficiency and any other information deemed... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...
40 CFR 63.53 - Application content for case-by-case MACT determinations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... identified emission point or group of affected emission points, an identification of control technology in... on the design, operation, size, estimated control efficiency and any other information deemed... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...
40 CFR 63.53 - Application content for case-by-case MACT determinations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... identified emission point or group of affected emission points, an identification of control technology in... on the design, operation, size, estimated control efficiency and any other information deemed... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...
Toto, Tami; Jensen, Michael; Bartholomew, Mary Jane
2012-09-22
The Navigation Best Estimate (NAVBE) VAP was developed in response to the 2012-2013 Marine ARM GPCI Investigation of Clouds (MAGIC) deployment, the first ship-based deployment of the second ARM Mobile Facility (AMF2). It has since been applied to the 2015 ARM Cloud Aerosol Precipitation EXperiment (ACAPEX) deployment. A number of different instruments on the ships collected Global Positioning System (GPS) and Inertial Navigation System (INS) measurements during the MAGIC campaign. The motivation of the NAVBE VAP is to consolidate these many different sources of information into a single, continuous datastream that can be used whenever information is required about ship location and orientation, providing a more complete estimate than would be available from any one instrument. The result is 10-Hz and 1-min data streams reporting ship position and attitude.
Improving carbon monitoring and reporting in forests using spatially-explicit information.
Boisvenue, Céline; Smiley, Byron P; White, Joanne C; Kurz, Werner A; Wulder, Michael A
2016-12-01
Understanding and quantifying carbon (C) exchanges between the biosphere and the atmosphere, specifically the process of C removal from the atmosphere and how this process is changing, is the basis for developing appropriate adaptation and mitigation strategies for climate change. Monitoring forest systems and reporting on greenhouse gas (GHG) emissions and removals are now required components of international efforts aimed at mitigating rising atmospheric GHG. Spatially-explicit information about forests can improve estimates of GHG emissions and removals. However, at present, remotely-sensed information on forest change is not commonly integrated into GHG reporting systems. New, detailed (30-m spatial resolution) forest change products derived from satellite time series, informing on the location, magnitude, and type of change at an annual time step, have recently become available. Here we estimate the forest GHG balance using these new Landsat-based change data, a spatial forest inventory, and newly developed yield curves as inputs to the Carbon Budget Model of the Canadian Forest Sector (CBM-CFS3), producing GHG emission and removal estimates at 30-m resolution for a 13 Mha pilot area in Saskatchewan, Canada. Our results depict the forests as a cumulative C sink (17.98 Tg C, or 0.64 Tg C year⁻¹) between 1984 and 2012, with an average C density of 206.5 (±0.6) Mg C ha⁻¹. Comparisons between our estimates and estimates from Canada's National Forest Carbon Monitoring, Accounting and Reporting System (NFCMARS) were possible only on a subset of our study area. In our simulations the area was a C sink, while in the official reporting simulations it was a C source. Forest area and overall C stock estimates also differ between the two sets of simulated estimates. Both estimates have similar uncertainties, but the spatially-explicit results we present here better quantify the potential improvement brought on by spatially-explicit modelling.
We discuss the source of the differences between these estimates. This study represents an important first step towards the integration of spatially-explicit information into Canada's NFCMARS.
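The annualized figure quoted above follows directly from the cumulative sink over the 28 annual steps spanned by 1984-2012:

```python
cumulative_TgC = 17.98                   # cumulative C sink, 1984-2012
years = 2012 - 1984                      # 28 annual time steps
print(round(cumulative_TgC / years, 2))  # 0.64 Tg C per year
```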
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
...We (U.S. Fish and Wildlife Service) have sent an Information Collection Request (ICR) to OMB for review and approval. We summarize the ICR below and describe the nature of the collection and the estimated burden and cost. This information collection is scheduled to expire on November 30, 2013. We may not conduct or sponsor and a person is not required to respond to a collection of information unless it displays a currently valid OMB control number. However, under OMB regulations, we may continue to conduct or sponsor this information collection while it is pending at OMB.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...We (U.S. Fish and Wildlife Service) have sent an Information Collection Request (ICR) to OMB for review and approval. We summarize the ICR below and describe the nature of the collection and the estimated burden and cost. This information collection is scheduled to expire on March 31, 2013. We may not conduct or sponsor and a person is not required to respond to a collection of information unless it displays a currently valid OMB control number. However, under OMB regulations, we may continue to conduct or sponsor this information collection while it is pending at OMB.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
...We (U.S. Fish and Wildlife Service) have sent an Information Collection Request (ICR) to OMB for review and approval. We summarize the ICR below and describe the nature of the collection and the estimated burden and cost. This information collection is scheduled to expire on December 31, 2013. We may not conduct or sponsor and a person is not required to respond to a collection of information unless it displays a currently valid OMB control number. However, under OMB regulations, we may continue to conduct or sponsor this information collection while it is pending at OMB.
NASA Astrophysics Data System (ADS)
Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.
2015-04-01
Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth through empirical crop coefficients that adapt evapotranspiration throughout the vegetation period. We investigate the importance of model structural versus model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural uncertainty among the reference evapotranspiration models is far more important than the parametric uncertainty introduced by the crop coefficients, which are used to estimate irrigation water requirements following the single crop coefficient approach. Using the reliability ensemble averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit of 400 mm due to water rights, would be exceeded less frequently under the REA ensemble average (45%) than under the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
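The reliability ensemble averaging (REA) step described above can be sketched as an iterative reweighting of ensemble members, down-weighting models that sit far from the weighted ensemble mean. This is a generic illustration of the REA idea with made-up numbers, not the SPARE:WATER implementation:

```python
import numpy as np

def rea_average(preds, tol=1e-6, max_iter=100):
    """Reliability ensemble averaging (simplified sketch).

    Iteratively weights each model's prediction by the inverse of its
    distance from the current weighted ensemble mean, so outlying
    models contribute less. `preds` has shape (n_models, n_samples).
    """
    mean = preds.mean(axis=0)  # start from the equally weighted mean
    for _ in range(max_iter):
        dist = np.abs(preds - mean).mean(axis=1)   # per-model error proxy
        w = 1.0 / np.maximum(dist, 1e-12)          # inverse-distance weights
        w /= w.sum()
        new_mean = (w[:, None] * preds).sum(axis=0)
        converged = np.abs(new_mean - mean).max() < tol
        mean = new_mean
        if converged:
            break
    return mean, w
```

With, say, three models predicting ~400 mm of irrigation water requirement and one outlier near 600 mm, the outlier receives the smallest weight and the REA mean stays close to the cluster, mirroring how REA pulled the exceedance frequency below that of the equally weighted average.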
NASA Technical Reports Server (NTRS)
Sheppard, Albert P.; Wood, Joan M.
1976-01-01
Candidate experiments designed for the space shuttle transportation system and the long duration exposure facility are summarized. The data format covers: experiment title, experimenter, technical abstract, benefits/justification, technical discussion of experiment approach and objectives, related work and experience, experiment facts, space properties used, environmental constraints, shielding requirements (if any), physical description, and a sketch of major elements. Information is also included on experiment hardware, research required to develop the experiment, special requirements, cost estimate, safety considerations, and interactions with the spacecraft and other experiments.
Occupancy models to study wildlife
Bailey, Larissa; Adams, Michael John
2005-01-01
Many wildlife studies seek to understand changes or differences in the proportion of sites occupied by a species of interest. These studies are hampered by imperfect detection of these species, which can result in some sites appearing to be unoccupied that are actually occupied. Occupancy models solve this problem and produce unbiased estimates of occupancy and related parameters. Required data (detection/non-detection information) are relatively simple and inexpensive to collect. Software is available free of charge to aid investigators in occupancy estimation.
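The occupancy-model idea, separating the probability that a site is occupied (psi) from the probability of detecting the species given occupancy (p), can be sketched with a brute-force maximum-likelihood fit over detection histories. The grid-search estimator below is an illustrative toy, not the dedicated software the abstract mentions:

```python
import numpy as np
from itertools import product

def occupancy_mle(histories):
    """Grid-search MLE for a single-season occupancy model (sketch).

    `histories` is a list of per-site detection histories (0/1 per survey).
    A site with at least one detection is certainly occupied; an all-zero
    history may mean either "unoccupied" or "occupied but never detected".
    Returns (psi_hat, p_hat): occupancy and detection probability.
    """
    grid = np.linspace(0.01, 0.99, 99)
    best = (None, None, -np.inf)
    for psi, p in product(grid, grid):
        ll = 0.0
        for h in histories:
            K, d = len(h), sum(h)
            if d > 0:
                lik = psi * p**d * (1 - p)**(K - d)
            else:
                # mixture: occupied-but-missed, or truly unoccupied
                lik = psi * (1 - p)**K + (1 - psi)
            ll += np.log(lik)
        if ll > best[2]:
            best = (psi, p, ll)
    return best[0], best[1]
```

Because the all-zero history gets partial credit for "occupied but missed", the estimate of psi exceeds the naive fraction of sites with detections, which is exactly the bias correction the abstract describes.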
A preliminary estimate of future communications traffic for the electric power system
NASA Technical Reports Server (NTRS)
Barnett, R. M.
1981-01-01
Diverse new generation technologies that use renewable energy, and measures to improve operational efficiency throughout existing electric power systems, are presented. A model utility is described, and the information transfer requirements imposed by incorporation of dispersed storage and generation technologies and by implementation of more extensive energy management are estimated. An example of possible traffic for an assumed system is provided, together with an approach that can be applied to other systems, control configurations, or dispersed storage and generation penetrations.
Fuel Burn Estimation Using Real Track Data
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.
2011-01-01
A procedure for estimating fuel burned based on actual flight track data, and drag and fuel-flow models is described. The procedure consists of estimating aircraft and wind states, lift, drag and thrust. Fuel-flow for jet aircraft is determined in terms of thrust, true airspeed and altitude as prescribed by the Base of Aircraft Data fuel-flow model. This paper provides a theoretical foundation for computing fuel-flow with most of the information derived from actual flight data. The procedure does not require an explicit model of thrust and calibrated airspeed/Mach profile which are typically needed for trajectory synthesis. To validate the fuel computation method, flight test data provided by the Federal Aviation Administration were processed. Results from this method show that fuel consumed can be estimated within 1% of the actual fuel consumed in the flight test. Next, fuel consumption was estimated with simplified lift and thrust models. Results show negligible difference with respect to the full model without simplifications. An iterative takeoff weight estimation procedure is described for estimating fuel consumption, when takeoff weight is unavailable, and for establishing fuel consumption uncertainty bounds. Finally, the suitability of using radar-based position information for fuel estimation is examined. It is shown that fuel usage could be estimated within 5.4% of the actual value using positions reported in the Airline Situation Display to Industry data with simplified models and iterative takeoff weight computation.
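The Base of Aircraft Data (BADA) fuel-flow form referenced above models thrust-specific fuel consumption for a jet as linear in true airspeed, so fuel flow follows directly from estimated thrust. A minimal sketch, with placeholder coefficients rather than values from any actual BADA aircraft file:

```python
def fuel_flow_kg_per_min(thrust_kN, tas_kt, cf1=0.75, cf2=900.0):
    """Nominal jet fuel flow in the BADA form.

    Thrust-specific fuel consumption (tsfc) scales linearly with true
    airspeed: tsfc = cf1 * (1 + V_TAS / cf2), in kg per minute per kN.
    cf1 [kg/(min*kN)] and cf2 [kt] are aircraft-specific BADA
    coefficients; the defaults here are illustrative placeholders only.
    """
    tsfc = cf1 * (1.0 + tas_kt / cf2)
    return tsfc * thrust_kN   # kg of fuel per minute
```

In the procedure described, thrust is itself estimated from the flight track (via aircraft and wind states, lift, and drag), so this last step converts the estimated thrust time series into fuel burned by integrating over time.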
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holden, Jacob; Wood, Eric W; Zhu, Lei
A data-driven technique for estimating the energy requirements of a proposed vehicle trip has been developed. Based on over 700,000 miles of driving data, the technique has been applied to generate a model that estimates trip energy requirements. The model uses a novel binning approach to categorize driving by road type, traffic conditions, and driving profile. The trip-level energy estimates can easily be aggregated to any higher-level transportation system network desired. The model has been tested and validated on the Austin, Texas, data set used to build it. Ground-truth energy consumption for the data set was obtained from Future Automotive Systems Technology Simulator (FASTSim) vehicle simulation results. The energy estimation model has demonstrated 12.1 percent normalized total absolute error. The energy estimates from the model can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations, to reduce energy consumption. The model can also be used to determine more accurate energy consumption of regional or national transportation networks if trip origins and destinations are known. Additionally, this method allows the estimation tool to be tuned to a specific driver or vehicle type.
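The binning approach described can be sketched as a lookup table of mean energy rates keyed by road type and speed bin, built from historical driving data and then queried per trip segment. The bin definitions, features, and fallback behavior here are illustrative assumptions, not the actual model:

```python
import numpy as np

def build_energy_model(road_type, speed, energy_per_mile, n_speed_bins=2):
    """Bin historical driving by road type and mean speed; store the
    mean energy rate (e.g. kWh/mile) observed in each bin."""
    edges = np.quantile(speed, np.linspace(0, 1, n_speed_bins + 1))
    bins = np.clip(np.digitize(speed, edges[1:-1]), 0, n_speed_bins - 1)
    table = {}
    for rt in set(road_type):
        for b in range(n_speed_bins):
            mask = (np.asarray(road_type) == rt) & (bins == b)
            if mask.any():
                table[(rt, b)] = float(np.asarray(energy_per_mile)[mask].mean())
    return edges, table

def estimate_trip_energy(edges, table, segments):
    """segments: list of (road_type, mean_speed, miles). Unseen bins
    fall back to the global mean rate."""
    n = len(edges) - 1
    fallback = float(np.mean(list(table.values())))
    total = 0.0
    for rt, v, miles in segments:
        b = int(np.clip(np.digitize(v, edges[1:-1]), 0, n - 1))
        total += table.get((rt, b), fallback) * miles
    return total
```

Aggregating `estimate_trip_energy` over all trips in a network gives the higher-level totals the abstract mentions, and refitting the table on one driver's data is what "tuning to a specific driver" would look like in this sketch.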
Data vs. information: A system paradigm
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1982-01-01
The data system designer requires data parameters and is dependent on the user to convert information needs into these data parameters. This conversion will be done with more or less accuracy, beginning a chain of inaccuracies that propagate through the system and that, in the end, may prevent the user from converting the data received into the information required. The concept to be pursued is that errors occur in various parts of the system and, having occurred, propagate to the end. Modeling the system may allow an estimation of the effects at any point and of the final accumulated effect, and may provide a method of allocating an error budget among the system components. The various technical parameters that a data system must meet must be selected in relation to the ability of the user to turn the cold, impersonal data into a live, personal decision or piece of information.
Cost estimation and analysis using the Sherpa Automated Mine Cost Engineering System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stebbins, P.E.
1993-09-01
The Sherpa Automated Mine Cost Engineering System is a menu-driven software package designed to estimate capital and operating costs for proposed surface mining operations. The program is engineering-based (as opposed to statistically based), meaning that all equipment, manpower, and supply requirements are determined from deposit geology, project design, and mine production information using standard engineering techniques. These requirements are used in conjunction with equipment, supply, and labor cost databases internal to the program to estimate all associated costs. Because virtually all on-site cost parameters are interrelated within the program, Sherpa provides an efficient means of examining the impact of changes in the equipment mix on total capital and operating costs. If any aspect of the operation is changed, Sherpa immediately adjusts all related aspects as necessary. For instance, if the user wishes to examine the cost ramifications of selecting larger trucks, the program not only considers truck purchase and operation costs, it also automatically and immediately adjusts excavator requirements, operator and mechanic needs, repair facility size, haul road construction and maintenance costs, and ancillary equipment specifications.
NASA Astrophysics Data System (ADS)
Lumme, E.; Pomoell, J.; Kilpua, E. K. J.
2017-12-01
Estimates of the photospheric magnetic, electric, and plasma velocity fields are essential for studying the dynamics of the solar atmosphere, for example through the derivative quantities of Poynting and relative helicity flux, and for using the fields to obtain the lower boundary condition for data-driven coronal simulations. In this paper we study the performance of a data processing and electric field inversion approach that requires only high-resolution and high-cadence line-of-sight or vector magnetograms, which we obtain from the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). The approach does not require any photospheric velocity estimates; the missing velocity information is compensated for using ad hoc assumptions. We show that the free parameters of these assumptions can be optimized to reproduce the time evolution of the total magnetic energy injection through the photosphere in NOAA AR 11158, when compared to recent state-of-the-art estimates for this active region. However, we find that the relative magnetic helicity injection is reproduced poorly, reaching at best a modest underestimation. We also discuss the effect of some data processing details on the results, including the masking of noise-dominated pixels and the tracking method of the active region, neither of which has received much attention in the literature so far. In most cases the effect of these details is small, but when the optimization of the free parameters of the ad hoc assumptions is considered, consistent use of the noise mask is required. The results found in this paper imply that the data processing and electric field inversion approach, which uses only photospheric magnetic field information, offers a flexible and straightforward way to obtain photospheric magnetic and electric field estimates suitable for practical applications such as coronal modeling studies.
Lawn, Joy E; Bianchi-Jassir, Fiorella; Russell, Neal J; Kohli-Lynch, Maya; Tann, Cally J; Hall, Jennifer; Madrid, Lola; Baker, Carol J; Bartlett, Linda; Cutland, Clare; Gravett, Michael G; Heath, Paul T; Ip, Margaret; Le Doare, Kirsty; Madhi, Shabir A; Rubens, Craig E; Saha, Samir K; Schrag, Stephanie; Sobanjo-Ter Meulen, Ajoke; Vekemans, Johan; Seale, Anna C
2017-11-06
Improving maternal, newborn, and child health is central to Sustainable Development Goal targets for 2030, requiring acceleration especially to prevent 5.6 million deaths around the time of birth. Infections contribute to this burden, but etiological data are limited. Group B Streptococcus (GBS) is an important perinatal pathogen, although previously focus has been primarily on liveborn children, especially early-onset disease. In this first of an 11-article supplement, we discuss the following: (1) Why estimate the worldwide burden of GBS disease? (2) What outcomes of GBS in pregnancy should be included? (3) What data and epidemiological parameters are required? (4) What methods and models can be used to transparently estimate this burden of GBS? (5) What are the challenges with available data? and (6) How can estimates address data gaps to better inform GBS interventions including maternal immunization? We review all available GBS data worldwide, including maternal GBS colonization, risk of neonatal disease (with/without intrapartum antibiotic prophylaxis), maternal GBS disease, neonatal/infant GBS disease, and subsequent impairment, plus GBS-associated stillbirth, preterm birth, and neonatal encephalopathy. We summarize our methods for searches, meta-analyses, and modeling including a compartmental model. Our approach is consistent with the World Health Organization (WHO) Guidelines for Accurate and Transparent Health Estimates Reporting (GATHER), published in The Lancet and the Public Library of Science (PLoS). We aim to address priority epidemiological gaps highlighted by WHO to inform potential maternal vaccination. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
A History-based Estimation for LHCb job requirements
NASA Astrophysics Data System (ADS)
Rauschmayr, Nathalie
2015-12-01
The main goal of a Workload Management System (WMS) is to find and allocate resources for given tasks. The more and better job information the WMS receives, the easier it will be to accomplish its task, which directly translates into higher utilization of resources. Traditionally, the information associated with each job, such as expected runtime, is defined beforehand by the Production Manager in the best case, and set to fixed arbitrary values by default. LHCb's Workload Management System provides no mechanisms that automate the estimation of job requirements. As a result, much more CPU time is normally requested than actually needed. This presents a major problem particularly in the context of multicore jobs, since single- and multicore jobs shall share the same resources. Consequently, grid sites need to rely on estimations given by the VOs in order not to decrease the utilization of their worker nodes when making multicore job slots available. The main reason for moving to multicore jobs is the reduction of the overall memory footprint; therefore, it also needs to be studied how the memory consumption of jobs can be estimated. A detailed workload analysis of past LHCb jobs is presented. It includes a study of job features and their correlation with runtime and memory consumption. Based on these features, a supervised learning algorithm is developed using history-based prediction. The aim is to learn over time how jobs' runtime and memory consumption evolve in response to changes in experiment conditions and software versions. It is shown that estimation can be notably improved if experiment conditions are taken into account.
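The history-based prediction idea can be sketched as a per-feature-combination running average of past runtimes, falling back to a global mean for feature combinations not yet seen. This is a deliberately simplified stand-in for the supervised learning algorithm developed in the paper; the feature names are hypothetical:

```python
from collections import defaultdict

class HistoryEstimator:
    """Predict job runtime (or memory) from past jobs with the same
    feature combination, e.g. (application version, experiment
    conditions). Unseen combinations fall back to the global mean."""

    def __init__(self):
        self.sums = defaultdict(float)
        self.counts = defaultdict(int)
        self.total, self.n = 0.0, 0

    def observe(self, features, runtime):
        # record a finished job's actual runtime under its feature key
        self.sums[features] += runtime
        self.counts[features] += 1
        self.total += runtime
        self.n += 1

    def predict(self, features):
        if self.counts[features]:
            return self.sums[features] / self.counts[features]
        return self.total / self.n if self.n else 0.0
```

Keying the history on experiment conditions and software version is what lets the estimator track the drifts the abstract describes, instead of requesting a single fixed CPU-time value for every job.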
Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections.
Fisher, Jason T; Heim, Nicole; Code, Sandra; Paczkowski, John
2016-01-01
Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of the grizzly bear's range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error, which arises when a visiting bear fails to leave a hair sample, has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping over 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best, occupancy was underestimated by 50%; at worst, by 95%. The probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections of extant bears. The implications of missed detections and biased occupancy estimates for density estimation, which forms the crux of management plans, require consideration.
We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based.
Discrete-to-continuous transition in quantum phase estimation
NASA Astrophysics Data System (ADS)
Rządkowski, Wojciech; Demkowicz-Dobrzański, Rafał
2017-09-01
We analyze the problem of quantum phase estimation in which the set of allowed phases forms a discrete N-element subset of the whole [0, 2π] interval, φ_n = 2πn/N, n = 0, …, N−1, and study the discrete-to-continuous transition N → ∞ for various cost functions as well as the mutual information. We also analyze the relation between the problems of phase discrimination and estimation by considering a step cost function of a given width σ around the true estimated value. We show that in general a direct application of the theory of covariant measurements for a discrete subgroup of the U(1) group leads to suboptimal strategies due to an implicit requirement of estimating only the phases that appear in the prior distribution. We develop the theory of subcovariant measurements to remedy this situation and demonstrate truly optimal estimation strategies when performing a transition from discrete to continuous phase estimation.
NASA Astrophysics Data System (ADS)
Tomita, H.; Hihara, T.; Kubota, M.
2018-01-01
Near-surface air specific humidity is a key variable in the estimation of air-sea latent heat flux and evaporation from the ocean surface. Accurate estimation over the global ocean is required for studies on global climate, air-sea interactions, and water cycles. Current remote sensing techniques are problematic and a major source of error in flux and evaporation estimates. Here we propose a new method to estimate surface humidity using satellite microwave radiometer instruments, based on a new finding about the relationship between multichannel brightness temperatures measured by satellite sensors, surface humidity, and the vertical moisture structure. Satellite estimates obtained with the new method were compared with in situ observations to evaluate the method, confirming that it can significantly improve satellite estimates, with high impact on satellite estimation of latent heat flux. We recommend the adoption of this method for any satellite microwave radiometer observations.
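The core retrieval idea, relating near-surface specific humidity to multichannel brightness temperatures, can be sketched as an ordinary least-squares fit. The channel count and synthetic data below are illustrative assumptions; the paper's method additionally exploits vertical moisture-structure information beyond a plain regression:

```python
import numpy as np

def fit_qa_model(tb, qa):
    """Least-squares fit of near-surface specific humidity `qa` (n,)
    on multichannel brightness temperatures `tb` (n, channels),
    with an intercept term. Returns the coefficient vector."""
    X = np.column_stack([np.ones(len(tb)), tb])
    coef, *_ = np.linalg.lstsq(X, qa, rcond=None)
    return coef

def predict_qa(coef, tb):
    """Apply a fitted coefficient vector to new brightness temperatures."""
    X = np.column_stack([np.ones(len(tb)), tb])
    return X @ coef
```

In practice such coefficients would be trained against in situ (e.g. buoy) humidity observations, which is also how the abstract's evaluation against in situ data would proceed.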
Inertia Estimation of Spacecraft Based on Modified Law of Conservation of Angular Momentum
NASA Astrophysics Data System (ADS)
Kim, Dong Hoon; Choi, Dae-Gyun; Oh, Hwa-Suk
2010-12-01
In general, information on inertia properties is required to control a spacecraft. The inertia properties are changed by activities such as consumption of propellant, deployment of solar panels, sloshing, etc. Extensive estimation methods have been investigated to obtain precise inertia properties. Gyro-based attitude data, which include noise and bias, need to be compensated to improve attitude control accuracy. A modified estimation method based on the law of conservation of angular momentum is suggested to avoid inconveniences such as a filtering process for noise-effect compensation. The conventional method is modified, and the previously estimated moments of inertia are applied to improve the estimation efficiency for the products of inertia. The performance of the suggested method has been verified for the case of STSAT-3, the Korea Science and Technology Satellite.
Generalized Full-Information Item Bifactor Analysis
Cai, Li; Yang, Ji Seung; Hansen, Mark
2011-01-01
Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of multidimensional item response theory models for an arbitrary mixing of dichotomous, ordinal, and nominal items. The extended item bifactor model also enables the estimation of latent variable means and variances when data from more than one group are present. Generalized user-defined parameter restrictions are permitted within or across groups. We derive an efficient full-information maximum marginal likelihood estimator. Our estimation method achieves substantial computational savings by extending Gibbons and Hedeker’s (1992) bifactor dimension reduction method so that the optimization of the marginal log-likelihood only requires two-dimensional integration regardless of the dimensionality of the latent variables. We use simulation studies to demonstrate the flexibility and accuracy of the proposed methods. We apply the model to study cross-country differences, including differential item functioning, using data from a large international education survey on mathematics literacy. PMID:21534682
77 FR 75971 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
...: National Oceanic and Atmospheric Administration (NOAA). Title: Economic Value of Puerto Rico's Coral Reef... market and non-market economic values of Puerto Rico's coral reef ecosystems. Estimates will be made for...'s coral reef ecosystems. The required information is to conduct focus groups to help in designing...
USDA-ARS?s Scientific Manuscript database
Data assimilation and regression are two commonly used methods for predicting agricultural yield from remote sensing observations. Data assimilation is a generative approach because it requires explicit approximations of the Bayesian prior and likelihood to compute the probability density function...
14 CFR 420.27 - Launch site location review-information requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... corridor, and each impact range and impact dispersion area for each launch point; (b) Each launch vehicle... the analysis; (f) Each populated area located within a flight corridor or impact dispersion area; (g) The estimated casualty expectancy calculated for each populated area within a flight corridor or...
14 CFR 420.27 - Launch site location review-information requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... corridor, and each impact range and impact dispersion area for each launch point; (b) Each launch vehicle... the analysis; (f) Each populated area located within a flight corridor or impact dispersion area; (g) The estimated casualty expectancy calculated for each populated area within a flight corridor or...
14 CFR 420.27 - Launch site location review-information requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... corridor, and each impact range and impact dispersion area for each launch point; (b) Each launch vehicle... the analysis; (f) Each populated area located within a flight corridor or impact dispersion area; (g) The estimated casualty expectancy calculated for each populated area within a flight corridor or...
14 CFR 420.27 - Launch site location review-information requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... corridor, and each impact range and impact dispersion area for each launch point; (b) Each launch vehicle... the analysis; (f) Each populated area located within a flight corridor or impact dispersion area; (g) The estimated casualty expectancy calculated for each populated area within a flight corridor or...
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Benzene concentration. An estimate of the average gasoline benzene concentration corresponding to the time... engineering and permitting, Procurement and Construction, and Commissioning and startup. (7) Basic information regarding the selected technology pathway for compliance (e.g., precursor re-routing or other technologies...
40 CFR 98.187 - Records that must be retained.
Code of Federal Regulations, 2013 CFR
2013-07-01
... carbon mass balance procedure is used to determine process CO2 emissions according to the requirements in... subpart (tons). (5) Average carbon content determined and records of the supplier provided information or... of how company records of measurements are used to estimate the carbon input to each smelting furnace...
40 CFR 98.187 - Records that must be retained.
Code of Federal Regulations, 2014 CFR
2014-07-01
... carbon mass balance procedure is used to determine process CO2 emissions according to the requirements in... subpart (tons). (5) Average carbon content determined and records of the supplier provided information or... of how company records of measurements are used to estimate the carbon input to each smelting furnace...
40 CFR 98.187 - Records that must be retained.
Code of Federal Regulations, 2010 CFR
2010-07-01
... carbon mass balance procedure is used to determine process CO2 emissions according to the requirements in... subpart (tons). (5) Average carbon content determined and records of the supplier provided information or... of how company records of measurements are used to estimate the carbon input to each smelting furnace...
40 CFR 98.187 - Records that must be retained.
Code of Federal Regulations, 2012 CFR
2012-07-01
... carbon mass balance procedure is used to determine process CO2 emissions according to the requirements in... subpart (tons). (5) Average carbon content determined and records of the supplier provided information or... of how company records of measurements are used to estimate the carbon input to each smelting furnace...
40 CFR 98.187 - Records that must be retained.
Code of Federal Regulations, 2011 CFR
2011-07-01
... carbon mass balance procedure is used to determine process CO2 emissions according to the requirements in... subpart (tons). (5) Average carbon content determined and records of the supplier provided information or... of how company records of measurements are used to estimate the carbon input to each smelting furnace...
76 FR 12367 - Proposed Information Collection; Visibility Valuation Survey Pilot Study
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... Survey Pilot Study AGENCY: National Park Service, U.S. Department of the Interior. ACTION: Notice... Code of Federal Regulations). Updated estimates of visibility benefits are required because the studies... a pilot study to test the survey instrument and implementation procedures prior to the full survey...
ERIC Educational Resources Information Center
Hardy, Elisabet
2002-01-01
Describes several options for school districts to comply with Governmental Accounting Standards Board (GASB) Statements 34 and 35 that require school districts to inventory their fixed assets and measure the value of these assets over their estimated life for inclusion in their financial statements. Information about GASB Statements 34 and 35 is…
Cycle Counting Methods of the Aircraft Engine
ERIC Educational Resources Information Center
Fedorchenko, Dmitrii G.; Novikov, Dmitrii K.
2016-01-01
The concept of condition-based gas turbine-powered aircraft operation is realized all over the world, which implementation requires knowledge of the end-of-life information related to components of aircraft engines in service. This research proposes an algorithm for estimating the equivalent cyclical running hours. This article provides analysis…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
.... SUPPLEMENTARY INFORMATION: Background Pursuant to a request by Industrial Plastics and Machine, Inc... Revocation in Part, 77 FR 59168 (September 26, 2012). \\2\\ See Letter from Industrial Plastics and Machine... estimated antidumping duties required at the time of entry, or withdrawal from warehouse, for consumption in...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
.... Abstract: 46 U.S.C. 51309 authorizes the Academy to confer academic degrees. To maintain the appropriate academic standards, the program must be accredited by the appropriate accreditation body. The survey is part of USMMA's academic accreditation process. Annual Estimated Burden Hours: 250 hours. Addresses...
USDA-ARS?s Scientific Manuscript database
Thermal infrared (TIR) remote sensing of land-surface temperature (LST) provides valuable information about the sub-surface moisture status required for estimating evapotranspiration (ET) and detecting the onset and severity of drought. While empirical indices measuring anomalies in LST and vegetati...
Modelling topographic potential for erosion and deposition using GIS
Helena Mitasova; Louis R. Iverson
1996-01-01
Modelling of erosion and deposition in complex terrain within a geographical information system (GIS) requires a high resolution digital elevation model (DEM), reliable estimation of topographic parameters, and formulation of erosion models adequate for digital representation of spatially distributed parameters. Regularized spline with tension was integrated within a...
While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecologic...
NASA Technical Reports Server (NTRS)
McGovern, Patrick J.; Solomon, Sean C.; Smith, David E.; Zuber, Maria T.; Neumann, Gregory A.; Head, J. W., III; Phillips, Roger J.; Simons, Mark
2001-01-01
We calculate localized gravity/topography admittances for Mars, in order to estimate elastic lithosphere thickness. A finite-amplitude correction to modeled gravity is required to properly interpret admittances in high-relief regions of Mars. Additional information is contained in the original extended abstract.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-23
.... Without the language, contracting officers may think they are required to utilize outdated information... Government and its institution. In particular, one respondent thought that the estimates would have to be...), and National Aeronautics and Space Administration (NASA). ACTION: Final rule. SUMMARY: The Civilian...
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Chakraborty, M.; Suradhaniwar, S.; Adinarayana, J.; Durbha, S. S.
2016-06-01
Satellite-based earth observation (EO) platforms have proven capable of spatio-temporally monitoring changes on the earth's surface. Long-term satellite missions have built a huge repository of optical remote sensing datasets, and the United States Geological Survey (USGS) Landsat program is one of the oldest sources of optical EO data. This historical and near-real-time EO archive is a rich source of information for understanding seasonal changes in horticultural crops. Citrus (Mandarin / Nagpur Orange) is one of the major horticultural crops cultivated in central India. Erratic rainfall and dependency on groundwater for irrigation have a wide impact on citrus crop yield. Wide variations in temperature and relative humidity are also reported, causing early fruit onset and increased crop water requirement. There is therefore a need to study crop growth stages and crop evapotranspiration at spatio-temporal scale to manage scarce resources. In this study, an attempt has been made to understand citrus crop growth stages using Normalized Difference Vegetation Index (NDVI) time series data obtained from the Landsat archives (http://earthexplorer.usgs.gov/). A total of 388 Landsat 4, 5, 7 and 8 scenes (from 1990 to Aug. 2015) for Worldwide Reference System (WRS) 2, path 145 and row 45, were selected to understand seasonal variations in citrus crop growth. Given Landsat's 30 meter spatial resolution, orchards with crop cover larger than 2 hectares were selected to obtain homogeneous pixels. To account for changes in wavelength bandwidth (radiometric resolution) across the Landsat sensors (i.e., 4, 5, 7 and 8), NDVI was chosen to obtain a continuous, sensor-independent time series. The derived crop growth stage information was used to estimate the citrus basal crop coefficient (Kcb). Satellite-based Kcb estimates were combined with relevant weather parameters observed by a proximal agrometeorological sensing system for crop ET estimation.
The results show that EO-based time series estimates of crop growth stages provide better information about geographically separated citrus orchards. Attempts are being made to estimate regional variations in citrus crop water requirement for effective irrigation planning. In the future, high-resolution Sentinel-2 observations from the European Space Agency (ESA) will be used to fill the time gaps and to better understand citrus crop canopy parameters.
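The sensor-independent index at the core of this time series can be sketched as follows; the band reflectance values are illustrative, not measurements from the study:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Dense canopy reflects strongly in the near-infrared and absorbs red light,
# so NDVI rises toward 1 as the citrus canopy develops.
print(round(ndvi(0.45, 0.05), 3))  # -> 0.8
```

Because NDVI is a band ratio rather than an absolute radiance, it remains comparable across the Landsat 4, 5, 7 and 8 sensors despite their differing band definitions, which is the rationale the abstract gives for choosing it.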
Sensor Transmission Power Schedule for Smart Grids
NASA Astrophysics Data System (ADS)
Gao, C.; Huang, Y. H.; Li, J.; Liu, X. D.
2017-11-01
The smart grid has attracted much attention owing to the requirements of new-generation renewable energy. Real-time state estimation, aided by phasor measurement units, now plays an important role in keeping the smart grid stable and efficient. However, related work does not consider the limitations of the communication channel. Considering the limited on-board batteries of the wireless sensors common in smart grids, this paper designs a transmission power schedule that minimizes energy consumption subject to a constraint on extended Kalman filter (EKF) performance. Based on event-triggered estimation theory, a filtering algorithm is also provided to exploit the information contained in the power schedule. Finally, its feasibility and performance are demonstrated using the standard IEEE 39-bus system with phasor measurement units (PMUs).
Motion Estimation and Compensation Strategies in Dynamic Computerized Tomography
NASA Astrophysics Data System (ADS)
Hahn, Bernadette N.
2017-12-01
A main challenge in computerized tomography is imaging moving objects. Temporal changes during the measuring process lead to inconsistent data sets, and applying standard reconstruction techniques causes motion artefacts which can severely impede reliable diagnostics. Novel reconstruction techniques are therefore required to compensate for the dynamic behavior. This article builds on recent results from a microlocal analysis of the dynamic setting, which enable us to formulate efficient analytic motion compensation algorithms for contour extraction. Since these methods require information about the dynamic behavior, we further introduce a motion estimation approach which determines parameters of affine and certain non-affine deformations directly from measured motion-corrupted Radon data. Our methods are illustrated with numerical examples for both types of motion.
Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions
Barrett, Harrison H.; Dainty, Christopher; Lara, David
2008-01-01
Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255
Global Burden Of Disease Studies: Implications For Mental And Substance Use Disorders.
Whiteford, Harvey; Ferrari, Alize; Degenhardt, Louisa
2016-06-01
Global Burden of Disease studies have highlighted mental and substance use disorders as the leading cause of disability globally. Using the studies' findings for policy and planning requires an understanding of how estimates are generated, the required epidemiological data are gathered, disability and premature mortality are defined and counted, and comparative risk assessment for risk-factor analysis is undertaken. The high burden of mental and substance use disorders has increased their priority on the global health agenda, but not enough to prompt concerted action by governments and international agencies. Using Global Burden of Disease estimates in health policy and planning requires combining them with other information such as evidence on the cost-effectiveness of interventions designed to reduce the disorders' burden. Concerted action is required by mental health advocates and policy makers to assemble this evidence, taking into account the health, social, and economic challenges facing each country. Project HOPE—The People-to-People Health Foundation, Inc.
The relationship between cost estimates reliability and BIM adoption: SEM analysis
NASA Astrophysics Data System (ADS)
Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.
2018-02-01
This paper presents the usage of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption in improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to SEM analysis employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079<0.08, GFI=0.824, CFI=0.962>0.90, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259; indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships among the constructs are positive and significant. Ultimately, the analysis verified that most respondents foresee a better understanding of project input information through BIM visualization, its reliable database and coordinated data, in developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.
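The reported fit measures can be checked against commonly cited cutoffs. The cutoffs below are conventional rules of thumb, not values prescribed by the study; GFI is reported but left out of the pass/fail check, since the authors accept the overall fit:

```python
# Fit indices reported for the final SEM model
fit = {"RMSEA": 0.079, "GFI": 0.824, "CFI": 0.962,
       "TLI": 0.956, "NFI": 0.935, "chisq_df": 2.259}

# Conventional cutoffs: RMSEA < 0.08, CFI/TLI/NFI > 0.90, chi-square/df < 3
checks = {
    "RMSEA": fit["RMSEA"] < 0.08,
    "CFI": fit["CFI"] > 0.90,
    "TLI": fit["TLI"] > 0.90,
    "NFI": fit["NFI"] > 0.90,
    "chisq_df": fit["chisq_df"] < 3,
}
print(all(checks.values()))  # -> True
```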
Synchronization for Optical PPM with Inter-Symbol Guard Times
NASA Astrophysics Data System (ADS)
Rogalin, R.; Srinivasan, M.
2017-05-01
Deep space optical communications promises orders of magnitude growth in communication capacity, supporting high data rate applications such as video streaming and high-bandwidth science instruments. Pulse position modulation is the modulation format of choice for deep space applications, and by inserting inter-symbol guard times between the symbols, the signal carries the timing information needed by the demodulator. Accurately extracting this timing information is crucial to demodulating and decoding this signal. In this article, we propose a number of timing and frequency estimation schemes for this modulation format, and in particular highlight a low complexity maximum likelihood timing estimator that significantly outperforms the prior art in this domain. This method does not require an explicit synchronization sequence, freeing up channel resources for data transmission.
Pediatric chest and abdominopelvic CT: organ dose estimation based on 42 patient models.
Tian, Xiaoyu; Li, Xiang; Segars, W Paul; Paulson, Erik K; Frush, Donald P; Samei, Ehsan
2014-02-01
To estimate organ dose from pediatric chest and abdominopelvic computed tomography (CT) examinations and evaluate the dependency of organ dose coefficients on patient size and CT scanner models. The institutional review board approved this HIPAA-compliant study and did not require informed patient consent. A validated Monte Carlo program was used to perform simulations in 42 pediatric patient models (age range, 0-16 years; weight range, 2-80 kg; 24 boys, 18 girls). Multidetector CT scanners were modeled on those from two commercial manufacturers (LightSpeed VCT, GE Healthcare, Waukesha, Wis; SOMATOM Definition Flash, Siemens Healthcare, Forchheim, Germany). Organ doses were estimated for each patient model for routine chest and abdominopelvic examinations and were normalized by volume CT dose index (CTDI(vol)). The relationships between CTDI(vol)-normalized organ dose coefficients and average patient diameters were evaluated across scanner models. For organs within the image coverage, CTDI(vol)-normalized organ dose coefficients largely showed a strong exponential relationship with the average patient diameter (R(2) > 0.9). The average percentage differences between the two scanner models were generally within 10%. For distributed organs and organs on the periphery of or outside the image coverage, the differences were generally larger (average, 3%-32%) mainly because of the effect of overranging. It is feasible to estimate patient-specific organ dose for a given examination with the knowledge of patient size and the CTDI(vol). These CTDI(vol)-normalized organ dose coefficients enable one to readily estimate patient-specific organ dose for pediatric patients in clinical settings. This dose information, and, as appropriate, attendant risk estimations, can provide more substantive information for the individual patient for both clinical and research applications and can yield more expansive information on dose profiles across patient populations within a practice. 
© RSNA, 2013.
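The exponential size dependence described above can be sketched as follows; the coefficients a and b are illustrative placeholders, not fitted values from the study:

```python
import math

def organ_dose(ctdi_vol, diameter_cm, a=2.0, b=0.04):
    """Estimate organ dose (mGy) as CTDIvol times an exponential
    size-dependent dose coefficient h(d) = a * exp(-b * d).
    a and b are hypothetical, not the study's fitted parameters."""
    return ctdi_vol * a * math.exp(-b * diameter_cm)

# Smaller patients receive a larger organ dose per unit CTDIvol,
# which is the qualitative behavior the exponential fit captures.
print(organ_dose(10.0, 0.0))  # -> 20.0
```

In practice the study fits such coefficients per organ and per examination, so that organ dose for a new patient can be read off from the patient's average diameter and the scanner-reported CTDIvol alone.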
A particle swarm model for estimating reliability and scheduling system maintenance
NASA Astrophysics Data System (ADS)
Puzis, Rami; Shirtz, Dov; Elovici, Yuval
2016-05-01
Modifying data and information system components may introduce new errors and deteriorate the reliability of the system. Reliability can be efficiently regained with reliability centred maintenance, which requires reliability estimation for maintenance scheduling. A variant of the particle swarm model is used to estimate the reliability of systems implemented according to the model view controller paradigm. Simulations based on data collected from an online system of a large financial institute are used to compare three component-level maintenance policies. Results show that appropriately scheduled component-level maintenance greatly reduces the cost of upholding an acceptable level of reliability by reducing the need for system-wide maintenance.
Integration of Rooftop Photovoltaic Systems in St. Paul Ford Site's Redevelopment Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olis, D.; Mosey, G.
The purpose of this analysis is to estimate how much electricity the redeveloped Ford Motor Company assembly plant site in St. Paul, Minnesota, might consume under different development scenarios and how much rooftop photovoltaic (PV) generation might be possible at the site. Because the current development scenarios are high-level, preliminary sketches that describe mixes of residential, retail, commercial, and industrial spaces, electricity consumption and available rooftop area for PV under each scenario can only be grossly estimated. These results are indicative only and should be used for rough estimation and to help inform development goals and requirements moving forward.
Multichannel Doppler Processing for an Experimental Low-Angle Tracking System
1990-05-01
estimation techniques at sea. Because of clutter and noise, it is necessary to use a number of different processing algorithms to extract the required information. Consequently, the ELAT radar system is composed of multiple... corresponding to RF frequencies, f1 and f2. For mode 3, the ambiguities occur at vb1 = 15.186 knots and vb2 = 16.96 knots. The sea clutter, with a spectrum
Fulton, Lawrence; Kerr, Bernie; Inglis, James M; Brooks, Matthew; Bastian, Nathaniel D
2015-07-01
In this study, we re-evaluate air ambulance requirements (rules of allocation) and planning considerations based on an Army-approved, Theater Army Analysis scenario. A previous study using workload only estimated a requirement of 0.4 to 0.6 aircraft per admission, a significant bolus over existence-based rules. In this updated study, we estimate requirements for Phase III (major combat operations) using a simulation grounded in previously published work and Phase IV (stability operations) based on four rules of allocation: unit existence rules, workload factors, theater structure (geography), and manual input. This study improves upon previous work by including the new air ambulance mission requirements of Department of Defense 51001.1, Roles and Functions of the Services, by expanding the analysis over two phases, and by considering unit rotation requirements known as Army Force Generation based on Department of Defense policy. The recommendations of this study are intended to inform future planning factors and already provided decision support to the Army Aviation Branch in determining force structure requirements. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Roberto, Christina A; Haynos, Ann F; Schwartz, Marlene B; Brownell, Kelly D; White, Marney A
2013-09-01
Menu labeling is a public health policy that requires chain restaurants in the USA to post kilocalorie information on their menus to help consumers make informed choices. However, there is concern that such a policy might promote disordered eating. This web-based study compared individuals with self-reported binge eating disorder (N = 52), bulimia nervosa (N = 25), and purging disorder (N = 17) and those without eating disorders (No ED) (N = 277) on restaurant calorie information knowledge and perceptions of menu labeling legislation. On average, people answered 1.46 ± 1.08 questions correctly (out of 6) (25%) on a calorie information quiz and 92% of the sample was in favor of menu labeling. The findings did not differ based on eating disorder, dieting, or weight status, or race/ethnicity. The results indicated that people have difficulty estimating the calories in restaurant meals and individuals with and without eating disorders are largely in favor of menu labeling laws.
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.
Algorithms and architectures for robot vision
NASA Technical Reports Server (NTRS)
Schenker, Paul S.
1990-01-01
The scope of the current work is to develop practical sensing implementations for robots operating in complex, partially unstructured environments. A focus in this work is to develop object models and estimation techniques which are specific to the requirements of robot locomotion, approach and avoidance, and grasp and manipulation. Such problems have to date received limited attention in either computer or human vision - in essence, asking not only how perception is in general modeled, but also what is the functional purpose of its underlying representations. As in the past, researchers are drawing on ideas from both the psychological and machine vision literature. Of particular interest is the development of 3-D shape and motion estimates for complex objects when given only partial and uncertain information and when such information is incrementally accrued over time. Current studies consider the use of surface motion, contour, and texture information, with the longer range goal of developing a fused sensing strategy based on these sources and others.
2009-04-16
Report 110-335 accompanying the National Defense Authorization Act for Fiscal Year 2009. The Senate Report required the Comptroller General to...
Direct Estimation of Structure and Motion from Multiple Frames
1990-03-01
sequential frames in an image sequence. As a consequence, the information that can be extracted from a single optical flow field is limited to a snapshot of... researchers have developed techniques that extract motion and structure information without computation of the optical flow. Best known are the "direct... operated iteratively on a sequence of images to recover structure. It required feature extraction and matching. Broida and Chellappa [9] suggested the use of
Online Sensor Fault Detection Based on an Improved Strong Tracking Filter
Wang, Lijuan; Wu, Lifeng; Guan, Yong; Wang, Guohui
2015-01-01
We propose a method for online sensor fault detection based on the evolving Strong Tracking Filter (STCKF). The cubature rule is used to estimate states, improving estimation accuracy in the nonlinear case. A residual, the difference between an estimated value and the true value, is regarded as a signal that carries fault information. The threshold is set at a reasonable level and compared with the residuals to determine whether or not the sensor is faulty. The proposed method requires only a nominal plant model and uses the STCKF to estimate the original state vector. The effectiveness of the algorithm is verified by simulation on a drum-boiler model. PMID:25690553
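The residual-thresholding step can be sketched as a minimal example; the threshold and the readings are illustrative, and in the actual method the estimates come from the STCKF rather than being supplied directly:

```python
def detect_fault(estimates, measurements, threshold):
    """Flag a sensor fault whenever the residual (measured value
    minus estimated value) exceeds the threshold in magnitude."""
    return [abs(m - e) > threshold
            for e, m in zip(estimates, measurements)]

# Second reading deviates far more from its estimate than the others.
flags = detect_fault([1.0, 1.1, 1.2], [1.02, 1.9, 1.21], threshold=0.5)
print(flags)  # -> [False, True, False]
```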
NASA Technical Reports Server (NTRS)
Spencer, M. M.; Wolf, J. M.; Schall, M. A.
1974-01-01
A system of computer programs was developed that performs geometric rectification and line-by-line mapping of airborne multispectral scanner data to ground coordinates and estimates ground area. The system requires aircraft attitude and position information furnished by ancillary aircraft equipment, as well as ground control points. The geometric correction and mapping procedure locates the scan lines, or the pixels on each line, in terms of map grid coordinates. The area estimation procedure gives the ground area for each pixel or for a predesignated parcel specified in map grid coordinates. Exercising the system with simulated data produced both uncorrected and corrected imagery and yielded area estimates accurate to better than 99.7%.
Dhanda, D S; Guzauskas, G F; Carlson, J J; Basu, A; Veenstra, D L
2017-11-01
Evidence requirements for implementation of precision medicine (PM), whether informed by genomic or clinical data, are not well defined. Evidence requirements are driven by uncertainty and its attendant consequences; these aspects can be quantified by a novel technique in health economics: value of information analysis (VOI). We utilized VOI analysis to compare the evidence levels over time for warfarin dosing based on pharmacogenomic vs. amiodarone-warfarin drug-drug interaction information. The primary outcome was the expected value of perfect information (EVPI), which is an estimate of the upper limit of the societal value of conducting future research. Over the past decade, the EVPI for the pharmacogenomic strategy decreased from $1,550 to $140 vs. $1,220 to $280 per patient for the drug-interaction strategy. Evidence levels thus appear to be higher for pharmacogenomic-guided vs. drug-interaction-guided warfarin dosing. Clinical guidelines and reimbursement policies for warfarin PM could be informed by these findings. © 2017 American Society for Clinical Pharmacology and Therapeutics.
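The EVPI calculation behind these figures can be illustrated with a toy two-scenario example; the strategy names and net-benefit numbers below are hypothetical, not the study's decision model:

```python
# Each row: net benefit of each strategy under one equally likely
# parameter scenario (hypothetical values).
scenarios = [
    {"genotype_guided": 120.0, "standard_dosing": 100.0},
    {"genotype_guided":  80.0, "standard_dosing": 110.0},
]
p = 1 / len(scenarios)

# With perfect information we could pick the best strategy per scenario.
ev_perfect = sum(p * max(s.values()) for s in scenarios)

# With current information we must commit to one strategy for all scenarios.
ev_current = max(sum(p * s[a] for s in scenarios) for a in scenarios[0])

# EVPI: expected gain from resolving all uncertainty before deciding.
evpi = ev_perfect - ev_current
print(evpi)  # -> 10.0
```

As evidence accumulates, the scenario distribution tightens around one strategy being best, and EVPI falls toward zero, which matches the decade-long decline from $1,550 to $140 per patient reported for the pharmacogenomic strategy.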
Information technologies for Marine Corps combat medicine.
Carey, N B; Rattelman, C R; Nguyen, H Q
1998-09-01
Future Marine Corps warfighting concepts will make it more difficult to locate casualties, which will complicate casualty evacuation, lengthen casualty wait times, and require infantrymen or corpsmen to provide more extensive treatment. In these future scenarios, information flow and communications will be critical to medical functions. We asked, for Navy medical support to the Marines, what information will future combat medicine require and what technologies should supply those information needs? Based on analyses of patient data streams, focus groups of Navy medical personnel, and our estimates of the cost and feasibility of communications systems, we recommend the following: (1) increase medical training for some fraction of Marines, especially in hemorrhage control; (2) augment corpsmen's training; (3) furnish data systems for evacuation and supply that would provide in-transit visibility and simplify requests; (4) provide all ground medical personnel with access to treatment information systems and limited voice communications; and (5) exploit e-mail systems to reduce reliance on voice communications. Implementation time frames are discussed.
Estimating health service utilization for treatment of pneumococcal disease: the case of Brazil.
Sartori, A M C; Novaes, C G; de Soárez, P C; Toscano, C M; Novaes, H M D
2013-07-02
Health service utilization (HSU) is an essential component of economic evaluations of health initiatives. Defining HSU for cases of pneumococcal disease (PD) is particularly complex considering the varying clinical manifestations and diverse severity. We describe the process of developing estimates of HSU for PD as part of an economic evaluation of the introduction of pneumococcal conjugate vaccine in Brazil. Nationwide inpatient and outpatient HSU by children under-5 years with meningitis (PM), sepsis (PS), non-meningitis non-sepsis invasive PD (NMNS), pneumonia, and acute otitis media (AOM) was estimated. We assumed that all cases of invasive PD (PM, PS, and NMNS) required hospitalization. The study perspective was the health system, including both the public and private sectors. Data sources were obtained from national health information systems, including the Hospital Information System (SIH/SUS) and the Notifiable Diseases Information System (SINAN); surveys; and community-based and health care facility-based studies. We estimated hospitalization rates of 7.69 per 100,000 children under-5 years for PM (21.4 for children <1 years of age and 4.3 for children aged 1-4 years), 5.89 for PS (20.94 and 2.17), and 4.01 for NMNS (5.5 and 3.64) in 2004, with an overall hospitalization rate of 17.59 for all invasive PD (47.27 and 10.11). The estimated incidence rate of all-cause pneumonia was 93.4 per 1000 children under-5 (142.8 for children <1 years of age and 81.2 for children aged 1-4 years), considering both hospital and outpatient care. Secondary data derived from health information systems and the available literature enabled the development of national HSU estimates for PD in Brazil. Estimating HSU for noninvasive disease was challenging, particularly in the case of outpatient care, for which secondary data are scarce. 
Information for the private sector is lacking in Brazil, but estimates were possible with data from the public sector and national population surveys. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Murphy, P. C.
1986-01-01
An algorithm for maximum likelihood (ML) estimation is developed with an efficient method for approximating the sensitivities. The ML algorithm relies on a new optimization method referred to as a modified Newton-Raphson with estimated sensitivities (MNRES). MNRES determines sensitivities by using slope information from local surface approximations of each output variable in parameter space. With the fitted surface, sensitivity information can be updated at each iteration with less computational effort than that required by either a finite-difference method or integration of the analytically determined sensitivity equations. MNRES eliminates the need to derive sensitivity equations for each new model, and thus provides flexibility to use model equations in any convenient format. A random search technique for determining the confidence limits of ML parameter estimates is applied to nonlinear estimation problems for airplanes. The confidence intervals obtained by the search are compared with Cramer-Rao (CR) bounds at the same confidence level. The degree of nonlinearity in the estimation problem is an important factor in the relationship between CR bounds and the error bounds determined by the search technique. Beale's measure of nonlinearity is developed in this study for airplane identification problems; it is used to empirically correct confidence levels and to predict the degree of agreement between CR bounds and search estimates.
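For intuition on the Cramer-Rao (CR) bounds used as the comparison point above, here is the textbook linear-Gaussian case (estimating a mean from n i.i.d. samples), not the nonlinear airplane identification problem itself:

```python
# For n i.i.d. Gaussian observations with known standard deviation sigma,
# the Fisher information for the mean is n / sigma^2, so the CR bound on
# the variance of any unbiased estimator of the mean is sigma^2 / n.
def cramer_rao_bound(sigma, n):
    fisher_info = n / sigma**2
    return 1 / fisher_info

print(cramer_rao_bound(2.0, 100))  # -> 0.04
```

In this linear case the sample-mean estimator attains the bound exactly; the point of the abstract is that for nonlinear problems the true confidence region can depart from the CR bound, with Beale's measure of nonlinearity predicting the size of that departure.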
Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor
2014-01-01
Network connectivity and link quality information are fundamental requirements of wireless sensor network protocols to perform their desired functionality. Most existing discovery protocols have focused only on the neighbor discovery problem, while only a few provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not been entirely evaluated yet. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277
Gupta, A; Davidson, C M; McIsaac, M A
2016-08-01
Surveys that collect information on injuries often focus on the single "most serious" event to help limit recall error and reduce survey length. However, this can mask less serious injuries and result in biased incidence estimates for specific injury subcategories. Data from the 2002 Health Behaviour in School-aged Children (HBSC) survey and from the Canadian Hospitals Injury Reporting and Prevention Program (CHIRPP) were used to compare estimates of sports injury incidence in Canadian children. HBSC data indicate that 6.7% of children report sustaining a sports injury that required an emergency department (ED) visit. However, details were only collected on a child's "most serious" injury, so children who had multiple injuries requiring an ED visit may have had sports injuries that went unreported. The rate of 6.7% can be seen to be an underestimate by as much as 4.3%. Corresponding CHIRPP surveillance data indicate an incidence of 9.9%. Potential masking bias is also highlighted in our analysis of injuries attended by other health care providers. The "one most serious injury" line of questioning induces potentially substantial masking bias in the estimation of sports injury incidence, which limits researchers' ability to quantify the burden of sports injury. Longer survey recall periods naturally lead to greater masking. The design of future surveys should take these issues into account. In order to accurately inform policy decisions and the direction of future research, researchers must be aware of these limitations.
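The arithmetic behind the stated masking bound can be made explicit; this is a sketch of the interval implied by the argument, using only the percentages quoted in the abstract:

```python
reported = 6.7    # % reporting a sports injury requiring an ED visit
max_masked = 4.3  # % whose "most serious" injury was non-sports but who
                  # may also have had an unreported ED-level sports injury

# The "one most serious injury" design only bounds true incidence:
lower = reported
upper = round(reported + max_masked, 1)
print((lower, upper))  # -> (6.7, 11.0)

# The CHIRPP surveillance estimate of 9.9% falls inside this interval,
# consistent with the HBSC figure being an underestimate due to masking.
```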
Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment
Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.
2012-01-01
Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
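The propagation-of-variance result above follows from the multiplicative structure of such exposure models; a minimal Monte Carlo sketch with purely illustrative log-scale variances (not the study's actual inputs) shows how an emission-rate term can dominate:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical multiplicative screening model: modeled concentration is
# proportional to emission rate, biotransformation half-life, and a
# partition coefficient. All inputs lognormal; the log-scale standard
# deviations below are illustrative values only.
log_emission = rng.normal(0.0, 1.5, n)    # emission rate (largest uncertainty)
log_halflife = rng.normal(0.0, 0.8, n)    # biotransformation half-life
log_partition = rng.normal(0.0, 0.3, n)   # partition coefficient

log_conc = log_emission + log_halflife + log_partition

# For independent inputs the output variance decomposes additively, so each
# term's share identifies the dominant source of uncertainty.
total = log_conc.var()
share_emission = 1.5**2 / (1.5**2 + 0.8**2 + 0.3**2)
```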
NASA Astrophysics Data System (ADS)
Velazquez, Antonio; Swartz, R. Andrew
2012-04-01
Wind energy is an increasingly important component of this nation's renewable energy portfolio; however, safe and economical wind turbine operation is critical to ensure continued adoption. Safe operation of wind turbine structures requires not only information regarding their condition, but also their operational environment. Given the difficulty inherent in structural health monitoring (SHM) processes for wind turbines (damage detection, location, and characterization), some uncertainty in conditional assessment is expected. Furthermore, given the stochastic nature of the loading on turbine structures, a probabilistic framework is appropriate to characterize their risk of failure at a given time. Such information will be invaluable to turbine controllers, allowing them to operate the structures within acceptable risk profiles. This study explores the characterization of the turbine loading and response envelopes for critical failure modes of the turbine blade structures. A framework is presented to develop an analytical estimation of the loading environment (including loading effects) based on the dynamic behavior of the blades. This behavior is influenced by along-wind and across-wind aero-elastic effects, wind shear gradient, tower shadow effects, and centrifugal stiffening effects. The proposed solution includes methods that are based on modal decomposition of the blades and require frequent updates to the estimated modal properties to account for the time-varying nature of the turbine and its environment. The estimated demand statistics are compared to a code-based resistance curve to determine a probabilistic estimate of the risk of blade failure given the loading environment.
Joint sparsity based heterogeneous data-level fusion for target detection and estimation
NASA Astrophysics Data System (ADS)
Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe
2017-05-01
Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.
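The joint-sparse recovery step that JSDLF builds on can be illustrated with a basic orthogonal matching pursuit over a discretized state grid; the dictionary, dimensions, and target amplitudes below are illustrative stand-ins, not the paper's sensor models:

```python
import numpy as np

def omp(A, y, k):
    """Basic orthogonal matching pursuit: greedily pick the dictionary column
    most correlated with the residual, then re-fit the coefficients on the
    selected support by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x, sorted(support)

rng = np.random.default_rng(1)
A = rng.normal(size=(60, 200))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns: one per grid cell
true_support = [10, 50, 120]            # a few occupied target-state cells
x_true = np.zeros(200)
x_true[true_support] = [2.0, -1.5, 1.0]
y = A @ x_true                          # noiseless measurements, for clarity
x_hat, support = omp(A, y, 3)
```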
Availability of information on renal function in Dutch community pharmacies.
Koster, Ellen S; Philbert, Daphne; Noordam, Michelle; Winters, Nina A; Blom, Lyda; Bouvy, Marcel L
2016-08-01
Background: Early detection and monitoring of impaired renal function may prevent drug-related problems. Objective: To assess the availability of information on patient's renal function in Dutch community pharmacies, for patients using medication that might need monitoring in case of renal impairment. Methods: Per pharmacy, 25 patients aged ≥65 years using at least one drug that requires monitoring were randomly selected from the pharmacy information system. For these patients, information on renal function [estimated glomerular filtration rate (eGFR)] was obtained from the pharmacy information system. When absent, this information was obtained from the general practitioner (GP). Results: Data were collected for 1632 patients. For 1201 patients (74 %) eGFR values were not directly available in the pharmacy; for another 194 patients (12 %) the eGFR value was not up-to-date. For 1082 patients information could be obtained from the GP, resulting in 942 additional recent eGFR values. Finally, recent information on renal function was available for 72 % (n = 1179) of selected patients. Conclusion: In patients using drugs that require renal monitoring, information on renal function is often unknown in the pharmacy. For the majority of patients this information can be retrieved from the GP.
NASA Astrophysics Data System (ADS)
Lee, J. H.; Yoon, H.; Kitanidis, P. K.; Werth, C. J.; Valocchi, A. J.
2015-12-01
Characterizing subsurface properties, particularly hydraulic conductivity, is crucial for reliable and cost-effective groundwater supply management, contaminant remediation, and emerging deep subsurface activities such as geologic carbon storage and unconventional resource recovery. With recent advances in sensor technology, a large volume of hydro-geophysical and chemical data can be obtained to achieve high-resolution images of subsurface properties, which can be used for accurate subsurface flow and reactive transport predictions. However, subsurface characterization with a plethora of information requires high, often prohibitive, computational costs associated with "big data" processing and large-scale numerical simulations. As a result, traditional inversion techniques are not well-suited for problems that require coupled multi-physics simulation models with massive data. In this work, we apply a scalable inversion method called the Principal Component Geostatistical Approach (PCGA) for characterizing the heterogeneous hydraulic conductivity (K) distribution in a 3-D sand box. The PCGA is a Jacobian-free geostatistical inversion approach that uses the leading principal components of the prior information to reduce computational costs, sometimes dramatically, and can be easily linked with any simulation software. Sequential images of transient tracer concentrations in the sand box were obtained using a magnetic resonance imaging (MRI) technique, resulting in 6 million tracer-concentration data points [Yoon et al., 2008]. Since each individual tracer observation has little information on the K distribution, the dimension of the data was reduced using temporal moments and the discrete cosine transform (DCT). Consequently, 100,000 unknown K values consistent with the scale of MRI data (at a scale of 0.25^3 cm^3) were estimated by matching temporal moments and DCT coefficients of the original tracer data.
Estimated K fields are close to the true K field, and even small-scale variability of the sand box was captured, highlighting high-K connectivity and the contrasts between low- and high-K zones. A total of 1,000 MODFLOW and MT3DMS simulations was required to obtain the final estimates and corresponding estimation uncertainty, demonstrating the efficiency and effectiveness of our method.
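The DCT-based dimension reduction can be sketched on a single synthetic breakthrough curve; the curve and the number of retained coefficients below are illustrative, not the MRI data:

```python
import numpy as np
from scipy.fft import dct, idct

# Synthetic tracer breakthrough curve at one voxel (illustrative only).
t = np.linspace(0, 10, 512)
curve = np.exp(-((t - 4.0) ** 2) / 2.0)   # smooth pulse

coeffs = dct(curve, norm='ortho')
k = 20                                    # keep only the leading coefficients
compressed = np.zeros_like(coeffs)
compressed[:k] = coeffs[:k]
recovered = idct(compressed, norm='ortho')

# A smooth curve is well represented by a few DCT coefficients, which is why
# moments plus DCT can stand in for millions of raw tracer observations.
rel_err = np.linalg.norm(recovered - curve) / np.linalg.norm(curve)
```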
NASA Technical Reports Server (NTRS)
Goldstein, H. W.; Grenda, R. N.
1977-01-01
The sensors were examined for adaptability to shuttle by reviewing pertinent information regarding sensor characteristics as they related to the shuttle and Multimission Modular Spacecraft environments. This included physical and electrical characteristics, data output and command requirements, attitude and orientation requirements, thermal and safety requirements, and adaptability and modification for space. The sensor requirements and characteristics were compared with the corresponding shuttle and Multimission Modular Spacecraft characteristics and capabilities. On this basis the adaptability and necessary modifications for each sensor were determined. A number of the sensors were examined in more detail and estimated cost for the modifications was provided.
Population and Activity of On-road Vehicles in MOVES2014 ...
This report describes the sources and derivation for on-road vehicle population and activity information and associated adjustments as stored in the MOVES2014 default databases. Motor Vehicle Emission Simulator, the MOVES2014 model, is a set of modeling tools for estimating emissions produced by on-road (cars, trucks, motorcycles, etc.) and nonroad (backhoes, lawnmowers, etc.) mobile sources. The national default activity information in MOVES2014 provides a reasonable basis for estimating national emissions. However, the uncertainties and variability in the default data contribute to the uncertainty in the resulting emission estimates. Properly characterizing emissions from the on-road vehicle subset requires a detailed understanding of the cars and trucks that make up the vehicle fleet and their patterns of operation. The MOVES model calculates emission inventories by multiplying emission rates by the appropriate emission-related activity, applying correction (adjustment) factors as needed to simulate specific situations, and then adding up the emissions from all sources (populations) and regions.
Sharp, Linda; Tilson, Lesley; Whyte, Sophie; Ceilleachair, Alan O; Walsh, Cathal; Usher, Cara; Tappenden, Paul; Chilcott, James; Staines, Anthony; Barry, Michael; Comber, Harry
2013-03-19
Organised colorectal cancer screening is likely to be cost-effective, but cost-effectiveness results alone may not help policy makers to make decisions about programme feasibility or service providers to plan programme delivery. For these purposes, estimates of the impact on the health services of actually introducing screening in the target population would be helpful. However, these types of analyses are rarely reported. As an illustration of such an approach, we estimated annual health service resource requirements and health outcomes over the first decade of a population-based colorectal cancer screening programme in Ireland. A Markov state-transition model of colorectal neoplasia natural history was used. Three core screening scenarios were considered: (a) flexible sigmoidoscopy (FSIG) once at age 60, (b) biennial guaiac-based faecal occult blood tests (gFOBT) at 55-74 years, and (c) biennial faecal immunochemical tests (FIT) at 55-74 years. Three alternative FIT roll-out scenarios were also investigated relating to age-restricted screening (55-64 years) and staggered age-based roll-out across the 55-74 age group. Parameter estimates were derived from literature review, existing screening programmes, and expert opinion. Results were expressed in relation to the 2008 population (4.4 million people, of whom 700,800 were aged 55-74). FIT-based screening would deliver the greatest health benefits, averting 164 colorectal cancer cases and 272 deaths in year 10 of the programme. Capacity would be required for 11,095-14,820 diagnostic and surveillance colonoscopies annually, compared to 381-1,053 with FSIG-based, and 967-1,300 with gFOBT-based, screening. With FIT, in year 10, these colonoscopies would result in 62 hospital admissions for abdominal bleeding, 27 bowel perforations and one death. Resource requirements for pathology, diagnostic radiology, radiotherapy and colorectal resection were highest for FIT. Estimates depended on screening uptake. 
Alternative FIT roll-out scenarios had lower resource requirements. While FIT-based screening would quite quickly generate attractive health outcomes, it has heavy resource requirements. These could impact on the feasibility of a programme based on this screening modality. Staggered age-based roll-out would allow time to increase endoscopy capacity to meet programme requirements. Resource modelling of this type complements conventional cost-effectiveness analyses and can help inform policy making and service planning.
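A Markov state-transition natural-history model of the kind described above can be sketched as follows; the states and annual transition probabilities are invented for illustration and are not the study's parameter estimates:

```python
import numpy as np

# Illustrative 4-state natural-history Markov model with an annual cycle:
# states = [no neoplasia, adenoma, colorectal cancer, dead].
# Transition probabilities are hypothetical, chosen only so rows sum to 1.
P = np.array([
    [0.97, 0.02, 0.00, 0.01],
    [0.00, 0.95, 0.04, 0.01],
    [0.00, 0.00, 0.88, 0.12],
    [0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0])   # cohort starts disease-free
trajectory = [state]
for _ in range(10):                       # ten annual cycles
    state = state @ P
    trajectory.append(state)

cancer_prev_year10 = trajectory[-1][2]    # modeled cancer prevalence, year 10
```

Annual outputs of such a cohort model (cases, deaths, colonoscopies triggered) are what drive the resource-requirement estimates reported above.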
The use of historical information for regional frequency analysis of extreme skew surge
NASA Astrophysics Data System (ADS)
Frau, Roberto; Andreewsky, Marc; Bernardara, Pietro
2018-03-01
The design of effective coastal protections requires an adequate estimation of the annual occurrence probability of rare events associated with return periods of up to 10³ years. Regional frequency analysis (RFA) has been proven applicable for estimating extreme events by pooling regional data into large, spatially distributed datasets. Nowadays, historical data are available to provide new insight into past events. Using historical information can increase the precision and reliability of regional extreme quantile estimation. However, historical data come from significant extreme events that were not recorded by tide gauges; they typically appear as isolated observations, unlike the continuous records produced by systematic tide-gauge measurements. This complicates the definition of the duration of the observation period, which is crucial for estimating the frequency of extreme occurrences. For this reason, we introduce the concept of credible duration. The proposed RFA method (hereinafter referred to as FAB, from the names of the authors) allows the use of historical data together with systematic data as a result of the credible duration concept.
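A return-level calculation of the kind underlying such an analysis can be sketched with a GEV fit to annual maxima; the data are synthetic and the treatment of historical events here is a deliberate simplification for illustration, not the FAB method itself:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Synthetic annual-maximum skew surges (m): a systematic tide-gauge record
# plus a few hypothetical archive events, each naively treated as one year
# of observation (a simplification of the credible-duration idea).
systematic = genextreme.rvs(c=-0.1, loc=0.8, scale=0.2, size=60,
                            random_state=rng)
historical = np.array([1.9, 2.1])
sample = np.concatenate([systematic, historical])

shape, loc, scale = genextreme.fit(sample)
rl_100 = genextreme.isf(1 / 100, shape, loc, scale)    # 100-year return level
rl_1000 = genextreme.isf(1 / 1000, shape, loc, scale)  # 1000-year return level
```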
Fully decentralized estimation and control for a modular wheeled mobile robot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mutambara, A.G.O.; Durrant-Whyte, H.F.
2000-06-01
In this paper, the problem of fully decentralized data fusion and control for a modular wheeled mobile robot (WMR) is addressed. This is a vehicle system with nonlinear kinematics, distributed multiple sensors, and nonlinear sensor models. The problem is solved by applying fully decentralized estimation and control algorithms based on the extended information filter. This is achieved by deriving a modular, decentralized kinematic model by using plane motion kinematics to obtain the forward and inverse kinematics for a generalized simple wheeled vehicle. This model is then used in the decentralized estimation and control algorithms. WMR estimation and control are thus obtained locally using reduced-order models. When communication of information between nodes is carried out after every measurement (full-rate communication), the estimates and control signals obtained at each node are equivalent to those obtained by a corresponding centralized system. Transputer architecture is used as the basis for hardware and software design as it supports the extensive communication and concurrency requirements that characterize modular and decentralized systems. The advantages of a modular WMR vehicle include scalability, application flexibility, low prototyping costs, and high reliability.
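The additive structure that makes the information filter attractive for decentralized fusion can be shown with a minimal linear, scalar example (illustrative numbers, not the WMR models):

```python
import numpy as np

def information_update(Y, y, H, R, z):
    """Information-form measurement update: each node contributes additive
    terms to the information matrix Y and information vector y, which is
    what makes decentralized fusion straightforward."""
    Rinv = np.linalg.inv(R)
    return Y + H.T @ Rinv @ H, y + H.T @ Rinv @ z

# Scalar state observed by two nodes (illustrative numbers).
Y = np.array([[1.0]])
y = np.array([0.0])                            # weak prior centered at 0
H = np.array([[1.0]])
R = np.array([[0.5]])
for z in (np.array([2.0]), np.array([2.2])):   # two node measurements
    Y, y = information_update(Y, y, H, R, z)

x_hat = np.linalg.solve(Y, y)                  # state estimate = Y^{-1} y
```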
Estimating Development Cost of an Interactive Website Based Cancer Screening Promotion Program
Lairson, David R.; Chung, Tong Han; Smith, Lisa G.; Springston, Jeffrey K.; Champion, Victoria L.
2015-01-01
Objectives: The aim of this study was to estimate the initial development costs for an innovative talk show format tailored intervention delivered via the interactive web, for increasing cancer screening in women aged 50 to 75 who were non-adherent to screening guidelines for colorectal cancer and/or breast cancer. Methods: The cost of the intervention development was estimated from a societal perspective. Micro costing methods plus vendor contract costs were used to estimate cost. Staff logs were used to track personnel time. Non-personnel costs include all additional resources used to produce the intervention. Results: Development cost of the interactive web-based intervention was $0.39 million, of which 77% was direct cost. About 98% of the cost was incurred in personnel time cost, contract cost and overhead cost. Conclusions: The new web-based disease prevention medium required substantial investment in health promotion and media specialist time. The development cost was primarily driven by the high level of human capital required. The cost of intervention development is important information for assessing and planning future public and private investments in web-based health promotion interventions. PMID:25749548
Parallel Estimation and Control Architectures for Deep-Space Formation Flying Spacecraft
NASA Technical Reports Server (NTRS)
Hadaegh, Fred Y.; Smith, Roy S.
2006-01-01
The formation flying of precisely controlled spacecraft in deep space can be used to implement optical instruments capable of imaging planets in other solar systems. The distance of the formation from Earth necessitates a significant level of autonomy and each spacecraft must base its actions on its estimates of the location and velocity of the other spacecraft. Precise coordination and control is the key requirement in such missions and the flow of information between spacecraft must be carefully designed. Doing this in an efficient and optimal manner requires novel techniques for the design of the on-board estimators. The use of standard Kalman filter-based designs can lead to unanticipated dynamics--which we refer to as disagreement dynamics--in the estimators' errors. We show how communication amongst the spacecraft can be designed in order to control all of the dynamics within the formation. We present several results relating the topology of the communication network to the resulting closed-loop control dynamics of the formation. The consequences for the design of the control, communication and coordination are discussed.
Tarjan, Lily M; Tinker, M. Tim
2016-01-01
Parametric and nonparametric kernel methods dominate studies of animal home ranges and space use. Most existing methods are unable to incorporate information about the underlying physical environment, leading to poor performance in excluding areas that are not used. Using radio-telemetry data from sea otters, we developed and evaluated a new algorithm for estimating home ranges (hereafter Permissible Home Range Estimation, or “PHRE”) that reflects habitat suitability. We began by transforming sighting locations into relevant landscape features (for sea otters, coastal position and distance from shore). Then, we generated a bivariate kernel probability density function in landscape space and back-transformed this to geographic space in order to define a permissible home range. Compared to two commonly used home range estimation methods, kernel densities and local convex hulls, PHRE better excluded unused areas and required a smaller sample size. Our PHRE method is applicable to species whose ranges are restricted by complex physical boundaries or environmental gradients and will improve understanding of habitat-use requirements and, ultimately, aid in conservation efforts.
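The core PHRE idea, density estimation in landscape coordinates rather than geographic space, can be sketched as follows; the sighting data are synthetic and the coordinates illustrative:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Hypothetical sea-otter sightings already transformed into landscape
# coordinates: along-coast position (km) and distance from shore (km).
coast_pos = rng.normal(12.0, 3.0, 300)
offshore = np.abs(rng.normal(0.4, 0.2, 300))   # animals stay near shore

kde = gaussian_kde(np.vstack([coast_pos, offshore]))

# Density is evaluated in landscape space, so a point far offshore gets
# negligible density even if it is geographically close to the sightings;
# back-transforming the high-density region defines the permissible range.
inside = kde([[12.0], [0.4]])[0]
far_offshore = kde([[12.0], [5.0]])[0]
```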
NASA Technical Reports Server (NTRS)
Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.
2016-01-01
A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management, and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate the conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could be used to estimate the surveillance and intent information quality required to achieve some desired CD performance as part of the design of a new separation assurance system.
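A basic state-based conflict detection function of the kind analyzed here can be sketched with linear extrapolation to the time of closest approach; the separation threshold and trajectories are illustrative:

```python
import numpy as np

def predict_conflict(p1, v1, p2, v2, sep_min, horizon):
    """Linear-extrapolation conflict detection: find the time of closest
    approach within the look-ahead horizon and flag a predicted LOS when
    the miss distance falls below the separation minimum."""
    dp = np.asarray(p1, dtype=float) - np.asarray(p2, dtype=float)
    dv = np.asarray(v1, dtype=float) - np.asarray(v2, dtype=float)
    denom = dv @ dv
    t_cpa = 0.0 if denom == 0 else float(np.clip(-(dp @ dv) / denom,
                                                 0.0, horizon))
    miss = float(np.linalg.norm(dp + dv * t_cpa))
    return miss < sep_min, t_cpa, miss

# One aircraft closing on a slower one along the same airway
# (illustrative units: NM for position, NM/min for velocity).
conflict, t_cpa, miss = predict_conflict(
    p1=(0.0, 0.0), v1=(5.0, 0.0),
    p2=(10.0, 0.0), v2=(0.0, 0.0),
    sep_min=5.0, horizon=20.0)
```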
Simple LED spectrophotometer for analysis of color information.
Kim, Ji-Sun; Kim, A-Hee; Oh, Han-Byeol; Goh, Bong-Jun; Lee, Eun-Suk; Kim, Jun-Sik; Jung, Gu-In; Baek, Jin-Young; Jun, Jae-Hoon
2015-01-01
A spectrophotometer is the basic measuring equipment essential to most research fields requiring samples to be measured, such as physics, biotechnology and food engineering. This paper proposes a system that is able to detect sample concentration and color information by using an LED and a color sensor. Purity and wavelength information can be derived from the CIE chromaticity diagram, and the concentration can be estimated from the purity information. This method is more economical and efficient than existing spectrophotometry, and can also be used by non-specialists. This contribution is applicable to a number of fields because it can be used as a colorimeter to detect the wavelength and purity of samples.
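Excitation purity on the CIE xy diagram is the distance from the white point to the sample, expressed as a fraction of the distance from the white point to the spectral locus along the same direction; a sketch in which the locus intersection point is an assumed, hypothetical value:

```python
import math

def excitation_purity(sample, white, locus_point):
    """Excitation purity on the CIE xy diagram: the sample's distance from
    the white point as a fraction of the full distance to the spectral
    locus in the direction of the dominant wavelength."""
    return math.dist(sample, white) / math.dist(locus_point, white)

# Illustrative values: D65 white point; the locus intersection is a
# hypothetical dominant-wavelength point, not a tabulated coordinate.
white = (0.3127, 0.3290)
locus = (0.640, 0.330)
sample = (0.476, 0.3295)        # sample roughly halfway toward the locus

purity = excitation_purity(sample, white, locus)
```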
Self-organization and entropy reduction in a living cell.
Davies, Paul C W; Rieper, Elisabeth; Tuszynski, Jack A
2013-01-01
In this paper we discuss the entropy and information aspects of a living cell. Particular attention is paid to the information gain on assembling and maintaining a living state. Numerical estimates of the information and entropy reduction are given and discussed in the context of the cell's metabolic activity. We discuss a solution to an apparent paradox that there is less information content in DNA than in the proteins that are assembled based on the genetic code encrypted in DNA. When energy input required for protein synthesis is accounted for, the paradox is clearly resolved. Finally, differences between biological information and instruction are discussed. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
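The per-symbol side of the DNA-versus-protein comparison can be made concrete with a naive equiprobable-symbol count (an illustration only, ignoring codon bias and sequence statistics):

```python
import math

# Naive per-symbol information content, assuming equiprobable symbols.
bits_per_base = math.log2(4)          # 4 nucleotides -> 2 bits per base
bits_per_codon = 3 * bits_per_base    # 3 bases per codon -> 6 bits
bits_per_residue = math.log2(20)      # 20 amino acids -> ~4.32 bits

# A codon carries more raw bits than the residue it encodes, because the
# genetic code is redundant (64 codons map to 20 amino acids plus stops).
# So the apparent paradox cannot come from per-symbol counting alone,
# consistent with resolving it via the energy input of protein synthesis.
redundancy = bits_per_codon - bits_per_residue
```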
Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach
Kneifel, Joshua; Webb, David
2016-01-01
Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine whether building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption.
The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF. PMID:27956756
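The statistical approach can be sketched as an ordinary least-squares fit on two daily weather predictors; the data below are synthetic stand-ins, not NZERTF measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
n_days = 365

# Synthetic daily data standing in for two weather predictors
# (e.g., average temperature and solar irradiance -- illustrative only).
temp = rng.normal(12.0, 8.0, n_days)
solar = np.abs(rng.normal(4.0, 1.5, n_days))
energy = 25.0 + 0.9 * temp - 2.0 * solar + rng.normal(0.0, 1.0, n_days)

# Ordinary least squares: energy ~ intercept + temp + solar.
X = np.column_stack([np.ones(n_days), temp, solar])
beta, *_ = np.linalg.lstsq(X, energy, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((energy - pred) ** 2) / np.sum((energy - energy.mean()) ** 2)
```

Validity hinges on the caveat stated above: such a model only extrapolates reliably to days whose weather resembles the training period.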
Karpf, Christian; Krebs, Peter
2011-05-01
The management of sewer systems requires information about the discharge and variability of typical wastewater sources in urban catchments. The infiltration of groundwater and the inflow of surface water (I/I) are especially important for making decisions about the rehabilitation and operation of sewer networks. This paper presents a methodology to identify I/I and estimate its quantity. For each flow fraction in sewer networks, an individual model approach is formulated whose parameters are optimised by the method of least squares. This method was applied to estimate the contributions to the wastewater flow in the sewer system of the City of Dresden (Germany), where data availability is good. Absolute flows of I/I and their temporal variations are estimated. Further information on the characteristics of infiltration is gained by clustering and grouping sewer pipes according to the attributes construction year and groundwater influence and relating the resulting classes to infiltration behaviour. Further, it is shown that condition classes based on CCTV data can be used to estimate the infiltration potential of sewer pipes. Copyright © 2011 Elsevier Ltd. All rights reserved.
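The least-squares separation of flow fractions can be sketched with two illustrative components, a diurnal foul-sewage pattern and a constant infiltration term (shapes and magnitudes invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
hours = np.arange(0, 24 * 7)           # one week of hourly flow observations

# Illustrative flow fractions: a normalized diurnal foul-sewage pattern
# and a constant groundwater infiltration component.
diurnal = 1.0 + 0.5 * np.sin(2 * np.pi * (hours % 24) / 24 - np.pi / 2)
q_obs = 30.0 * diurnal + 12.0 + rng.normal(0.0, 1.0, hours.size)

# Least-squares fit of q(t) = a * diurnal(t) + b,
# where b is the constant-infiltration estimate.
X = np.column_stack([diurnal, np.ones_like(diurnal)])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, q_obs, rcond=None)
```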
Nelson, Jonathan M.; Kinzel, Paul J.; Schmeeckle, Mark Walter; McDonald, Richard R.; Minear, Justin T.
2016-01-01
Noncontact methods for measuring water-surface elevation and velocity in laboratory flumes and rivers are presented with examples. Water-surface elevations are measured using an array of acoustic transducers in the laboratory and using laser scanning in field situations. Water-surface velocities are based on using particle image velocimetry or other machine vision techniques on infrared video of the water surface. Using spatial and temporal averaging, results from these methods provide information that can be used to develop estimates of discharge for flows over known bathymetry. Making such estimates requires relating water-surface velocities to vertically averaged velocities; the methods here use standard relations. To examine where these relations break down, laboratory data for flows over simple bumps of three amplitudes are evaluated. As anticipated, discharges determined from surface information can have large errors where nonhydrostatic effects are large. In addition to investigating and characterizing this potential error in estimating discharge, a simple method for correction of the issue is presented. With a simple correction based on bed gradient along the flow direction, remotely sensed estimates of discharge appear to be viable.
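The standard relation between surface and depth-averaged velocity is commonly taken as a multiplicative coefficient of about 0.85; a sketch of a discharge estimate built on that assumption, with an invented cross-section:

```python
def discharge_from_surface(widths, depths, v_surface, alpha=0.85):
    """Discharge from remotely sensed surface velocities: convert each
    vertical's surface velocity to a depth-averaged value with a standard
    coefficient (alpha ~ 0.85 is a common assumption, valid only where
    the flow is close to hydrostatic), then sum q = alpha * v * d * w."""
    return sum(alpha * v * d * w for w, d, v in zip(widths, depths, v_surface))

# Illustrative cross-section: three verticals with widths (m), depths (m),
# and measured surface velocities (m/s).
Q = discharge_from_surface(
    widths=[2.0, 3.0, 2.0],
    depths=[0.8, 1.2, 0.7],
    v_surface=[0.9, 1.1, 0.8])
```

Over the bumps studied above, a correction based on the bed gradient would adjust alpha locally where nonhydrostatic effects are large.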
Preliminary evaluation of spectral, normal and meteorological crop stage estimation approaches
NASA Technical Reports Server (NTRS)
Cate, R. B.; Artley, J. A.; Doraiswamy, P. C.; Hodges, T.; Kinsler, M. C.; Phinney, D. E.; Sestak, M. L. (Principal Investigator)
1980-01-01
Several of the projects in the AgRISTARS program require crop phenology information, including classification, acreage and yield estimation, and detection of episodal events. This study evaluates several crop calendar estimation techniques for their potential use in the program. The techniques, although generic in approach, were developed and tested on spring wheat data collected in 1978. There are three basic approaches to crop stage estimation: historical averages for an area (normal crop calendars), agrometeorological modeling of known crop-weather relationships (agromet crop calendars), and interpretation of spectral signatures (spectral crop calendars). In all, 10 combinations of planting and biostage estimation models were evaluated. Dates of stage occurrence are estimated with biases between -4 and +4 days, while root mean square errors range from 10 to 15 days. Results are inconclusive as to the superiority of any of the models, and further evaluation of the models with the 1979 data set is recommended.
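The bias and root-mean-square-error summary used above is straightforward to compute; a sketch with hypothetical stage dates, not the 1978 data:

```python
import numpy as np

# Hypothetical estimated vs. observed dates of a biostage (day of year).
observed = np.array([152, 158, 149, 161, 155])
estimated = np.array([150, 162, 147, 158, 160])

errors = estimated - observed
bias = errors.mean()                   # days; negative means estimates run early
rmse = np.sqrt((errors ** 2).mean())   # days
```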
Wong, Charlene A; Kulhari, Sajal; McGeoch, Ellen J; Jones, Arthur T; Weiner, Janet; Polsky, Daniel; Baker, Tom
2018-05-29
The design of the Affordable Care Act's (ACA) health insurance marketplaces influences complex health plan choices. To compare the choice environments of the public health insurance exchanges in the fourth (OEP4) versus third (OEP3) open enrollment period and to examine online marketplaces run by private companies, including a total cost estimate comparison. In November-December 2016, we examined the public and private online health insurance exchanges. We navigated each site for "real-shopping" (personal information required) and "window-shopping" (no required personal information). Public (n = 13; 12 state-based marketplaces and HealthCare.gov) and private (n = 23) online health insurance exchanges. Features included consumer decision aids (e.g., total cost estimators, provider lookups) and plan display (e.g., order of plans). We examined private health insurance exchanges for notable features (i.e., those not found on public exchanges) and compared the total cost estimates on public versus private exchanges for a standardized consumer. Nearly all studied consumer decision aids saw increased deployment in the public marketplaces in OEP4 compared to OEP3. Over half of the public exchanges (n = 7 of 13) had total cost estimators (versus 5 of 14 in OEP3) in window-shopping and integrated provider lookups (window-shopping: 7; real-shopping: 8). The most common default plan orders were by premium or total cost estimate. Notable features on private health insurance exchanges were unique data presentation (e.g., infographics) and further personalized shopping (e.g., recommended plan flags). Health plan total cost estimates varied substantially between the public and private exchanges (average difference $1526). The ACA's public health insurance exchanges offered more tools in OEP4 to help consumers select a plan.
While private health insurance exchanges presented notable features, the total cost estimates for a standardized consumer varied widely on public versus private exchanges.
Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections
Fisher, Jason T.; Heim, Nicole; Code, Sandra; Paczkowski, John
2016-01-01
Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears’ range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error–arising when a visiting bear fails to leave a hair sample–has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation–which form the crux of management plans–require consideration. 
We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based. PMID:27603134
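The effect of successive surveys on false absences can be illustrated with elementary probability; the per-survey detection probability below is hypothetical, not the study's estimate.

```python
# If a hair trap detects a visiting bear with per-survey probability p,
# the chance a truly occupied site is never detected across k
# independent surveys is (1 - p)**k, and naive occupancy is
# underestimated by that fraction. (This is a simplification of the
# multi-method hierarchical occupancy model used in the study.)

def false_absence(p, k):
    """P(site occupied but never detected in k surveys)."""
    return (1.0 - p) ** k

# Example: a low hair-deposition probability of 0.2 per monthly survey.
for k in (1, 4, 8):
    print(k, false_absence(0.2, k))
```

Even after 8 surveys, a detection probability this low leaves a non-trivial false-absence rate, which is why the authors recommend an independent method such as cameras to estimate it.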
Fast Flood damage estimation coupling hydraulic modeling and Multisensor Satellite data
NASA Astrophysics Data System (ADS)
Fiorini, M.; Rudari, R.; Delogu, F.; Candela, L.; Corina, A.; Boni, G.
2011-12-01
Damage estimation requires a good representation of the elements at risk and their vulnerability, knowledge of the flooded area extension, and a description of the hydraulic forcing. In this work the real-time use of a simplified two-dimensional hydraulic model constrained by satellite-retrieved flooded areas is analyzed. The main features of such a model are computational speed and simple start-up, with no need for complex input information beyond a subset of simplified boundary and initial conditions. These characteristics allow the model to be fast enough for real-time simulation of flooding events. The model fills the gap of information left by single satellite scenes of flooded area, allowing estimation of the maximum flooding extension and magnitude. The static information provided by earth observation (such as the SAR-derived extension of flooded areas at a certain time) is interpreted in a dynamically consistent way, and very useful hydraulic information (e.g., water depth, water speed, and the evolution of flooded areas) is provided. This information is merged with satellite identification of elements exposed to risk, characterized in terms of their vulnerability to floods, in order to obtain fast estimates of flood damages. The model has been applied to several flooding events that occurred worldwide. Among these, activations in Mediterranean areas such as Veneto, Italy (October 2010), Basilicata, Italy (March 2011), and Shkoder (January 2010 and December 2010) are considered and compared with larger floods such as the Queensland event of December 2010.
Assessment of Agricultural Water Management in Punjab, India using Bayesian Methods
NASA Astrophysics Data System (ADS)
Russo, T. A.; Devineni, N.; Lall, U.; Sidhu, R.
2013-12-01
The success of the Green Revolution in Punjab, India is threatened by the declining water table (approx. 1 m/yr). Punjab, a major agricultural supplier for the rest of India, supports irrigation with a canal system and groundwater, which is vastly over-exploited. Groundwater development in many districts is greater than 200% of the annual recharge rate. The hydrologic data required to complete a mass-balance model are not available for this region; therefore we use Bayesian methods to estimate hydrologic properties and irrigation requirements. Using the known values of precipitation, total canal water delivery, crop yield, and water table elevation, we solve for each unknown parameter (often a coefficient) using a Markov chain Monte Carlo (MCMC) algorithm. Results provide regional estimates of irrigation requirements and groundwater recharge rates under observed climate conditions (1972 to 2002). Model results are used to estimate future water availability and demand to help inform agriculture management decisions under projected climate conditions. We find that changing cropping patterns for the region can maintain food production while balancing groundwater pumping with natural recharge. This computational method can be applied in data-scarce regions across the world, where agricultural water management is required to resolve competition between food security and changing resource availability.
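A bare-bones Metropolis MCMC sampler of the kind referenced above can be sketched as follows; the data, prior bounds, and Gaussian likelihood are illustrative stand-ins, not the study's model.

```python
# Metropolis sampling of the posterior of a single unknown coefficient
# (e.g., a scaled recharge coefficient) given noisy observations.
import math
import random

random.seed(1)
obs = [1.1, 0.9, 1.3, 1.0, 1.2]   # hypothetical scaled observations

def log_post(theta):
    # Flat prior on [0, 2]; Gaussian likelihood with sigma = 0.2.
    if not 0.0 <= theta <= 2.0:
        return -math.inf
    return -sum((y - theta) ** 2 for y in obs) / (2 * 0.2 ** 2)

theta, samples = 0.5, []
for _ in range(5000):
    prop = theta + random.gauss(0, 0.1)          # random-walk proposal
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(theta))):
        theta = prop                              # accept
    samples.append(theta)

# Discard burn-in, summarize the posterior.
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

With a flat prior the posterior mean should sit near the sample mean of the observations; in practice each unknown coefficient would be sampled jointly with the others.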
Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach
ERIC Educational Resources Information Center
Rotondi, Michael A.; Donner, Allan
2009-01-01
The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-03
... requirements of FERC-716 (``Good Faith Request for Transmission Service and Response by Transmitting Utility..., provide standards by which the Commission determines if and when a valid good faith request for... 12 components of a good faith estimate and 5 components of a reply to a good faith request. Action...
78 FR 21162 - Notice of Intent to Establish an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-09
... Programs. NCSES, under generic clearance (OMB 3145-0174), has conducted a methodological study to test a.... Estimate of Burden: In the methodological study, HAs required 1 hour on average to complete these tasks...,206 hours. Most ECs were able to complete this task in less than 30 minutes in the methodological...
Optimal sampling for radiotelemetry studies of spotted owl habitat and home range.
Andrew B. Carey; Scott P. Horton; Janice A. Reid
1989-01-01
Radiotelemetry studies of spotted owl (Strix occidentalis) ranges and habitat-use must be designed efficiently to estimate parameters needed for a sample of individuals sufficient to describe the population. Independent data are required by analytical methods and provide the greatest return of information per effort. We examined time series of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-27
... may be submitted to: DHS, USCIS, Office of Policy and Strategy, Chief, Regulatory Coordination... estimates on the burden in terms of time and money incurred by applicants for the following aspects of this... service. The average time required and money expended to secure secondary evidence such as an affidavit...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... Act of 1996 (and reauthorized in 2007), NMFS is required to enumerate the economic impacts of the... allow analysts to estimate the economic contributions and impacts of marine fish processing to each... paper forms. Methods of submittal include email of electronic forms, and mail and facsimile transmission...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-14
... art from living American artists. One-half of one percent of the estimated construction cost of new or... for OMB Review; Art-in- Architecture Program National Artist Registry AGENCY: Public Buildings Service... extension of a previously approved information collection requirement regarding Art-in Architecture Program...
The College Cost Book, 1982-83. Third Edition.
ERIC Educational Resources Information Center
Lovell, Susan, Ed.
Expenses at 3,200 schools and colleges are identified, and information is presented on applying for student financial aid and on estimating financial need. Recent changes in federal aid programs, eligibility requirements, and major programs are also outlined. In considering the cost of college, the following areas need to be addressed: tuition and…
The College Cost Book, 1984-85. Fifth Edition.
ERIC Educational Resources Information Center
Brouder, Kathleen
Expenses at 3,500 colleges, universities, and proprietary schools are identified, and information is presented on applying for student financial aid and on estimating financial need. Recent changes in federal aid programs and eligibility requirements are also outlined. In considering the cost of college, the following areas need to be addressed:…
Extensive, strategic assessment of southeast Alaska's vegetative resources.
Willem W.S. van Hees; Bert R. Mead
2005-01-01
Effective natural resources management requires knowledge of the character of resources and of interactions between resource components. Estimates of forest and other vegetation resources are presented to provide managers with information about the character of the resource. Slightly less than half (48%) of southeast Alaska has some type of forest land cover, about 29...
46 CFR 164.008-7 - Procedure for approval.
Code of Federal Regulations, 2013 CFR
2013-10-01
... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...
40 CFR 270.290 - What general types of information must I keep at my facility?
Code of Federal Regulations, 2011 CFR
2011-07-01
... and power outages, (5) Prevent undue exposure of personnel to hazardous waste (for example, requiring.... (n) [Reserved] (o) The most recent closure cost estimate for your facility prepared under 40 CFR 267... land uses (residential, commercial, agricultural, recreational). (5) A wind rose (i.e., prevailing wind...
An identifiable model for informative censoring
Link, W.A.; Wegman, E.J.; Gantz, D.T.; Miller, J.J.
1988-01-01
The usual model for censored survival analysis requires the assumption that censoring of observations arises only due to causes unrelated to the lifetime under consideration. It is easy to envision situations in which this assumption is unwarranted, and in which use of the Kaplan-Meier estimator and associated techniques will lead to unreliable analyses.
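For reference, the Kaplan-Meier product-limit estimator discussed above can be written in a few lines; under informative censoring this estimate is biased, which is the paper's point of departure.

```python
# Kaplan-Meier: at each event time t, survival is multiplied by
# (1 - d/n), where d is the number of events at t and n the number
# still at risk just before t. Censored observations only shrink the
# risk set; the estimator assumes censoring is noninformative.

def kaplan_meier(times, events):
    """times: observation times; events: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    n, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = at_t = 0
        while i < len(data) and data[i][0] == t:   # group ties at t
            at_t += 1
            d += data[i][1]
            i += 1
        if d:                                       # event time: step down
            s *= 1.0 - d / n
            curve.append((t, s))
        n -= at_t                                   # remove from risk set
    return curve

# Toy data: events at t = 2, 3, 5; censored at t = 3 and 7.
km = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
```

The survival curve steps down only at event times; if the censored subjects at t = 3 and 7 were in fact sicker than average, the curve would be optimistically biased, exactly the failure mode the identifiable model addresses.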
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-23
... requires station licensees to measure the carrier frequency, output power, and modulation of each..., 409,048 responses. Estimated Time per Response: .033 hours. Frequency of Response: Recordkeeping... installed and when any changes are made which would likely affect the modulation characteristics. Such...
76 FR 44600 - Renewal of Approved Information Collection, OMB Control Number 1004-0201
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-26
..., production, resource recovery and protection, operations under oil shale leases, and exploration under leases... requirements in 43 CFR parts 3900, 3910, 3920, and 3930, which pertain to management of oil shale. DATES... the agency's burden estimates; (3) ways to enhance the quality, utility, and clarity of the...
76 FR 33768 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-09
... inability to pay energy bills; (3) increase the efficiency of energy usage by low-income families, helping... hours Total burden respondents respondent per response hours REACH Model Plan 51 1 72 3,672 Estimated Total Annual Burden Hours: 3,672. In compliance with the requirements of Section 3506(c)(2)(A) of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-09
... required consists of grant application preparation, quarterly reports and electronic data documenting the...: 52. Estimated Time per Response: Grant application preparation: 79.5 hours each; quarterly report preparation: 8 hours each; and inspection and data upload: 1 minute each. Expiration Date: February 28, 2011...
On the potential use of radar-derived information in operational numerical weather prediction
NASA Technical Reports Server (NTRS)
Mcpherson, R. D.
1986-01-01
Estimates of requirements likely to be levied on a new observing system for mesoscale meteorology are given. Potential observing systems for mesoscale numerical weather prediction are discussed. Thermodynamic profiler radiometers, infrared radiometer atmospheric sounders, Doppler radar wind profilers and surveillance radar, and moisture profilers are among the instruments described.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
...-Marking of Plastic Explosives for the Purpose of Detection ACTION: 30-Day notice. The Department of...) Title of the Form/Collection: Statement of Process-Marking of Plastic Explosives for the Purpose of... used to ensure that plastic explosives contain a detection agent as required by law. (5) An estimate of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
..., including validity of the methodology and assumptions used; (c) ways to enhance the quality, utility and... collection. Abstract: 7 CFR 273.7(c)(9) requires State agencies to submit quarterly E&T Program Activity... filed Total number of Estimated Section of regulation Title respondents annually responses (C x hours...
Improving detection of psychiatric disturbances in Parkinson's disease: the role of informants.
Hirsch, Elaina S; Adler, Geri; Amspoker, Amber B; Williams, James R; Marsh, Laura
2013-01-01
Under-recognition of psychiatric disturbances in patients with Parkinson's disease (PD) contributes to greater overall morbidity. Little is known about the value of collateral psychiatric history, obtained using standardized assessments with informants, for increasing recognition of PD-related psychiatric illness. To examine the extent to which informants provide critical information that enabled psychiatrists to establish psychiatric diagnoses in patients with PD. Individuals with PD (n = 223) and an informant were interviewed separately regarding the PD patient's psychiatric history and current status. A six-psychiatrist panel rated the extent to which informant data were required to establish the final consensus best-estimate current psychiatric diagnoses. Informants rated as "Crucial" or "Significantly Informative" constituted a "Critical Informant" (CI) subgroup; remaining informants were classified as the "Non-Critical Informant" (NCI) subgroup. Of the informants, 71 (31.4%) were "critical" for determining a psychiatric diagnosis. Without a CI, 81.3% of those with impulse control disorders and 43.8% of those with anxiety disorders would not have been diagnosed. Male PD patients and those with less severe motor deficits were also more likely to require a CI. Informants aid in the identification of psychiatric diagnoses, especially impulse control and anxiety disorders. This has implications for clinical practice and the conduct of clinical trials.
NASA Astrophysics Data System (ADS)
Darmenova, K.; Higgins, G.; Kiley, H.; Apling, D.
2010-12-01
Current General Circulation Models (GCMs) provide a valuable estimate of both natural and anthropogenic climate changes and variability on global scales. At the same time, future climate projections calculated with GCMs are not of sufficient spatial resolution to address regional needs. Many climate impact models require information at scales of 50 km or less, so dynamical downscaling is often used to estimate the smaller-scale information based on larger scale GCM output. To address current deficiencies in local planning and decision making with respect to regional climate change, our research is focused on performing a dynamical downscaling with the Weather Research and Forecasting (WRF) model and developing decision aids that translate the regional climate data into actionable information for users. Our methodology involves development of climatological indices of extreme weather and heating/cooling degree days based on WRF ensemble runs initialized with the NCEP-NCAR reanalysis and the European Center/Hamburg Model (ECHAM5). Results indicate that the downscale simulations provide the necessary detailed output required by state and local governments and the private sector to develop climate adaptation plans. In addition we evaluated the WRF performance in long-term climate simulations over the Southwestern US and validated against observational datasets.
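The heating/cooling degree-day indices mentioned above follow a standard definition; this sketch assumes a 65 °F base temperature, which may differ from the thresholds used in the study.

```python
# Heating degree days (HDD) accumulate how far daily mean temperature
# falls below the base; cooling degree days (CDD) accumulate how far
# it rises above. Downscaled model output supplies the daily means.

BASE_F = 65.0  # assumed base temperature (deg F)

def degree_days(daily_mean_temps_f, base=BASE_F):
    hdd = sum(max(0.0, base - t) for t in daily_mean_temps_f)
    cdd = sum(max(0.0, t - base) for t in daily_mean_temps_f)
    return hdd, cdd

# One illustrative week of daily mean temperatures (deg F).
temps = [58, 62, 66, 71, 75, 64, 60]
hdd, cdd = degree_days(temps)
```

Indices like these are exactly the kind of "actionable information" a 50 km GCM grid cannot supply directly, since they depend on local daily temperature distributions.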
NASA Astrophysics Data System (ADS)
Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; Werth, Charles J.; Valocchi, Albert J.
2016-07-01
Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydrogeophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the principal component geostatistical approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed by traditional inversion methods. PCGA can be conveniently linked to any multiphysics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Only about 2000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.
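The temporal-moment compression can be sketched as follows. The abstract notes that under the experimental setting the zeroth moment is equivalent to the mean travel time; this sketch computes both the zeroth moment and the normalized first moment (the general definition of mean travel time) from a synthetic breakthrough curve standing in for the MRI data.

```python
# Temporal moments of a breakthrough curve c(t): the zeroth moment is
# the time integral of concentration, and the first moment normalized
# by the zeroth gives the mean travel time. Trapezoidal integration.

def moments(t, c):
    def trapz(y):
        return sum((y[i] + y[i + 1]) * (t[i + 1] - t[i]) / 2.0
                   for i in range(len(t) - 1))
    m0 = trapz(c)
    m1 = trapz([ti * ci for ti, ci in zip(t, c)])
    return m0, m1 / m0   # (zeroth moment, mean travel time)

# Synthetic symmetric pulse centered at t = 2.
t = [0.0, 1.0, 2.0, 3.0, 4.0]
c = [0.0, 2.0, 4.0, 2.0, 0.0]
m0, tbar = moments(t, c)
```

Compressing each voxel's transient record to one or two moments is what turns ~6 million raw measurements into a tractable inverse problem.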
Herculano-Houzel, Suzana; von Bartheld, Christopher S; Miller, Daniel J; Kaas, Jon H
2015-04-01
The number of cells comprising biological structures represents fundamental information in basic anatomy, development, aging, drug tests, pathology and genetic manipulations. Obtaining unbiased estimates of cell numbers, however, was until recently possible only through stereological techniques, which require specific training, equipment, histological processing and appropriate sampling strategies applied to structures with a homogeneous distribution of cell bodies. An alternative, the isotropic fractionator (IF), became available in 2005 as a fast and inexpensive method that requires little training, no specific software and only a few materials before it can be used to quantify total numbers of neuronal and non-neuronal cells in a whole organ such as the brain or any dissectible regions thereof. This method entails transforming a highly anisotropic tissue into a homogeneous suspension of free-floating nuclei that can then be counted under the microscope or by flow cytometry and identified morphologically and immunocytochemically as neuronal or non-neuronal. We compare the advantages and disadvantages of each method and provide researchers with guidelines for choosing the best method for their particular needs. IF is as accurate as unbiased stereology and faster than stereological techniques, as it requires no elaborate histological processing or sampling paradigms, providing reliable estimates in a few days rather than many weeks. Tissue shrinkage is also not an issue, since the estimates provided are independent of tissue volume. The main disadvantage of IF, however, is that it necessarily destroys the tissue analyzed and thus provides no spatial information on the cellular composition of biological regions of interest.
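The scaling arithmetic behind the isotropic fractionator reduces to counting nuclei in small aliquots of the homogeneous suspension and multiplying up; all numbers below are illustrative, not published counts.

```python
# IF estimate: mean nuclei count per aliquot -> density per mL ->
# total nuclei in the suspension; the immunolabeled (e.g., NeuN+)
# fraction then splits the total into neurons and non-neurons.

def if_estimate(counts_per_aliquot, aliquot_ul, suspension_ml, neun_fraction):
    mean_count = sum(counts_per_aliquot) / len(counts_per_aliquot)
    density_per_ml = mean_count / (aliquot_ul / 1000.0)  # nuclei per mL
    total = density_per_ml * suspension_ml
    return total, total * neun_fraction                   # (all cells, neurons)

total, neurons = if_estimate(
    counts_per_aliquot=[110, 95, 105, 90],  # nuclei per 10-uL aliquot
    aliquot_ul=10.0,
    suspension_ml=40.0,
    neun_fraction=0.6)                      # hypothetical NeuN+ fraction
```

Because the estimate depends only on counts and volumes of the suspension, tissue shrinkage during processing does not bias it, which is one of the advantages the review highlights.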
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramusch, R., E-mail: roland.ramusch@boku.ac.at; Pertl, A.; Scherhaufer, S.
Highlights: • Informal collectors from Hungary collect bulky waste and reusable items in Austria. • Two methodologies were applied to estimate the informally collected quantities. • Both approaches lead to an estimation of roughly 100,000 t p.a. informally collected. • The formal Austrian system collects 72 kg/cap/yr of bulky waste, WEEE & scrap metal. • Informal collection amounts to approx. 12 kg/cap/yr. - Abstract: Disparities in earnings between Western and Eastern European countries are the reason for a well-established informal sector actively involved in collection and transboundary shipment activities from Austria to Hungary. The preferred objects are reusable items and wastes within the categories bulky waste, WEEE and metals, intended to be sold on flea markets. Despite leading to a loss of recyclable resources for Austrian waste management, these informal activities may contribute to the extension of the lifetime of certain goods when they are reused in Hungary; nevertheless they are discussed rather controversially. The aim of this paper is to provide objective data on the quantities informally collected and transhipped. The unique activities of informal collectors required the development and implementation of a new set of methodologies. The concept of triangulation was used to verify results obtained by field visits, interviews and a traffic counting campaign. Both approaches lead to an estimation of approx. 100,000 t per year of reusable items informally collected in Austria. This means that in addition to the approx. 72 kg/cap/yr formally collected bulky waste, bulky waste wood, household scrap (excluding packaging) and WEEE, up to a further 12 kg/cap/yr might, in the case that informal collection is abandoned, end up as waste or in the second-hand sector.
Fuel Economy Label and CAFE Data Inventory
The Fuel Economy Label and CAFE Data asset contains measured summary fuel economy estimates and test data for light-duty vehicle manufacturers by model, collected for certification as required under the Energy Policy and Conservation Act of 1975 (EPCA) and the Energy Independence and Security Act of 2007 (EISA), for the creation of Fuel Economy Labels and for the calculation of Corporate Average Fuel Economy (CAFE). Manufacturers submit data on an annual basis, or as needed to document vehicle model changes. The EPA performs targeted fuel economy confirmatory tests on approximately 15% of vehicles submitted for validation. Confirmatory data on vehicles are associated with the corresponding submission data to verify the accuracy of manufacturer submissions beyond standard business rules. Submitted data come in XML format or as documents, with the majority of submissions sent in XML, and include descriptive information on the vehicle itself, fuel economy information, and the manufacturer's testing approach. This data may contain confidential business information (CBI) such as information on estimated sales or other data elements indicated by the submitter as confidential. CBI data are not publicly available; however, within the EPA the data can be accessed under the restrictions of the Office of Transportation and Air Quality (OTAQ) CBI policy [RCS Link]. Datasets are segmented by vehicle model/manufacturer and/or year with corresponding fuel economy, te
78 FR 35287 - Agency Information Collection Activities; Proposed Collection; Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-12
...In compliance with the requirement for opportunity for public comment on proposed data collection projects (Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995), the Health Resources and Services Administration (HRSA) announces plans to submit an Information Collection Request (ICR), described below, to the Office of Management and Budget (OMB). Prior to submitting the ICR to OMB, HRSA seeks comments from the public regarding the burden estimate, below, or any other aspect of the ICR.
78 FR 69695 - Agency Information Collection Activities: Proposed Collection: Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
...In compliance with the requirement for opportunity for public comment on proposed data collection projects (Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995), the Health Resources and Services Administration (HRSA) announces plans to submit an Information Collection Request (ICR), described below, to the Office of Management and Budget (OMB). Prior to submitting the ICR to OMB, HRSA seeks comments from the public regarding the burden estimate, below, or any other aspect of the ICR.
[Book review] Birds in Europe: Population estimates, trends and conservation status
Peterjohn, Bruce G.
2006-01-01
Effective bird conservation requires knowledge of distribution, relative abundance, and population trends at multiple geographic scales. Obtaining this information for a continental avifauna poses considerable challenges, especially in Europe with its 52 countries, numerous languages and cultures, and disparate resources available for monitoring bird populations within each country. Synthesizing the available information on the status and trends of all European birds into a single volume is an enormous yet essential task necessary to direct bird conservation activities across the continent.
Design and Development of a Prototype Organizational Effectiveness Information System
1984-11-01
information from a large number of people. The existing survey support process for the GOQ is not satisfactory. Most OESOs elect not to use it, because... reporting process uses screen queries and menus to simplify data entry, it is estimated that only 4-6 hours of data entry time would be required for ...description for the file named EVEDIR. The Resource System allows users of the Event Directory to select from the following processing options. o Add a new
Hubble Space Telescope cycle 5. Phase 1: Proposal instructions, version 4.0
NASA Technical Reports Server (NTRS)
Madau, Piero (Editor)
1994-01-01
This document has the following purposes: it describes the information that must be submitted to the Space Telescope Science Institute by Phase 1 proposers, both electronically and on paper, and describes how to submit it; it describes how to fill out the proposal LATEX templates; it describes how to estimate the number of spacecraft orbits that the proposed observations will require; it provides detailed information about the parameters that are used in the forms to describe the requested observations; and it provides information about the preparation and electronic submission of proposal files. Examples of completed proposal forms are included.
SAMICS support study. Volume 1: Cost account catalog
NASA Technical Reports Server (NTRS)
1977-01-01
The Jet Propulsion Laboratory (JPL) is examining the feasibility of a new industry to produce photovoltaic solar energy collectors similar to those used on spacecraft. To do this, a standardized costing procedure was developed. The Solar Array Manufacturing Industry Costing Standards (SAMICS) support study supplies the following information: (1) SAMICS critique; (2) Standard data base--cost account structure, expense item costs, inflation rates, indirect requirements relationships, and standard financial parameter values; (3) Facilities capital cost estimating relationships; (4) Conceptual plant designs; (5) Construction lead times; (6) Production start-up times; (7) Manufacturing price estimates.
Automated assessment of noninvasive filling pressure using color Doppler M-mode echocardiography
NASA Technical Reports Server (NTRS)
Greenberg, N. L.; Firstenberg, M. S.; Cardon, L. A.; Zuckerman, J.; Levine, B. D.; Garcia, M. J.; Thomas, J. D.
2001-01-01
Assessment of left ventricular filling pressure usually requires invasive hemodynamic monitoring to follow the progression of disease or the response to therapy. Previous investigations have shown accurate estimation of wedge pressure using noninvasive Doppler information obtained from the ratio of the wave propagation slope from color M-mode (CMM) images and the peak early diastolic filling velocity from transmitral Doppler images. This study reports an automated algorithm that derives an estimate of wedge pressure based on the spatiotemporal velocity distribution available from digital CMM Doppler images of LV filling.
Glacier volume estimation of Cascade Volcanoes—an analysis and comparison with other methods
Driedger, Carolyn L.; Kennard, P.M.
1986-01-01
During the 1980 eruption of Mount St. Helens, the occurrence of floods and mudflows made apparent a need to assess mudflow hazards on other Cascade volcanoes. A basic requirement for such analysis is information about the volume and distribution of snow and ice on these volcanoes. An analysis was made of the volume-estimation methods developed by previous authors and a volume estimation method was developed for use in the Cascade Range. A radio echo-sounder, carried in a backpack, was used to make point measurements of ice thickness on major glaciers of four Cascade volcanoes (Mount Rainier, Washington; Mount Hood and the Three Sisters, Oregon; and Mount Shasta, California). These data were used to generate ice-thickness maps and bedrock topographic maps for developing and testing volume-estimation methods. Subsequently, the methods were applied to the unmeasured glaciers on those mountains and, as a test of the geographical extent of applicability, to glaciers beyond the Cascades having measured volumes. Two empirical relationships were required in order to predict volumes for all the glaciers. Generally, for glaciers less than 2.6 km in length, volume was found to be estimated best by using glacier area, raised to a power. For longer glaciers, volume was found to be estimated best by using a power law relationship, including slope and shear stress. The necessary variables can be estimated from topographic maps and aerial photographs.
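The area-based relationship for shorter glaciers can be sketched as a power law; note that the coefficient and exponent below are illustrative placeholders, not the constants actually fitted by Driedger and Kennard:

```python
def glacier_volume_km3(area_km2, c=0.03, gamma=1.36):
    """Volume-area scaling V = c * A**gamma for glaciers under ~2.6 km long.

    c and gamma are hypothetical values for illustration only; the study
    fits its own constants, and applies a separate power law involving
    slope and shear stress to longer glaciers.
    """
    return c * area_km2 ** gamma
```

Both inputs (area from topographic maps or aerial photographs) are obtainable without field measurements, which is the method's practical appeal.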
A fast and objective multidimensional kernel density estimation method: fastKDE
O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...
2016-03-07
Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10^5 samples only takes 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy that is comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
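For contrast with fastKDE's FFT-based, objectively bandwidth-selected estimator, the brute-force baseline it accelerates is the plain multidimensional Gaussian KDE, sketched here in 2-D with a fixed, hand-picked bandwidth:

```python
import numpy as np

def gaussian_kde_2d(samples, points, bandwidth):
    """Evaluate a 2-D Gaussian KDE at `points`.

    samples: (n, 2) data array; points: (m, 2) evaluation locations;
    bandwidth: scalar h. This is the O(n*m) direct estimator; fastKDE
    replaces both the fixed h and the direct summation with an
    objective, FFT-based scheme.
    """
    diff = points[:, None, :] - samples[None, :, :]   # (m, n, 2) pairwise offsets
    sq = (diff ** 2).sum(axis=-1) / bandwidth ** 2    # squared Mahalanobis-like distance
    norm = 1.0 / (2 * np.pi * bandwidth ** 2 * len(samples))
    return norm * np.exp(-0.5 * sq).sum(axis=1)
```

The repeated evaluation of this sum for candidate bandwidths is exactly the cost that makes classical objective bandwidth selection expensive.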
Reducing Interpolation Artifacts for Mutual Information Based Image Registration
Soleimani, H.; Khosravifard, M.A.
2011-01-01
Medical image registration methods that use mutual information as a similarity measure have improved considerably in recent decades. Mutual information is a basic concept of information theory that indicates the dependency of two random variables (or two images). Evaluating the mutual information of two images requires their joint probability distribution. Several interpolation methods, such as Partial Volume (PV) and bilinear, are used to estimate the joint probability distribution. Both of these methods introduce artifacts into the mutual information function. The Partial Volume-Hanning window (PVH) and Generalized Partial Volume (GPV) methods were introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel function but to the number of pixels that participate in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method that uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of Computed Tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
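The similarity measure itself is straightforward once a joint histogram is in hand; a minimal sketch (the interpolation scheme used to resample one image during registration, which is the paper's actual subject, is omitted here):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information of two equal-sized images via a joint histogram.

    Registration maximizes this quantity over candidate transforms; the
    artifacts discussed above arise from how img_b is interpolated when
    building the joint histogram, not from this formula.
    """
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability estimate
    px = pxy.sum(axis=1, keepdims=True)       # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of img_b
    nz = pxy > 0                              # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

Identical images give high mutual information; independent images give a value near zero (positive only through histogram-sampling bias).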
Flood frequency analysis - the challenge of using historical data
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn
2015-04-01
Estimates of high flood quantiles are needed for many applications; e.g., dam safety assessments are based on the 1000-year flood, whereas the dimensioning of important infrastructure requires estimates of the 200-year flood. Flood quantiles are estimated by fitting a parametric distribution to a dataset of high flows comprising either annual maximum values or peaks over a selected threshold. Since the record length is short compared to the desired return period, the estimated flood magnitudes rest on a high degree of extrapolation. For example, the longest time series available in Norway are around 120 years, so any estimate of a 1000-year flood requires extrapolation. One solution is to extend the temporal dimension of a data series by including information about historical floods that occurred before streamflow was systematically gauged. Such information can come from flood marks or written documentation of flood events. The aim of this study was to evaluate the added value of using historical flood data for at-site flood frequency estimation. The historical floods were included in two ways, by assuming that: (1) the size of (all) floods above a high threshold within a time interval is known; and (2) the number of floods above a high threshold within a time interval is known. We used a Bayesian model formulation, with MCMC for model estimation. This estimation procedure allowed us to estimate the predictive uncertainty of flood quantiles (i.e., both sampling and parameter uncertainty are accounted for). We tested the methods using 123 years of systematic data from Bulken in western Norway. In 2014 the largest flood in the systematic record was observed. From written documentation and flood marks we had information on three severe floods in the 18th century that likely exceeded the 2014 flood. We evaluated the added value in two ways.
First we used the 123-year streamflow time series and investigated the effect of having several shorter series that could be supplemented with a limited number of known large flood events. Then we used the three historical floods from the 18th century combined with the whole and subsets of the 123 years of systematic observations. In the latter case several challenges were identified: (i) the difficulty of converting historical water levels to streamflows, owing to man-made changes in the river profile; and (ii) the stationarity of the data might be questioned, since the three largest historical floods occurred during the "Little Ice Age" under climatic conditions different from today's.
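The core fitting step can be illustrated with a much simpler stand-in than the study's Bayesian MCMC approach: a Gumbel distribution fitted to annual maxima by the method of moments. This deliberately omits the historical-flood likelihood terms and all uncertainty quantification that the abstract describes:

```python
import math

def gumbel_quantile(annual_maxima, return_period):
    """T-year flood from a Gumbel fit by the method of moments.

    An illustrative simplification of the study's Bayesian estimation:
    no historical-flood information, no predictive uncertainty.
    """
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    p = 1 - 1 / return_period             # non-exceedance probability
    return mu - beta * math.log(-math.log(p))
```

Because the 1000-year quantile sits far beyond a 120-year record, small changes in the fitted scale parameter translate into large changes in the estimate, which is exactly why the extra historical information is valuable.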
Lee, Eugenia E; Stewart, Barclay; Zha, Yuanting A; Groen, Thomas A; Burkle, Frederick M; Kushner, Adam L
2016-08-10
Climate extremes will increase the frequency and severity of natural disasters worldwide. Climate-related natural disasters were anticipated to affect 375 million people in 2015, more than 50% greater than the yearly average in the previous decade. To inform surgical assistance preparedness, we estimated the number of surgical procedures needed. The numbers of people affected by climate-related disasters from 2004 to 2014 were obtained from the Centre for Research on the Epidemiology of Disasters database. Using 5,000 procedures per 100,000 persons as the minimum, baseline estimates were calculated. A linear regression of the number of surgical procedures performed annually against the estimated number of surgical procedures required for climate-related natural disasters was performed. Approximately 140 million people were affected by climate-related natural disasters annually, requiring 7.0 million surgical procedures. The greatest need for surgical care was in the People's Republic of China, India, and the Philippines. Linear regression demonstrated a poor relationship between national surgical capacity and estimated need for surgical care resulting from natural disasters, but countries with the least surgical capacity will have the greatest need for surgical care for persons affected by climate-related natural disasters. As climate extremes increase the frequency and severity of natural disasters, millions will need surgical care beyond baseline needs. Countries with insufficient surgical capacity will have the most need for surgical care for persons affected by climate-related natural disasters. Estimates of surgical need are particularly important for countries least equipped to meet surgical care demands, given critical human and physical resource deficiencies.
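The baseline calculation is a simple rate scaling, which reproduces the 7.0 million figure from the 140 million affected persons:

```python
def surgical_procedures_needed(people_affected, rate_per_100k=5000):
    """Minimum surgical procedures implied by the study's baseline rate
    of 5,000 procedures per 100,000 persons."""
    return people_affected * rate_per_100k / 100_000

# 140 million affected annually -> 7.0 million procedures
annual_need = surgical_procedures_needed(140_000_000)
```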
Analysis of long term trends of precipitation estimates acquired using radar network in Turkey
NASA Astrophysics Data System (ADS)
Tugrul Yilmaz, M.; Yucel, Ismail; Kamil Yilmaz, Koray
2016-04-01
Precipitation estimates, a vital input to many hydrological and agricultural studies, can be obtained from many different platforms (ground station-, radar-, model-, and satellite-based). Satellite- and model-based estimates are spatially continuous datasets, but they lack the high-resolution information many applications require. Station-based values are actual precipitation observations, but by nature they are point data. These datasets may be interpolated, but such end products can have large errors over remote locations whose climate or topography differs from the areas where the stations are installed. Radars have the particular advantage of providing high-spatial-resolution information over land, even though the accuracy of radar-based precipitation estimates depends on the Z-R relationship, mountain blockage, target distance from the radar, spurious echoes resulting from anomalous propagation of the radar beam, bright-band contamination, and ground clutter. A viable method to obtain spatially and temporally high-resolution, consistent precipitation information is to merge radar and station data, taking advantage of each retrieval platform. An optimally merged product is particularly important in Turkey, where complex topography exerts strong controls on the precipitation regime and in turn hampers observation efforts. There are currently 10 weather radars over Turkey (an additional 7 are planned) that have been collecting precipitation information since 2007. This study aims to optimally merge radar precipitation data with station-based observations to introduce a station-radar blended precipitation product. This study was supported by TUBITAK fund # 114Y676.
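The abstract does not specify the merging scheme, but one common approach to optimally blending two co-located estimates is an inverse-variance weighting, sketched here purely as an assumed illustration:

```python
def merge_precip(radar_mm, station_mm, radar_var, station_var):
    """Inverse-variance blend of co-located radar and gauge values.

    Illustrative assumption only: the study's actual merging method is
    not described in the abstract. Weights are inversely proportional
    to each platform's error variance, so the more reliable source
    dominates the blended value.
    """
    w = station_var / (radar_var + station_var)  # weight on the radar value
    return w * radar_mm + (1 - w) * station_mm
```

Operational schemes typically extend this idea spatially (e.g., kriging the gauge-radar differences) rather than blending point by point.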
Estimation of optimal hologram recording modes on photothermal materials
NASA Astrophysics Data System (ADS)
Dzhamankyzov, Nasipbek Kurmanalievich; Ismanov, Yusupzhan Khakimzhanovich; Zhumaliev, Kubanychbek Myrzabekovich; Alymkulov, Samsaly Amanovich
2018-01-01
A theoretical analysis of the hologram recording process on photothermal media is considered, to estimate the laser radiation power required for information recording as a function of spatial frequency and exposure duration. The analysis showed that materials with low thermal diffusivity are necessary to increase the recording density in these media, and that recording should be performed with short pulses to minimize the thermal diffusion length. A solution of the heat conduction equation for photothermal materials heated by an interference laser field was found. The solution allows one to determine the required recording temperature for given spatial frequencies, depending on the thermophysical parameters of the medium and on the power and duration of the heating radiation.
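The short-pulse argument rests on the thermal diffusion length; using the common scaling L ~ sqrt(alpha * t) (one of several conventions differing by a constant factor), it can be sketched as:

```python
import math

def thermal_diffusion_length(diffusivity_m2_s, pulse_s):
    """Characteristic heat-spread distance during a recording pulse.

    Uses the common order-of-magnitude scaling L = sqrt(alpha * t).
    To keep recorded fringes from washing out, L should stay well
    below the grating period 1/f of the target spatial frequency f,
    which is why low diffusivity and short pulses are favored above.
    """
    return math.sqrt(diffusivity_m2_s * pulse_s)
```

For example, a material with diffusivity 1e-7 m²/s exposed for 10 ns spreads heat over tens of nanometres, compatible with high spatial frequencies.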
User's guide for RAM. Volume II. Data preparation and listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, D.B.; Novak, J.H.
1978-11-01
The information presented in this user's guide is directed to air pollution scientists having an interest in applying air quality simulation models. RAM is a method of estimating short-term dispersion using the Gaussian steady-state model. These algorithms can be used for estimating air quality concentrations of relatively nonreactive pollutants for averaging times from an hour to a day from point and area sources. The algorithms are applicable for locations with level or gently rolling terrain where a single wind vector for each hour is a good approximation to the flow over the source area considered. Calculations are performed for each hour. Hourly meteorological data required are wind direction, wind speed, temperature, stability class, and mixing height. Emission information required of point sources consists of source coordinates, emission rate, physical height, stack diameter, stack gas exit velocity, and stack gas temperature. Emission information required of area sources consists of southwest corner coordinates, source side length, total area emission rate, and effective area source height. Computation time is kept to a minimum by the manner in which concentrations from area sources are estimated: a narrow-plume hypothesis is used, and the area source squares are treated as given rather than breaking all sources down into uniform elements. Options allow the user three different types of receptor locations: (1) those whose coordinates are input by the user, (2) those whose coordinates are determined by the model, downwind of significant point and area sources where maxima are likely to occur, and (3) those whose coordinates are determined by the model to give good area coverage of a specific portion of the region. Computation time is also decreased by keeping the number of receptors to a minimum. Volume II presents RAM example outputs, typical run streams, variable glossaries, and Fortran source codes.
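The Gaussian steady-state model underlying RAM has a standard textbook form for a point source with ground reflection; the sketch below shows that generic form, not RAM's Fortran implementation, and its dispersion parameters would in practice come from stability-class curves not reproduced here:

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, y, z, h):
    """Gaussian steady-state plume concentration for a point source.

    q: emission rate (g/s); u: wind speed (m/s); sigma_y, sigma_z:
    horizontal/vertical dispersion parameters (m) at the receptor's
    downwind distance; y: crosswind offset (m); z: receptor height (m);
    h: effective stack height (m). The second vertical term is the
    image source that models reflection from the ground.
    """
    lateral = math.exp(-0.5 * (y / sigma_y) ** 2)
    vertical = (math.exp(-0.5 * ((z - h) / sigma_z) ** 2)
                + math.exp(-0.5 * ((z + h) / sigma_z) ** 2))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

Concentrations fall off as a Gaussian in the crosswind direction, which is why the plume centerline receptors in option (2) are where maxima are likely to occur.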
NASA Astrophysics Data System (ADS)
Pardo-Iguzquiza, Eulogio; Juan Collados Lara, Antonio; Pulido-Velazquez, David
2016-04-01
The snow availability in Alpine catchments is essential for the economy of these areas: it plays an important role in tourist development and in the management of water resources. Snow is an important water resource in many river basins with mountains in the catchment area. Determining the snow water equivalent requires estimating the evolution of the snow pack (cover area, thickness, and snow density) over time. Although complex physical models of snow-pack dynamics exist, the available data are sometimes scarce, and a stochastic model such as a cellular automaton (CA) can be of great practical interest. A CA can model the dynamics of growth and wane of the snow pack. The CA is calibrated with historical data, which requires determining transition rules capable of modeling the evolution of the spatial pattern of the snow cover area. Furthermore, a CA requires the definition of states and neighborhoods; we include topographic and climatic variables in the state of each pixel. The evolution of snow cover in a pixel depends on its state, the states of the neighboring pixels, and the transition rules. The CA is calibrated using daily MODIS data, available from 24/02/2002 to the present with a spatial resolution of 500 m, and LANDSAT imagery, available with a sixteen-day period from 1984 to the present with a spatial resolution of 30 m. The methodology has been applied to estimate the snow cover area of the Sierra Nevada mountain range in southern Spain, obtaining daily snow cover information at 500 m resolution for the period 1980-2014. Acknowledgments: This research has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank NASA DAAC and the LANDSAT project for the data provided for this study.
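A minimal binary snow-cover CA illustrates the state/neighborhood/transition-rule machinery described above; the rules here are an invented simplification, since the actual calibrated rules also condition on topographic and climatic variables:

```python
import numpy as np

def ca_step(snow, melt_prob, accum_prob, rng):
    """One synchronous update of a binary snow-cover cellular automaton.

    snow: 2-D boolean grid (True = snow-covered). A snow-free cell gains
    snow with probability accum_prob scaled by the fraction of its four
    snow-covered neighbours; a snow-covered cell melts with probability
    melt_prob scaled by the fraction of snow-free neighbours. These
    rules are illustrative assumptions, not the study's calibrated ones.
    """
    padded = np.pad(snow, 1, constant_values=False)
    neighbours = (padded[:-2, 1:-1].astype(int) + padded[2:, 1:-1]
                  + padded[1:-1, :-2] + padded[1:-1, 2:])
    frac = neighbours / 4.0                      # snow-covered neighbour fraction
    r = rng.random(snow.shape)
    grow = ~snow & (r < accum_prob * frac)       # accumulation rule
    melt = snow & (r < melt_prob * (1 - frac))   # wane rule
    return (snow | grow) & ~melt
```

Calibration would then amount to choosing the rule parameters so that simulated cover maps best match the MODIS and LANDSAT snow-cover observations day by day.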