Updates to Enhanced Geothermal System Resource Potential Estimate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
The deep EGS electricity generation resource potential estimate maintained by the National Renewable Energy Laboratory was updated using the most recent temperature-at-depth maps available from the Southern Methodist University Geothermal Laboratory. The previous study dates back to 2011 and was developed using the original temperature-at-depth maps showcased in the 2006 MIT Future of Geothermal Energy report. The methodology used to update the deep EGS resource potential is the same as in the previous study and is summarized in the paper. The updated deep EGS resource potential estimate was calculated for depths between 3 and 7 km and is binned in 25 degrees C increments. The updated deep EGS electricity generation resource potential estimate is 4,349 GWe. A comparison of the estimates from the previous and updated studies shows a net increase of 117 GWe in the 3-7 km depth range, due mainly to increases in the underlying temperature-at-depth estimates from the updated maps.
Update to Enhanced Geothermal System Resource Potential Estimate: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
2016-10-01
The deep EGS electricity generation resource potential estimate maintained by the National Renewable Energy Laboratory was updated using the most recent temperature-at-depth maps available from the Southern Methodist University Geothermal Laboratory. The previous study dates back to 2011 and was developed using the original temperature-at-depth maps showcased in the 2006 MIT Future of Geothermal Energy report. The methodology used to update the deep EGS resource potential is the same as in the previous study and is summarized in the paper. The updated deep EGS resource potential estimate was calculated for depths between 3 and 7 km and is binned in 25 degrees C increments. The updated deep EGS electricity generation resource potential estimate is 4,349 GWe. A comparison of the estimates from the previous and updated studies shows a net increase of 117 GWe in the 3-7 km depth range, due mainly to increases in the underlying temperature-at-depth estimates from the updated maps.
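A minimal sketch of the binning step described in the abstract above: temperature-at-depth map cells are grouped into 25 degrees C bins and an assumed per-cell generation potential is summed within each bin. The function name, bin range, and all numerical inputs are illustrative assumptions, not NREL's actual data or code.

```python
# Hypothetical sketch: bin temperature-at-depth estimates into 25 C increments and
# sum an assumed electricity-generation potential (GWe) per bin. Illustrative only.
import numpy as np

def bin_resource_potential(temps_c, potential_gwe, bin_width=25.0, t_min=150.0, t_max=350.0):
    """temps_c: temperature-at-depth per map cell (C); potential_gwe: potential per cell (GWe)."""
    edges = np.arange(t_min, t_max + bin_width, bin_width)
    idx = np.digitize(temps_c, edges)        # which 25 C bin each map cell falls into
    totals = np.zeros(len(edges) + 1)
    np.add.at(totals, idx, potential_gwe)    # accumulate potential in each bin
    return dict(zip(edges, totals[1:]))      # keyed by lower bin edge

# Hypothetical map cells at a single depth slice
print(bin_resource_potential(np.array([180.0, 215.0, 262.0]), np.array([0.8, 1.1, 2.4])))
```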
A reactive nitrogen budget for Lake Michigan
The reactive nitrogen budget for Lake Michigan was reviewed and updated, making use of recent estimates of watershed and atmospheric nitrogen loads. The updated total N load to Lake Michigan was approximately double the previous estimate from the Lake Michigan Mass Balance study ...
Comparison of Methods to Trace Multiple Subskills: Is LR-DBN Best?
ERIC Educational Resources Information Center
Xu, Yanbo; Mostow, Jack
2012-01-01
A long-standing challenge for knowledge tracing is how to update estimates of multiple subskills that underlie a single observable step. We characterize approaches to this problem by how they model knowledge tracing, fit its parameters, predict performance, and update subskill estimates. Previous methods allocated blame or credit among subskills…
Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen
2014-01-01
In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues.
Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen
2014-01-01
Background: In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods: We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings: Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions: The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954
Imputation and Model-Based Updating Technique for Annual Forest Inventories
Ronald E. McRoberts
2001-01-01
The USDA Forest Service is developing an annual inventory system to establish the capability of producing annual estimates of timber volume and related variables. The inventory system features measurement of an annual sample of field plots with options for updating data for plots measured in previous years. One imputation and two model-based updating techniques are...
Henderson, Audrey; Robinson, Mark; McAdams, Rachel; McCartney, Gerry; Beeston, Clare
2016-01-01
Aims: To highlight the importance of monitoring biases when using retail sales data to estimate population alcohol consumption. Methods: Previously, we identified and where possible quantified sources of bias that may lead to under- or overestimation of alcohol consumption in Scotland. Here, we update findings by using more recent data and by quantifying emergent biases. Results: Underestimation resulting from the net effect of biases on population consumption in Scotland increased from −4% in 2010 to −7% in 2013. Conclusion: Biases that might impact on the validity and reliability of sales data when estimating population consumption should be routinely monitored and updated. PMID:26419684
Henderson, Audrey; Robinson, Mark; McAdams, Rachel; McCartney, Gerry; Beeston, Clare
2016-05-01
To highlight the importance of monitoring biases when using retail sales data to estimate population alcohol consumption. Previously, we identified and where possible quantified sources of bias that may lead to under- or overestimation of alcohol consumption in Scotland. Here, we update findings by using more recent data and by quantifying emergent biases. Underestimation resulting from the net effect of biases on population consumption in Scotland increased from -4% in 2010 to -7% in 2013. Biases that might impact on the validity and reliability of sales data when estimating population consumption should be routinely monitored and updated. © The Author 2015. Medical Council on Alcohol and Oxford University Press.
A revised load estimation procedure for the Susquehanna, Potomac, Patuxent, and Choptank rivers
Yochum, Steven E.
2000-01-01
The U.S. Geological Survey's Chesapeake Bay River Input Program has updated the nutrient and suspended-sediment load data base for the Susquehanna, Potomac, Patuxent, and Choptank Rivers using a multiple-window, center-estimate regression methodology. The revised method optimizes the seven-parameter regression approach that has been used historically by the program. The revised method estimates load using the fifth or center year of a sliding 9-year window. Each year a new model is run for each site and constituent, the most recent year is added, and the previous 4 years of estimates are updated. The fifth year in the 9-year window is considered the best estimate and is kept in the data base. The last year of estimation shows the most change from the previous year's estimate and this change approaches a minimum at the fifth year. Differences between loads computed using this revised methodology and the loads populating the historical data base have been noted but the load estimates do not typically change drastically. The data base resulting from the application of this revised methodology is populated by annual and monthly load estimates that are known with greater certainty than in the previous load data base.
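A minimal sketch of the sliding-window, center-estimate idea described above, assuming annual provisional load estimates are already available: a model is fit over each 9-year window and only the center (fifth) year's value is retained as the best estimate. The smoothing used here is a placeholder; the program's actual seven-parameter load regression is not reproduced.

```python
# Hypothetical sketch of a multiple-window, center-estimate scheme: keep only the
# center year of each sliding 9-year window. The "model" is a simple mean here.
from statistics import mean

def center_estimates(annual_loads, window=9):
    """annual_loads: dict of year -> provisional annual load estimate."""
    years = sorted(annual_loads)
    half = window // 2
    best = {}
    for i in range(half, len(years) - half):
        win = years[i - half:i + half + 1]                   # the 9-year window
        best[years[i]] = mean(annual_loads[y] for y in win)  # retain the center-year value
    return best

loads = {y: 100 + 5 * (y - 1990) for y in range(1990, 2004)}  # invented annual loads
print(center_estimates(loads))
```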
Moreo, Michael T.; Justet, Leigh
2008-01-01
Ground-water withdrawal estimates from 1913 through 2003 for the Death Valley regional ground-water flow system (DVRFS) are compiled in an electronic database to support a regional, three-dimensional, transient ground-water flow model. This database updates a previously published database that compiled estimates of ground-water withdrawals for 1913-1998. The same methodology is used to construct each database. Primary differences between the two databases are an additional 5 years of ground-water withdrawal data, the restriction of well locations in the updated database to the DVRFS model boundary, and application rates that are 0 to 1.5 feet per year lower than the original estimates. The lower application rates result from revised estimates of crop consumptive use, which are based on updated estimates of potential evapotranspiration. In 2003, about 55,700 acre-feet of ground water was pumped in the DVRFS, of which 69 percent was used for irrigation, 13 percent for domestic use, and 18 percent for public supply, commercial, and mining activities.
On Using Exponential Parameter Estimators with an Adaptive Controller
NASA Technical Reports Server (NTRS)
Patre, Parag; Joshi, Suresh M.
2011-01-01
Typical adaptive controllers are restricted to using a specific update law to generate parameter estimates. This paper investigates the possibility of using any exponential parameter estimator with an adaptive controller such that the system tracks a desired trajectory. The goal is to provide flexibility in choosing any update law suitable for a given application. The development relies on a previously developed concept of controller/update law modularity in the adaptive control literature, and the use of a converse Lyapunov-like theorem. Stability analysis is presented to derive gain conditions under which this is possible, and inferences are made about the tracking error performance. The development is based on a class of Euler-Lagrange systems that are used to model various engineering systems including space robots and manipulators.
Francis A. Roesch; Paul C. van Deusen; Zhiliang Zhu
1995-01-01
Various methods of adjusting low-cost and possibly biased estimates of percent forest coverage from AVHRR data with a subsample of higher-cost estimates from the USDA Forest Service's Forest Inventory and Analysis plots were investigated. Two ratio and two regression estimators were evaluated. Previous work (Zhu and Teuber, 1991) finding that the estimates from...
ERIC Educational Resources Information Center
Shaw, Stacy; Radwin, David
2014-01-01
The web tables in this report provide original and revised estimates of statistics previously published in 2007-08 National Postsecondary Student Aid Study (NPSAS:08): Student Financial Aid Estimates for 2007-08 (NCES 2009-166). The revised estimates were generated using revised weights that were updated in August 2013. NPSAS:08 data were…
Costs of Fetal Alcohol Spectrum Disorder in the Canadian Criminal Justice System.
Thanh, Nguyen Xuan; Jonsson, Egon
2015-01-01
We reviewed literature to estimate the costs of Fetal Alcohol Spectrum Disorder (FASD) in the Canadian Criminal Justice System (CJS), and to update the total costs of FASD in Canada. The results suggest FASD is costlier than previously estimated. The costs of FASD associated with the CJS are estimated at $3.9 billion a year, with $1.2 billion for police, $0.4 billion for courts, $0.5 billion for correctional services, $1.6 billion for victims, and $0.2 billion for third parties. The updated total costs of FASD in Canada are $9.7 billion a year, of which the CJS accounts for 40%, healthcare 21%, education 17%, social services 13%, and others 9%.
Galactic cosmic radiation exposure of pregnant aircrew members II
DOT National Transportation Integrated Search
2000-10-01
This report is an updated version of a previously published Technical Note in the journal Aviation, Space, and Environmental Medicine. The main change is that improved computer programs were used to estimate galactic cosmic radiation. The calculation...
A Bayesian Assessment of Seismic Semi-Periodicity Forecasts
NASA Astrophysics Data System (ADS)
Nava, F.; Quinteros, C.; Glowacka, E.; Frez, J.
2016-01-01
Among the schemes for earthquake forecasting, the search for semi-periodicity during large earthquakes in a given seismogenic region plays an important role. When considering earthquake forecasts based on semi-periodic sequence identification, the Bayesian formalism is a useful tool for: (1) assessing how well a given earthquake satisfies a previously made forecast; (2) re-evaluating the semi-periodic sequence probability; and (3) testing other prior estimations of the sequence probability. A comparison of Bayesian estimates with updated estimates of semi-periodic sequences that incorporate new data not used in the original estimates shows extremely good agreement, indicating that: (1) the probability that a semi-periodic sequence is not due to chance is an appropriate estimate for the prior sequence probability estimate; and (2) the Bayesian formalism does a very good job of estimating corrected semi-periodicity probabilities, using slightly less data than that used for updated estimates. The Bayesian approach is exemplified explicitly by its application to the Parkfield semi-periodic forecast, and results are given for its application to other forecasts in Japan and Venezuela.
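The Bayesian re-evaluation step described above can be illustrated with a toy calculation, a minimal sketch rather than the authors' formulation: the prior probability that a semi-periodic sequence is real is updated according to whether the forecast window contained the earthquake. The hit probabilities under the "real" and "chance" hypotheses are assumed values.

```python
# Hypothetical Bayes update for a semi-periodic forecast; all probabilities are
# illustrative assumptions, not values from the study.
def update_sequence_probability(prior, hit, p_hit_if_real=0.8, p_hit_if_chance=0.2):
    """Return P(sequence is real | forecast outcome)."""
    like_real = p_hit_if_real if hit else 1.0 - p_hit_if_real
    like_chance = p_hit_if_chance if hit else 1.0 - p_hit_if_chance
    evidence = like_real * prior + like_chance * (1.0 - prior)
    return like_real * prior / evidence

print(update_sequence_probability(prior=0.6, hit=True))   # sequence probability rises
print(update_sequence_probability(prior=0.6, hit=False))  # sequence probability falls
```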
Mogasale, Vittal; Maskery, Brian; Ochiai, R Leon; Lee, Jung Seok; Mogasale, Vijayalaxmi V; Ramani, Enusa; Kim, Young Eun; Park, Jin Kyung; Wierzba, Thomas F
2014-10-01
No access to safe water is an important risk factor for typhoid fever, yet risk-level heterogeneity is unaccounted for in previous global burden estimates. Since WHO has recommended risk-based use of typhoid polysaccharide vaccine, we revisited the burden of typhoid fever in low-income and middle-income countries (LMICs) after adjusting for water-related risk. We estimated the typhoid disease burden from studies done in LMICs based on blood-culture-confirmed incidence rates applied to the 2010 population, after correcting for operational issues related to surveillance, limitations of diagnostic tests, and water-related risk. We derived incidence estimates, correction factors, and mortality estimates from systematic literature reviews. We did scenario analyses for risk factors, diagnostic sensitivity, and case fatality rates, accounting for the uncertainty in these estimates, and we compared them with previous disease burden estimates. The estimated number of typhoid fever cases in LMICs in 2010 after adjusting for water-related risk was 11·9 million (95% CI 9·9-14·7) cases with 129 000 (75 000-208 000) deaths. By comparison, the estimated risk-unadjusted burden was 20·6 million (17·5-24·2) cases and 223 000 (131 000-344 000) deaths. Scenario analyses indicated that the risk-factor adjustment and the updated diagnostic test correction factor derived from systematic literature reviews were the drivers of differences between the current estimate and past estimates. The risk-adjusted typhoid fever burden estimate was more conservative than previous estimates. However, by distinguishing the risk differences, it will allow assessment of the effect at the population level and will facilitate cost-effectiveness calculations for risk-based vaccination strategies for future typhoid conjugate vaccines. Copyright © 2014 Mogasale et al. Open Access article distributed under the terms of CC BY-NC-SA.
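A minimal sketch of the kind of risk-adjusted burden calculation outlined above: cases are estimated by applying a blood-culture-confirmed incidence rate to the at-risk share of the population and scaling by correction factors for surveillance and diagnostic sensitivity. Every number below (incidence, correction factors, population share, case fatality rate) is an invented placeholder, not an input from the study.

```python
# Hypothetical risk-adjusted burden sketch; all parameter values are invented.
def typhoid_burden(population, incidence_per_100k, share_at_risk,
                   surveillance_cf=1.3, diagnostic_cf=1.6, cfr=0.01):
    """Return (cases, deaths) after restricting to the population without safe water."""
    at_risk_pop = population * share_at_risk
    cases = at_risk_pop * incidence_per_100k / 1e5 * surveillance_cf * diagnostic_cf
    return cases, cases * cfr

cases, deaths = typhoid_burden(population=50e6, incidence_per_100k=100, share_at_risk=0.4)
print(f"{cases:,.0f} cases, {deaths:,.0f} deaths")
```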
Chesson, Harrell W; Gift, Thomas L; Owusu-Edusei, Kwame; Tao, Guoyu; Johnson, Ana P; Kent, Charlotte K
2011-10-01
We conducted a literature review of studies of the economic burden of sexually transmitted diseases in the United States. The annual direct medical cost of sexually transmitted diseases (including human immunodeficiency virus) has been estimated to be $16.9 billion (range: $13.9-$23.0 billion) in 2010 US dollars.
Improved Analysis of GW150914 Using a Fully Spin-Precessing Waveform Model
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Bejger, M.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Birnholtz, O.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, C.; Casentini, J.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Cheeseboro, B. D.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dasgupta, A.; Da Silva Costa, C. F.; Dattilo, V.; Dave, I.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Devine, R. C.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. 
C.; Etienne, Z.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Fauchon-Jones, E.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Fenyvesi, E.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gaebel, S.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gaur, G.; Gehrels, N.; Gemme, G.; Geng, P.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Henry, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hofman, D.; Holt, K.; Holz, D. E.; Hopkins, P.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jian, L.; Jiménez-Forteza, F.; Johnson, W. W.; Johnson-McDaniel, N. K.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kapadia, S. J.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chi-Woong; Kim, Chunglee; Kim, J.; Kim, K.; Kim, N.; Kim, W.; Kim, Y.-M.; Kimbrell, S. J.; King, E. J.; King, P. J.; Kissel, J. S.; Klein, B.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Laxen, M.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Lewis, J. B.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Lombardi, A. L.; London, L. T.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lousto, C. O.; Lovelace, G.; Lück, H.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magaña Zertuche, L.; Magee, R. 
M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, A.; Miller, B. B.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Nedkova, K.; Nelemans, G.; Nelson, T. J. N.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Perri, L. M.; Pfeiffer, H. P.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Qiu, S.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Rizzo, M.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O. E. S.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. 
M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Setyawati, Y.; Shaddock, D. A.; Shaffer, T.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S. P.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sunil, S.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tomlinson, C.; Tonelli, M.; Tornasi, Z.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van der Sluys, M. V.; van Heijningen, J. V.; Vano-Vinuales, A.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Weßels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Woehler, J.; Worden, J.; Wright, J. L.; Wu, D. S.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yu, H.; Yvert, M.; ZadroŻny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; Boyle, M.; Brügmann, B.; Campanelli, M.; Chu, T.; Clark, M.; Haas, R.; Hemberger, D.; Hinder, I.; Kidder, L. E.; Kinsey, M.; Laguna, P.; Ossokine, S.; Pan, Y.; Röver, C.; Scheel, M.; Szilagyi, B.; Teukolsky, S.; Zlochower, Y.; LIGO Scientific Collaboration; Virgo Collaboration
2016-10-01
This paper presents updated estimates of source parameters for GW150914, a binary black-hole coalescence event detected by the Laser Interferometer Gravitational-wave Observatory (LIGO) in 2015 [Abbott et al. Phys. Rev. Lett. 116, 061102 (2016).]. Abbott et al. [Phys. Rev. Lett. 116, 241102 (2016).] presented parameter estimation of the source using a 13-dimensional, phenomenological precessing-spin model (precessing IMRPhenom) and an 11-dimensional nonprecessing effective-one-body (EOB) model calibrated to numerical-relativity simulations, which forces spin alignment (nonprecessing EOBNR). Here, we present new results that include a 15-dimensional precessing-spin waveform model (precessing EOBNR) developed within the EOB formalism. We find good agreement with the parameters estimated previously [Abbott et al. Phys. Rev. Lett. 116, 241102 (2016).], and we quote updated component masses of 35(+5)(-3) M⊙ and 30(+3)(-4) M⊙ (where errors correspond to 90% symmetric credible intervals). We also present slightly tighter constraints on the dimensionless spin magnitudes of the two black holes, with a primary spin estimate <0.65 and a secondary spin estimate <0.75 at 90% probability. Abbott et al. [Phys. Rev. Lett. 116, 241102 (2016).] estimated the systematic parameter-extraction errors due to waveform-model uncertainty by combining the posterior probability densities of precessing IMRPhenom and nonprecessing EOBNR. Here, we find that the two precessing-spin models are in closer agreement, suggesting that these systematic errors are smaller than previously quoted.
The Cosmic Connection — parts and suppliers for the Berkeley Detector, including scintillator from Eljen Technology, listing where to obtain the components needed to build the detector. The estimated cost to build a detector varies from $1500 to $2700.
Bleeker, S E; Derksen-Lubsen, G; Grobbee, D E; Donders, A R T; Moons, K G M; Moll, H A
2007-01-01
To externally validate and update a previously developed rule for predicting the presence of serious bacterial infections in children with fever without apparent source. Patients aged 1-36 months presenting with fever without source were prospectively enrolled. Serious bacterial infection included bacterial meningitis, sepsis, bacteraemia, pneumonia, urinary tract infection, bacterial gastroenteritis, and osteomyelitis/ethmoiditis. The generalizability of the original rule was determined. Subsequently, the prediction rule was updated using all available data from patients with fever without source (1996-1998 and 2000-2001, n = 381) using multivariable logistic regression. The generalizability of the rule appeared insufficient in the new patients (n = 150). In the updated rule, independent predictors from history and examination were duration of fever, vomiting, ill clinical appearance, chest-wall retractions, and poor peripheral circulation (ROC area (95% CI): 0.69 (0.63-0.75)). Additional independent predictors from laboratory testing were serum white blood cell count, C-reactive protein, and 70 or more white blood cells on urinalysis (ROC area (95% CI): 0.83 (0.78-0.88)). A previously developed prediction rule for predicting the presence of serious bacterial infection in children with fever without apparent source was updated. Its clinical score can be used as a first screening tool. Additional laboratory testing may specify the individual risk estimate (range: 4-54%) further.
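The updated rule reports risk via multivariable logistic regression, so an individual estimate comes from a logistic score. A minimal sketch, with entirely hypothetical coefficients (the published rule's weights are not reproduced here), of how such a score maps predictor values to a probability of serious bacterial infection:

```python
# Hypothetical logistic prediction-rule sketch; every coefficient is invented.
import math

COEFS = {"intercept": -3.0, "fever_days": 0.15, "vomiting": 0.6,
         "ill_appearance": 1.1, "chest_retractions": 0.9, "poor_circulation": 1.2}

def predicted_risk(fever_days, vomiting, ill_appearance, chest_retractions, poor_circulation):
    """Return the predicted probability of serious bacterial infection."""
    logit = (COEFS["intercept"]
             + COEFS["fever_days"] * fever_days
             + COEFS["vomiting"] * vomiting
             + COEFS["ill_appearance"] * ill_appearance
             + COEFS["chest_retractions"] * chest_retractions
             + COEFS["poor_circulation"] * poor_circulation)
    return 1.0 / (1.0 + math.exp(-logit))

print(f"{predicted_risk(3, 1, 1, 0, 0):.0%}")  # e.g. an ill-appearing, vomiting child, day 3 of fever
```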
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savy, J.
New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in that document, referred to simply as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended that the estimate of the seismic hazard at DOE sites be updated whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site and in the treatment of the total uncertainty in the estimates, which includes knowledge uncertainty, random uncertainty, and expert opinion diversity. 28 refs.
The Cost of Crime to Society: New Crime-Specific Estimates for Policy and Program Evaluation
French, Michael T.; Fang, Hai
2010-01-01
Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than ten years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. PMID:20071107
The cost of crime to society: new crime-specific estimates for policy and program evaluation.
McCollister, Kathryn E; French, Michael T; Fang, Hai
2010-04-01
Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than 10 years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
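A minimal sketch of the per-offense costing logic described in the two abstracts above, combining tangible losses (a cost-of-illness style sum) with intangible losses (a jury-compensation style figure). All dollar amounts and the offense count are placeholders, not the study's estimates.

```python
# Hypothetical per-offense societal cost; every dollar figure below is invented.
def crime_cost(victim_costs, justice_system_costs, productivity_losses, jury_awarded_intangibles):
    tangible = victim_costs + justice_system_costs + productivity_losses
    intangible = jury_awarded_intangibles      # pain, suffering, reduced quality of life
    return {"tangible": tangible, "intangible": intangible, "total": tangible + intangible}

per_offense = crime_cost(victim_costs=5_000, justice_system_costs=8_000,
                         productivity_losses=2_000, jury_awarded_intangibles=15_000)
annual_burden = per_offense["total"] * 120_000  # hypothetical annual offense count
print(per_offense, f"annual burden ~ ${annual_burden / 1e9:.1f} billion")
```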
Fulton, Lawrence; Kerr, Bernie; Inglis, James M; Brooks, Matthew; Bastian, Nathaniel D
2015-07-01
In this study, we re-evaluate air ambulance requirements (rules of allocation) and planning considerations based on an Army-approved, Theater Army Analysis scenario. A previous study using workload only estimated a requirement of 0.4 to 0.6 aircraft per admission, a significant bolus over existence-based rules. In this updated study, we estimate requirements for Phase III (major combat operations) using a simulation grounded in previously published work and Phase IV (stability operations) based on four rules of allocation: unit existence rules, workload factors, theater structure (geography), and manual input. This study improves upon previous work by including the new air ambulance mission requirements of Department of Defense 51001.1, Roles and Functions of the Services, by expanding the analysis over two phases, and by considering unit rotation requirements known as Army Force Generation based on Department of Defense policy. The recommendations of this study are intended to inform future planning factors and already provided decision support to the Army Aviation Branch in determining force structure requirements. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
...This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2014 (for discharges occurring on or after October 1, 2013 and on or before September 30, 2014) as required by the statute. This final rule also revises the list of diagnosis codes that may be counted toward an IRF's "60 percent rule" compliance calculation to determine "presumptive compliance," updates the IRF facility-level adjustment factors using an enhanced estimation methodology, revises sections of the Inpatient Rehabilitation Facility-Patient Assessment Instrument, revises requirements for acute care hospitals that have IRF units, clarifies the IRF regulation text regarding limitation of review, updates references to previously changed sections in the regulations text, and revises and updates quality measures and reporting requirements under the IRF quality reporting program.
2013-08-06
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2014 (for discharges occurring on or after October 1, 2013 and on or before September 30, 2014) as required by the statute. This final rule also revises the list of diagnosis codes that may be counted toward an IRF's "60 percent rule" compliance calculation to determine "presumptive compliance," updates the IRF facility-level adjustment factors using an enhanced estimation methodology, revises sections of the Inpatient Rehabilitation Facility-Patient Assessment Instrument, revises requirements for acute care hospitals that have IRF units, clarifies the IRF regulation text regarding limitation of review, updates references to previously changed sections in the regulations text, and revises and updates quality measures and reporting requirements under the IRF quality reporting program.
James, Eric P.; Benjamin, Stanley G.; Marquis, Melinda
2016-10-28
A new gridded dataset for wind and solar resource estimation over the contiguous United States has been derived from hourly updated 1-h forecasts from the National Oceanic and Atmospheric Administration High-Resolution Rapid Refresh (HRRR) 3-km model composited over a three-year period (approximately 22 000 forecast model runs). The unique dataset features hourly data assimilation, and provides physically consistent wind and solar estimates for the renewable energy industry. The wind resource dataset shows strong similarity to that previously provided by a Department of Energy-funded study, and it includes estimates in southern Canada and northern Mexico. The solar resource dataset represents an initial step towards application-specific fields such as global horizontal and direct normal irradiance. This combined dataset will continue to be augmented with new forecast data from the advanced HRRR atmospheric/land-surface model.
Walton, David M; Macdermid, Joy C; Giorgianni, Anthony A; Mascarenhas, Joanna C; West, Stephen C; Zammit, Caroline A
2013-02-01
Systematic review and meta-analysis. To update a previous review and meta-analysis on risk factors for persistent problems following whiplash secondary to a motor vehicle accident. Prognosis in whiplash-associated disorder (WAD) has become an active area of research, perhaps owing to the difficulty of treating chronic problems. A previously published review and meta-analysis of prognostic factors included primary sources up to May 2007. Since that time, more research has become available, and an update to that original review is warranted. A systematic search of international databases was conducted, with rigorous inclusion criteria focusing on studies published between May 2007 and May 2012. Articles were scored, and data were extracted and pooled to estimate the odds ratio for any factor that had at least 3 independent data points in the literature. Four new cohorts (n = 1121) were identified. In combination with findings of a previous review, 12 variables were found to be significant predictors of poor outcome following whiplash, 9 of which were new (n = 2) or revised (n = 7) as a result of additional data. The significant variables included high baseline pain intensity (greater than 5.5/10), report of headache at inception, less than postsecondary education, no seatbelt in use during the accident, report of low back pain at inception, high Neck Disability Index score (greater than 14.5/50), preinjury neck pain, report of neck pain at inception (regardless of intensity), high catastrophizing, female sex, WAD grade 2 or 3, and WAD grade 3 alone. Those variables robust to publication bias included high pain intensity, female sex, report of headache at inception, less than postsecondary education, high Neck Disability Index score, and WAD grade 2 or 3. Three existing variables (preaccident history of headache, rear-end collision, older age) and 1 additional novel variable (collision severity) were refined or added in this updated review but showed no significant predictive value. This review identified 2 additional prognostic factors and refined the estimates of 7 previously identified factors, bringing the total number of significant predictors across the 2 reviews to 12. These factors can be easily identified in a clinical setting to provide estimates of prognosis following whiplash.
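A minimal sketch of the pooling step behind the odds ratios reported above, using standard inverse-variance weighting of log odds ratios (a fixed-effect version for brevity; the review's actual model may differ). The three cohorts and their values are invented for illustration.

```python
# Hypothetical inverse-variance pooling of odds ratios reported with 95% CIs.
import math

def pooled_odds_ratio(odds_ratios, ci_lowers, ci_uppers):
    log_ors, weights = [], []
    for or_, lo, hi in zip(odds_ratios, ci_lowers, ci_uppers):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from the CI width on the log scale
        log_ors.append(math.log(or_))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return tuple(math.exp(v) for v in (pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled))

# Three invented cohorts reporting the same prognostic factor: OR with 95% CI bounds
print(pooled_odds_ratio([2.1, 1.8, 2.6], [1.3, 1.1, 1.5], [3.4, 2.9, 4.5]))
```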
NREL: International Activities - Philippines Wind Resource Maps and Data
Philippines Wind Resource Maps and Data — In 2014, under the Enhancing Capacity for Low Emission Development Strategies program, the National Wind Technology Center and Geospatial Data Science Team applied modern approaches to update previous estimates and support the development of wind energy potential in the Philippines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.
This report summarizes the current status of the CPV industry and is updated from previous versions to include information from the last year. New information presented at the CPV-8 conference is included along with the addition of new companies that have announced their interest in CPV, and estimates of production volumes for 2011 and 2012.
Improved Analysis of GW150914 Using a Fully Spin-Precessing Waveform Model
NASA Technical Reports Server (NTRS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Camp, J. B.;
2016-01-01
This paper presents updated estimates of source parameters for GW150914, a binary black-hole coalescence event detected by the Laser Interferometer Gravitational-wave Observatory (LIGO) in 2015 [Abbott et al. Phys. Rev. Lett. 116, 061102 (2016).]. Abbott et al. [Phys. Rev. Lett. 116, 241102 (2016).] presented parameter estimation of the source using a 13-dimensional, phenomenological precessing-spin model (precessing IMRPhenom) and an 11-dimensional nonprecessing effective-one-body (EOB) model calibrated to numerical-relativity simulations, which forces spin alignment (nonprecessing EOBNR). Here, we present new results that include a 15-dimensional precessing-spin waveform model (precessing EOBNR) developed within the EOB formalism. We find good agreement with the parameters estimated previously [Abbott et al. Phys. Rev. Lett. 116, 241102 (2016).], and we quote updated component masses of 35(+5)(-3) solar masses and 30(+3)(-4) solar masses (where errors correspond to 90% symmetric credible intervals). We also present slightly tighter constraints on the dimensionless spin magnitudes of the two black holes, with a primary spin estimate of less than 0.65 and a secondary spin estimate of less than 0.75 at 90% probability. Abbott et al. [Phys. Rev. Lett. 116, 241102 (2016).] estimated the systematic parameter-extraction errors due to waveform-model uncertainty by combining the posterior probability densities of precessing IMRPhenom and nonprecessing EOBNR. Here, we find that the two precessing-spin models are in closer agreement, suggesting that these systematic errors are smaller than previously quoted.
Shope, Christopher L.; Angeroth, Cory E.
2015-01-01
Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids (TDS) loading into the Great Salt Lake (GSL) for water year 2013 with estimates of previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons with uncertainty ranging from 2.8 to 46.3 million metric tons, which varies greatly from previous regression estimates for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
An updated checklist of aquatic plants of Myanmar and Thailand
2014-01-01
Abstract The flora of Tropical Asia is among the richest in the world, yet the actual diversity is estimated to be much higher than previously reported. Myanmar and Thailand are adjacent countries that together occupy more than half the area of continental Tropical Asia. This geographic area is ecologically diverse, ranging from cool-temperate to tropical climates and from coasts and rainforests to high mountain elevations. An updated checklist of aquatic plants, which includes 78 species in 44 genera from 24 families, is presented based on floristic works. This number includes seven species that have never been listed in previous floras and checklists. The species (excluding non-indigenous taxa) were categorized into five geographic groups to reflect the rich diversity of the countries' floras. PMID:24723783
Krueger, Hans; Krueger, Joshua; Koot, Jacqueline
2015-04-30
Tobacco smoking, excess weight and physical inactivity contribute substantially to the preventable disease burden in Canada. The purpose of this paper is to determine the potential reduction in economic burden if all provinces achieved prevalence rates of these three risk factors (RFs) equivalent to those of the province with the lowest rates, and to update and address a limitation noted in our previous model. We used a previously developed approach based on population attributable fractions to estimate the economic burden associated with these RFs. Sex-specific relative risk and age-/sex-specific prevalence data were used in the modelling. The previous model was updated using the most current data for developing resource allocation weights. In 2012, the prevalence of tobacco smoking, excess weight and physical inactivity was the lowest in British Columbia. If age- and sex-specific prevalence rates from BC were applied to populations living in the other provinces, the annual economic burden attributable to these three RFs would be reduced by $5.3 billion. Updating the model resulted in a considerable shift in economic burden from smoking to excess weight, with the estimated annual economic burden attributable to excess weight now 25% higher compared to that of tobacco smoking ($23.3 vs. $18.7 billion). Achieving RF prevalence rates equivalent to those of the province with the lowest rates would result in a 10% reduction in economic burden attributable to excess weight, smoking and physical inactivity in Canada. This study shows that using current resource use data is important for this type of economic modelling.
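A minimal sketch of the population-attributable-fraction (PAF) logic the study describes, using the standard single-exposure formula PAF = p(RR − 1) / (1 + p(RR − 1)) applied to a total cost figure. The prevalence, relative risk, and cost below are illustrative assumptions, not the study's sex- and age-specific inputs.

```python
# Hypothetical PAF-based attributable-burden sketch; all inputs are invented.
def attributable_burden(prevalence, relative_risk, total_cost):
    paf = prevalence * (relative_risk - 1.0) / (1.0 + prevalence * (relative_risk - 1.0))
    return paf, paf * total_cost

# e.g. 18% exposure prevalence, RR = 2.5, $40B annual cost of the associated outcomes
paf, burden = attributable_burden(prevalence=0.18, relative_risk=2.5, total_cost=40e9)
print(f"PAF = {paf:.1%}, attributable cost ~ ${burden / 1e9:.1f} billion per year")
```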
Updated generalized biomass equations for North American tree species
David C. Chojnacky; Linda S. Heath; Jennifer C. Jenkins
2014-01-01
Historically, tree biomass at large scales has been estimated by applying dimensional analysis techniques and field measurements such as diameter at breast height (dbh) in allometric regression equations. Equations often have been developed using differing methods and applied only to certain species or isolated areas. We previously had compiled and combined (in meta-...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... economic zone (EEZ). NMFS previously determined the commercial annual catch limit (ACL) for gray..., on July 7, 2013. However, updated landings estimates indicate the commercial ACL for gray triggerfish... Conservation and Management Act (Magnuson-Stevens Act). Background NMFS determined that the commercial ACL for...
Scenarios for the Hanford immobilized Low-Activity waste (ILAW) performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANN, F.M.
The purpose of the next version of the Hanford Immobilized Low-Activity Tank Waste (ILAW) Performance Assessment (ILAW PA) is to provide an updated estimate of the long-term human health and environmental impact of the disposal of ILAW and to compare these estimates against performance objectives displayed in Tables 1, 2, and 3 (Mann 1999a). Such a radiological performance assessment is required by U.S. Department of Energy (DOE) Orders on radioactive waste management (DOE 1988a and DOE 1999a). This document defines the scenarios that will be used for the next update of the PA that is scheduled to be issued in 2001. Since the previous performance assessment (Mann 1998) was issued, considerable additional data on waste form behavior and site-specific soil geotechnical properties have been collected. In addition, the 2001 ILAW PA will benefit from improved computer models and the experience gained from the previous performance assessment. However, the scenarios (that is, the features, events, and processes analyzed in the performance assessment) for the next PA are very similar to the ones in the 1998 PA.
Teaching for All? Teach For America’s Effects across the Distribution of Student Achievement
Penner, Emily K.
2016-01-01
This paper examines the effect of Teach For America (TFA) on the distribution of student achievement in elementary school. It extends previous research by estimating quantile treatment effects (QTE) to examine how student achievement in TFA and non-TFA classrooms differs across the broader distribution of student achievement. It also updates prior distributional work on TFA by correcting for previously unidentified missing data and estimating unconditional, rather than conditional QTE. Consistent with previous findings, results reveal a positive impact of TFA teachers across the distribution of math achievement. In reading, however, relative to veteran non-TFA teachers, students at the bottom of the reading distribution score worse in TFA classrooms, and students in the upper half of the distribution perform better. PMID:27668032
The Population of Near-Earth Asteroids Revisited
NASA Astrophysics Data System (ADS)
Harris, Alan William
2017-10-01
I have been tracking progress of the surveys discovering Near-Earth Asteroids (NEAs) for more than 20 years, and have reported updates every few years at past meetings. Following my last report at a DPS and the published update two years ago (Harris and D’Abramo 2015, Icarus 257, 302-312), it came to light that these and previous estimates were affected by round-off of H magnitudes by the Minor Planet Center to 0.1 mag. While it is true that individual magnitudes are generally not even that accurate, statistically the round-off shifted the population estimate by ~6%. While this hardly matters in the small size range, for the largest asteroids the shift alters N(H<17.75), assumed equivalent to N(D>1km), from 990 ± 20 (Harris & D’Abramo 2015) to 934 ± 20. Since the number already discovered, 872, is the same for both solutions, the implied completion of the surveys shifts from 88% to 93%. Not only is this correction satisfying with regard to the “Spaceguard Goal” of discovering 90% of NEAs of D > 1 km, but it reduces the estimated number of large NEAs remaining to be discovered by nearly a factor of 2. In this presentation I will explain the correction to the round-off bias and present an updated population estimate and survey progress using discoveries up to July, 2017.
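The completion figures quoted above follow directly from the discovered count and the two population estimates; a quick check of that arithmetic, using only the numbers given in the abstract:

```python
# Check of the survey-completion arithmetic quoted in the abstract above.
discovered = 872  # NEAs with H < 17.75 (~D > 1 km) already discovered
for label, population in [("Harris & D'Abramo 2015", 990), ("round-off corrected", 934)]:
    completion = discovered / population
    remaining = population - discovered
    print(f"{label}: completion ~{completion:.0%}, ~{remaining} large NEAs still undiscovered")
```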
Baker, Nancy T.; Stone, Wesley W.
2013-01-01
This report provides preliminary estimates of annual agricultural use of 374 pesticide compounds in counties of the conterminous United States in 2010 and 2011, compiled by means of methods described in Thelin and Stone (2013). U.S. Department of Agriculture (USDA) county-level data for harvested-crop acreage were used in conjunction with proprietary Crop Reporting District (CRD)-level pesticide-use data to estimate county-level pesticide use. Estimated pesticide use (EPest) values were calculated with both the EPest-high and EPest-low methods. The distinction between the EPest-high method and the EPest-low method is that there are more counties with estimated pesticide use for EPest-high compared to EPest-low, owing to differing assumptions about missing survey data (Thelin and Stone, 2013). Preliminary estimates in this report will be revised upon availability of updated crop acreages in the 2012 Agricultural Census, to be published by the USDA in 2014. In addition, estimates for 2008 and 2009 previously published by Stone (2013) will be updated subsequent to the 2012 Agricultural Census release. Estimates of annual agricultural pesticide use are provided as downloadable, tab-delimited files, which are organized by compound, year, state Federal Information Processing Standard (FIPS) code, county FIPS code, and kg (amount in kilograms).
An updated Holocene sea-level curve for the Delaware coast
Nikitina, D.L.; Pizzuto, J.E.; Schwimmer, R.A.; Ramsey, K.W.
2000-01-01
We present an updated Holocene sea-level curve for the Delaware coast based on new calibrations of 16 previously published radiocarbon dates (Kraft, 1976; Belknap and Kraft, 1977) and 22 new radiocarbon dates of basal peat deposits. A review of published and unpublished 137Cs and 210Pb analyses, and tide gauge data provide the basis for evaluating shorter-term (10² yr) sea-level trends. Paleosea-level elevations for the new basal peat samples were determined from the present vertical zonation of marsh plants relative to mean high water along the Delaware coast and the composition of plant fossils and foraminifera. Current trends in tidal range along the Delaware coast were used to reduce elevations from different locations to a common vertical datum of mean high water at Breakwater Harbor, Delaware. The updated curve is similar to Belknap and Kraft's [J. Sediment. Petrol., 47 (1977) 610-629] original sea-level curve from 12,000 to about 2000 yr BP. The updated curve documents a rate of sea-level rise of 0.9 mm/yr from 1250 yr BP to present (based on 11 dates), in good agreement with other recent sea-level curves from the northern and central U.S. Atlantic coast, while the previous curve documents rates of about 1.3 mm/yr (based on 4 dates). The precision of both estimates, however, is very low, so the significance of these differences is uncertain. A review of 210Pb and 137Cs analyses from salt marshes of Delaware indicates average marsh accretion rates of 3 mm/yr for the last 100 yr, in good agreement with shorter-term estimates of sea-level rise from tide gauge records. © 2000 Elsevier Science B.V.
Kellerborg, Klas; Danielsson, Anna-Karin; Allebeck, Peter; Coates, Matthew M; Agardh, Emilie
2016-08-01
The Global Burden of Disease (GBD) study continuously refines its estimates as new data and methods become available. In the latest iteration of the study, GBD 2013, changes were made related to the disease burden attributed to alcohol. The aim of this study was to briefly present these changes and to compare the disease burden attributed to alcohol in Swedish men and women in 2010 using previous and updated methods. In the GBD study, the contribution of alcohol to the burden of disease is estimated by theoretically assessing how much of the disease burden can be avoided by reducing the consumption of alcohol to zero. The updated methods mainly consider improved measurements of alcohol consumption, including less severe alcohol dependence, assigning the most severe injuries and removing the protective effect of drinking on cardiovascular diseases if combined with binge drinking. The overall disease burden attributed to alcohol in 2010 increased by 14% when using the updated methods. Women accounted for this overall increase, mainly because the updated methods led to an overall higher alcohol consumption in women. By contrast, the overall burden decreased in men, one reason being the lower overall alcohol consumption with the new methods. In men, the inclusion of less severe alcohol dependence resulted in a large decrease in the alcohol attributed disease burden. This was, however, evened out to a great extent by the increase in cardiovascular disease and injuries. Conclusions: When using the updated GBD methods, the overall disease burden attributed to alcohol increased in women, but not in men. © 2016 the Nordic Societies of Public Health.
Estimating for building and civil engineering works. Eighth edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geddes, S.; Chrystal-Smith, G.; Jolly, P.
1986-01-01
This new edition of Spence Geddes' classic work has been revised and updated to take into account changes since the seventh edition of 1981. It remains a standard reference work which combines a step-by-step guide to the preparation of estimates from tendering stage with a fully representative selection of labour and material constants and worked examples of actual calculations. The estimating information is tabulated as hour constants which are unaffected by fluctuations of labour and plant hire costs. Two new sections have been included. In previous editions dayworks received a few brief notes only, but as so much daywork is carried out on both large and small contracts, and as it can frequently give rise to misunderstanding, a fuller explanation was thought helpful. Landscaping, once the province of the gardener, is now very often an integral part of building and civil engineering contracts and a new chapter has therefore been added. With these additions and the careful updating, the book is an indispensable source of reference for the estimator and a valuable source of information for architects, engineers and surveyors.
Accurate step-hold tracking of smoothly varying periodic and aperiodic probability.
Ricci, Matthew; Gallistel, Randy
2017-07-01
Subjects observing many samples from a Bernoulli distribution are able to perceive an estimate of the generating parameter. A question of fundamental importance is how the current percept (what we think the probability now is) depends on the sequence of observed samples. Answers to this question are strongly constrained by the manner in which the current percept changes in response to changes in the hidden parameter. Subjects do not update their percept trial-by-trial when the hidden probability undergoes unpredictable and unsignaled step changes; instead, they update it only intermittently in a step-hold pattern. It could be that the step-hold pattern is not essential to the perception of probability and is only an artifact of step changes in the hidden parameter. However, we now report that the step-hold pattern obtains even when the parameter varies slowly and smoothly. It obtains even when the smooth variation is periodic (sinusoidal) and perceived as such. We elaborate on a previously published theory that accounts for: (i) the quantitative properties of the step-hold update pattern; (ii) subjects' quick and accurate reporting of changes; (iii) subjects' second thoughts about previously reported changes; (iv) subjects' detection of higher-order structure in patterns of change. We also call attention to the challenges these results pose for trial-by-trial updating theories.
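To make the contrast concrete, here is a small simulation that tracks a sinusoidally varying Bernoulli parameter with two generic estimators: a trial-by-trial delta rule and a step-hold rule that only jumps when recent evidence makes the held value implausible. Both rules, the window length, and the detection threshold are illustrative assumptions, not the model advanced in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sinusoidally varying hidden probability, as in the smooth-variation condition.
n = 2000
p_true = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(n) / 500)
obs = rng.random(n) < p_true  # Bernoulli samples

# Trial-by-trial delta-rule estimate: nudged a little after every sample.
alpha = 0.02
delta_est = np.empty(n)
p = 0.5
for t, x in enumerate(obs):
    p += alpha * (x - p)
    delta_est[t] = p

# Step-hold estimate: hold the current value until the recent evidence
# makes it implausible, then jump to the mean of the recent window.
window, z_crit = 50, 2.0
step_est = np.empty(n)
held = 0.5
for t in range(n):
    if t >= window:
        recent = obs[t - window:t].mean()
        se = np.sqrt(max(held * (1 - held), 1e-6) / window)
        if abs(recent - held) / se > z_crit:   # change detected
            held = recent                      # step to the new estimate
    step_est[t] = held                         # otherwise hold

print("delta-rule RMSE:", np.sqrt(np.mean((delta_est - p_true) ** 2)).round(3))
print("step-hold RMSE:", np.sqrt(np.mean((step_est - p_true) ** 2)).round(3))
```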
Updated Global Burden of Cholera in Endemic Countries
Ali, Mohammad; Nelson, Allyson R.; Lopez, Anna Lena; Sack, David A.
2015-01-01
Background The global burden of cholera is largely unknown because the majority of cases are not reported. The low reporting can be attributed to limited capacity of epidemiological surveillance and laboratories, as well as social, political, and economic disincentives for reporting. We previously estimated 2.8 million cases and 91,000 deaths annually due to cholera in 51 endemic countries. A major limitation in our previous estimate was that the endemic and non-endemic countries were defined based on the countries’ reported cholera cases. We overcame the limitation with the use of a spatial modelling technique in defining endemic countries, and accordingly updated the estimates of the global burden of cholera. Methods/Principal Findings Countries were classified as cholera endemic, cholera non-endemic, or cholera-free based on whether a spatial regression model predicted an incidence rate over a certain threshold in at least three of five years (2008-2012). The at-risk populations were calculated for each country based on the percent of the country without sustainable access to improved sanitation facilities. Incidence rates from population-based published studies were used to calculate the estimated annual number of cases in endemic countries. The number of annual cholera deaths was calculated using inverse variance-weighted average case-fatality rates (CFRs) from literature-based CFR estimates. We found that approximately 1.3 billion people are at risk for cholera in endemic countries. An estimated 2.86 million cholera cases (uncertainty range: 1.3m-4.0m) occur annually in endemic countries. Among these cases, there are an estimated 95,000 deaths (uncertainty range: 21,000-143,000). Conclusion/Significance The global burden of cholera remains high. Sub-Saharan Africa accounts for the majority of this burden. Our findings can inform programmatic decision-making for cholera control. PMID:26043000
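The burden arithmetic described above (at-risk population times incidence, with deaths from an inverse-variance-weighted pooled CFR) can be sketched as follows; every country row, incidence rate, and CFR input is an invented placeholder, not the study's data.

```python
import numpy as np

# Sketch of the burden arithmetic; all inputs below are made-up placeholders.
countries = {
    # name: (population at risk, assumed annual incidence per 1,000 at risk)
    "Country A": (50_000_000, 2.0),
    "Country B": (120_000_000, 1.1),
    "Country C": (8_000_000, 4.5),
}

# Literature-based CFR estimates (cfr, standard error) pooled by
# inverse-variance weighting, as described in the abstract.
cfr_studies = [(0.030, 0.005), (0.021, 0.004), (0.045, 0.010)]
weights = np.array([1 / se**2 for _, se in cfr_studies])
pooled_cfr = np.sum(weights * np.array([c for c, _ in cfr_studies])) / weights.sum()

cases = sum(pop * inc / 1000 for pop, inc in countries.values())
deaths = cases * pooled_cfr
print(f"estimated annual cases:  {cases:,.0f}")
print(f"pooled CFR:              {pooled_cfr:.3f}")
print(f"estimated annual deaths: {deaths:,.0f}")
```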
Evaluation of improved land use and canopy representation in ...
Biogenic volatile organic compounds (BVOC) participate in reactions that can lead to secondarily formed ozone and particulate matter (PM) impacting air quality and climate. BVOC emissions are important inputs to chemical transport models applied on local to global scales but considerable uncertainty remains in the representation of canopy parameterizations and emission algorithms from different vegetation species. The Biogenic Emission Inventory System (BEIS) has been used to support both scientific and regulatory model assessments for ozone and PM. Here we describe a new version of BEIS which includes updated input vegetation data and canopy model formulation for estimating leaf temperature, and the effect of these updates on estimated BVOC. The Biogenic Emission Landuse Database (BELD) was revised to incorporate land use data from the Moderate Resolution Imaging Spectroradiometer (MODIS) land product and 2006 National Land Cover Database (NLCD) land coverage. Vegetation species data are based on the US Forest Service (USFS) Forest Inventory and Analysis (FIA) version 5.1 for 2002–2013 and US Department of Agriculture (USDA) 2007 census of agriculture data. This update results in generally higher BVOC emissions throughout California compared with the previous version of BEIS. Baseline and updated BVOC emission estimates are used in Community Multiscale Air Quality (CMAQ) Model simulations with 4 km grid resolution and evaluated with measurements of isoprene and monoterpenes.
Quenching of Carbon Monoxide and Methane in the Atmospheres of Cool Brown Dwarfs and Hot Jupiters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Visscher, Channon; Moses, Julianne I., E-mail: visscher@lpi.usra.edu, E-mail: jmoses@spacescience.org
We explore CO ⇌ CH4 quench kinetics in the atmospheres of substellar objects using updated timescale arguments, as suggested by a thermochemical kinetics and diffusion model that transitions from the thermochemical-equilibrium regime in the deep atmosphere to a quench-chemical regime at higher altitudes. More specifically, we examine CO quench chemistry on the T dwarf Gliese 229B and CH4 quench chemistry on the hot Jupiter HD 189733b. We describe a method for correctly calculating reverse rate coefficients for chemical reactions, discuss the predominant pathways for CO ⇌ CH4 interconversion as indicated by the model, and demonstrate that a simple timescale approach can be used to accurately describe the behavior of quenched species when updated reaction kinetics and mixing-length-scale assumptions are used. Proper treatment of quench kinetics has important implications for estimates of molecular abundances and/or vertical mixing rates in the atmospheres of substellar objects. Our model results indicate significantly higher Kzz values than previously estimated near the CO quench level on Gliese 229B, whereas current model-data comparisons using CH4 permit a wide range of Kzz values on HD 189733b. We also use updated reaction kinetics to revise previous estimates of the Jovian water abundance, based upon the observed abundance and chemical behavior of carbon monoxide. The CO chemical/observational constraint, along with Galileo entry probe data, suggests a water abundance of approximately 0.51-2.6 x solar (for a solar value of H2O/H2 = 9.61 x 10^-4) in Jupiter's troposphere, assuming vertical mixing from the deep atmosphere is the only source of tropospheric CO.
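The timescale argument can be illustrated with a minimal sketch: the quench level is roughly where the mixing timescale tau_mix = L^2/Kzz first becomes shorter than the chemical interconversion timescale. The temperature profile, Arrhenius-style chemical timescale, mixing length, and Kzz below are placeholders, not the kinetics or atmosphere models used in the paper.

```python
import numpy as np

# Toy temperature-pressure profile (placeholder, not a real Gliese 229B model).
p_bar = np.logspace(3, -2, 400)             # 1000 bar up to 0.01 bar
T = 600.0 * p_bar**0.15 + 400.0             # crude monotonic T(p), in K

# Chemical interconversion timescale: generic Arrhenius-like stand-in,
# tau_chem ~ A * exp(Ea/T) / p (seconds); A and Ea are assumed constants.
A, Ea = 1e-6, 2.5e4
tau_chem = A * np.exp(Ea / T) / p_bar

# Mixing timescale tau_mix = L^2 / Kzz with an assumed mixing length and Kzz.
L_cm, Kzz = 1e7, 1e8                        # ~100 km mixing length, Kzz in cm^2/s
tau_mix = L_cm**2 / Kzz                     # ~1e6 s, constant in this sketch

# Quench level: shallowest pressure at which chemistry is still faster than mixing.
quenched = tau_chem > tau_mix
i_q = np.argmax(quenched)                   # first level where chemistry "freezes"
print(f"quench level ~ {p_bar[i_q]:.2f} bar at T ~ {T[i_q]:.0f} K")
```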
Uncertainty Estimation in Elastic Full Waveform Inversion by Utilising the Hessian Matrix
NASA Astrophysics Data System (ADS)
Hagen, V. S.; Arntsen, B.; Raknes, E. B.
2017-12-01
Elastic Full Waveform Inversion (EFWI) is a computationally intensive iterative method for estimating elastic model parameters. A key element of EFWI is the numerical solution of the elastic wave equation which lies as a foundation to quantify the mismatch between synthetic (modelled) and true (real) measured seismic data. The misfit between the modelled and true receiver data is used to update the parameter model to yield a better fit between the modelled and true receiver signal. A common approach to the EFWI model update problem is to use a conjugate gradient search method. In this approach the resolution and cross-coupling for the estimated parameter update can be found by computing the full Hessian matrix. Resolution of the estimated model parameters depend on the chosen parametrisation, acquisition geometry, and temporal frequency range. Although some understanding has been gained, it is still not clear which elastic parameters can be reliably estimated under which conditions. With few exceptions, previous analyses have been based on arguments using radiation pattern analysis. We use the known adjoint-state technique with an expansion to compute the Hessian acting on a model perturbation to conduct our study. The Hessian is used to infer parameter resolution and cross-coupling for different selections of models, acquisition geometries, and data types, including streamer and ocean bottom seismic recordings. Information about the model uncertainty is obtained from the exact Hessian, and is essential when evaluating the quality of estimated parameters due to the strong influence of source-receiver geometry and frequency content. Investigation is done on both a homogeneous model and the Gullfaks model where we illustrate the influence of offset on parameter resolution and cross-coupling as a way of estimating uncertainty.
Updates to the zoonotic niche map of Ebola virus disease in Africa
Pigott, David M; Millear, Anoushka I; Earl, Lucas; Morozoff, Chloe; Han, Barbara A; Shearer, Freya M; Weiss, Daniel J; Brady, Oliver J; Kraemer, Moritz UG; Moyes, Catherine L; Bhatt, Samir; Gething, Peter W; Golding, Nick; Hay, Simon I
2016-01-01
As the outbreak of Ebola virus disease (EVD) in West Africa is now contained, attention is turning from control to future outbreak prediction and prevention. Building on a previously published zoonotic niche map (Pigott et al., 2014), this study incorporates new human and animal occurrence data and expands upon the way in which potential bat EVD reservoir species are incorporated. This update demonstrates the potential for incorporating and updating data used to generate the predicted suitability map. A new data portal for sharing such maps is discussed. This output represents the most up-to-date estimate of the extent of EVD zoonotic risk in Africa. These maps can assist in strengthening surveillance and response capacity to contain viral haemorrhagic fevers. DOI: http://dx.doi.org/10.7554/eLife.16412.001 PMID:27414263
Model-Based Engine Control Architecture with an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Connolly, Joseph W.
2016-01-01
This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The non-linear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
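A minimal extended Kalman filter on a toy scalar nonlinear system, standing in for the on-board engine model, illustrates the predict/linearize/update cycle; the system, noise levels, and initial conditions are assumptions, not the C-MAPSS40k implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nonlinear scalar "engine" model: hidden state x (e.g., a health
# parameter), measurement y = h(x) + noise.
def f(x):            # state transition
    return x + 0.05 * np.sin(x)

def h(x):            # measurement model
    return x**2

Q, R = 1e-4, 1e-2    # process and measurement noise variances (assumed)

# Simulate truth and measurements.
n, x_true = 200, 1.0
xs, ys = [], []
for _ in range(n):
    x_true = f(x_true) + rng.normal(0, np.sqrt(Q))
    xs.append(x_true)
    ys.append(h(x_true) + rng.normal(0, np.sqrt(R)))

# Extended Kalman filter: linearize f and h about the current estimate.
x_hat, P = 0.8, 1.0
est = []
for y in ys:
    # Predict
    F = 1 + 0.05 * np.cos(x_hat)          # df/dx at the current estimate
    x_hat = f(x_hat)
    P = F * P * F + Q
    # Update
    H = 2 * x_hat                         # dh/dx at the predicted state
    S = H * P * H + R
    K = P * H / S
    x_hat = x_hat + K * (y - h(x_hat))
    P = (1 - K * H) * P
    est.append(x_hat)

rmse = np.sqrt(np.mean((np.array(est) - np.array(xs))**2))
print(f"EKF state RMSE: {rmse:.4f}")
```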
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... Perryman (2002) used similar methods to Wade (2002) to update the status assessment of ENP gray whales by...). Subsequently, Laake et al. (2009) reanalyzed all previous abundance data using methods consistent with Wade... time series of abundance estimates. Punt and Wade (2010) used methods similar to those described by...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-13
... exclusive economic zone (EEZ). NMFS previously determined the commercial ACL for yellowtail snapper would be... Atlantic EEZ at 12:01 a.m. on September 11, 2012. Updated landings estimates indicate the ACL will not be..., September 11, 2012, through December 31, 2012, the end of the fishing season, unless the ACL is reached...
Brownian motion with adaptive drift for remaining useful life prediction: Revisited
NASA Astrophysics Data System (ADS)
Wang, Dong; Tsui, Kwok-Leung
2018-01-01
Linear Brownian motion with constant drift is widely used in remaining useful life predictions because its first hitting time follows the inverse Gaussian distribution. State space modelling of linear Brownian motion was proposed to make the drift coefficient adaptive and incorporate on-line measurements into the first hitting time distribution. Here, the drift coefficient followed the Gaussian distribution, and it was iteratively estimated by using Kalman filtering once a new measurement was available. Then, to model nonlinear degradation, linear Brownian motion with adaptive drift was extended to nonlinear Brownian motion with adaptive drift. However, in previous studies, an underlying assumption used in the state space modelling was that in the update phase of Kalman filtering, the predicted drift coefficient at the current time exactly equalled the posterior drift coefficient estimated at the previous time, which caused a contradiction with the predicted drift coefficient evolution driven by an additive Gaussian process noise. In this paper, to alleviate such an underlying assumption, a new state space model is constructed. As a result, in the update phase of Kalman filtering, the predicted drift coefficient at the current time evolves from the posterior drift coefficient at the previous time. Moreover, the optimal Kalman filtering gain for iteratively estimating the posterior drift coefficient at any time is mathematically derived. A discussion that theoretically explains the main reasons why the constructed state space model can result in high remaining useful life prediction accuracies is provided. Finally, the proposed state space model and its associated Kalman filtering gain are applied to battery prognostics.
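A compact sketch of the idea, under simplifying assumptions: the drift coefficient is treated as a hidden state evolving as a random walk, each observed degradation increment is a noisy measurement of drift times dt, and the remaining useful life follows the inverse Gaussian first-hitting-time distribution for the remaining distance to the failure threshold. The parameter values are placeholders, not those of the paper or its battery data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a degradation path: linear Brownian motion with (unknown) drift.
dt, sigma, lam_true, w = 1.0, 0.4, 0.8, 100.0   # step, diffusion, drift, failure threshold
x, path = 0.0, [0.0]
while x < w:
    x += lam_true * dt + sigma * np.sqrt(dt) * rng.normal()
    path.append(x)
path = np.array(path)

# Kalman filtering of the drift coefficient: the drift is the hidden state,
# evolving as a random walk; each path increment is a noisy measurement of
# drift*dt. Q below is an assumed process-noise variance.
lam_hat, P, Q = 0.5, 1.0, 1e-4
R = sigma**2 * dt                    # measurement noise variance of an increment
for z in np.diff(path[:50]):         # use the first 50 increments
    # Predict (drift assumed to evolve as a random walk)
    P = P + Q
    # Update with measurement z = lam*dt + noise  (H = dt)
    K = P * dt / (dt * P * dt + R)
    lam_hat = lam_hat + K * (z - lam_hat * dt)
    P = (1 - K * dt) * P

# Remaining useful life from the inverse Gaussian first-hitting-time distribution.
d = w - path[49]                     # remaining distance to the threshold
mean_rul = d / lam_hat               # mean of the inverse Gaussian
t = np.linspace(1e-3, 4 * mean_rul, 1000)
pdf = d / np.sqrt(2 * np.pi * sigma**2 * t**3) * np.exp(-(d - lam_hat * t)**2 / (2 * sigma**2 * t))
print(f"estimated drift: {lam_hat:.3f} (true {lam_true})")
print(f"mean RUL: {mean_rul:.1f} steps; RUL pdf mode near t = {t[np.argmax(pdf)]:.1f}")
```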
Lives Saved Tool (LiST) costing: a module to examine costs and prioritize interventions.
Bollinger, Lori A; Sanders, Rachel; Winfrey, William; Adesina, Adebiyi
2017-11-07
Achieving the Sustainable Development Goals will require careful allocation of resources in order to achieve the highest impact. The Lives Saved Tool (LiST) has been used widely to calculate the impact of maternal, neonatal and child health (MNCH) interventions for program planning and multi-country estimation in several Lancet Series commissions. As use of the LiST model increases, many have expressed a desire to cost interventions within the model, in order to support budgeting and prioritization of interventions by countries. A limited LiST costing module was introduced several years ago, but with gaps in cost types. Updates to inputs have now been added to make the module fully functional for a range of uses. This paper builds on previous work that developed an initial version of the LiST costing module to provide costs for MNCH interventions using an ingredients-based costing approach. Here, we update in 2016 the previous econometric estimates from 2013 with newly-available data and also include above-facility level costs such as program management. The updated econometric estimates inform percentages of intervention-level costs for some direct costs and indirect costs. These estimates add to existing values for direct cost requirements for items such as drugs and supplies and required provider time which were already available in LiST Costing. Results generated by the LiST costing module include costs for each intervention, as well as disaggregated costs by intervention including drug and supply costs, labor costs, other recurrent costs, capital costs, and above-service delivery costs. These results can be combined with mortality estimates to support prioritization of interventions by countries. The LiST costing module provides an option for countries to identify resource requirements for scaling up a maternal, neonatal, and child health program, and to examine the financial impact of different resource allocation strategies. It can be a useful tool for countries as they seek to identify the best investments for scarce resources. The purpose of the LiST model is to provide a tool to make resource allocation decisions in a strategic planning process through prioritizing interventions based on resulting impact on maternal and child mortality and morbidity.
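An ingredients-based cost roll-up in the spirit of the module might look like the sketch below; the cost categories follow the abstract (drugs and supplies, labor, other recurrent, capital, and above-service-delivery costs), but every number, intervention, and markup structure is a placeholder.

```python
# Sketch of an ingredients-based intervention cost; all values are placeholders.
interventions = {
    # name: (drugs & supplies per case, provider minutes per case, cases per year)
    "oral rehydration": (0.50, 10, 200_000),
    "antenatal care visit": (3.20, 20, 150_000),
}
wage_per_minute = 0.15          # assumed provider wage
other_recurrent_pct = 0.20      # assumed markup on direct costs
capital_pct = 0.10              # assumed markup on direct costs
above_service_pct = 0.25        # assumed program management share

total = 0.0
for name, (supplies, minutes, cases) in interventions.items():
    direct = cases * (supplies + minutes * wage_per_minute)
    service_delivery = direct * (1 + other_recurrent_pct + capital_pct)
    cost = service_delivery * (1 + above_service_pct)
    total += cost
    print(f"{name:>22}: ${cost:,.0f}")
print(f"{'total':>22}: ${total:,.0f}")
```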
Update of the α-n Yields for Reactor Fuel Materials for the Interest of Nuclear Safeguards
NASA Astrophysics Data System (ADS)
Simakov, S. P.; van den Berg, Q. Y.
2017-01-01
The neutron yields caused by spontaneous α-decay of actinides and subsequent (α,xn) reactions were re-evaluated for the reactor fuel materials UO2, UF6, PuO2 and PuF4. For this purpose, the most recent reference data for decay parameters, α-particle stopping powers and (α,xn) cross sections were collected, analysed and used in calculations. The input data and elaborated code were validated against available thick target neutron yields in pure and compound materials measured at accelerators or with radioactive sources. This paper provides the specific neutron yields and their uncertainties resultant from α-decay of actinides 241Am, 249Bk, 252Cf, 242,244Cm, 237Np, 238-242Pu, 232Th and 232-236,238U in oxide and fluoride compounds. The obtained results are an update of previous reference tables issued by the Los Alamos National Laboratory in 1991 which were used for the safeguarding of radioactive materials by passive non-destructive techniques. The comparison of the updated values with previous ones shows an agreement within one estimated uncertainty (≈ 10%) for oxides, and deviations of up to 50% for fluorides.
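The thick-target yield underlying such tables is the integral of the (α,n) cross section over the slowing-down path of the alpha particle, Y(E0) = ∫ N_t σ(E)/S(E) dE. The sketch below evaluates that integral numerically with smooth placeholder curves for σ and the stopping power; it is not based on the evaluated nuclear data used in the paper.

```python
import numpy as np

# Thick-target (alpha,n) yield per alpha particle:
#     Y(E0) = integral_0^E0  N_t * sigma(E) / S(E)  dE
# where N_t is the light-element atom density, sigma the (alpha,n) cross
# section and S = -dE/dx the stopping power. Both curves are toy stand-ins.
E = np.linspace(0.0, 5.5, 551)                 # alpha energy grid, MeV

def sigma_barn(E):                             # toy (alpha,n) excitation function
    return np.where(E > 3.0, 0.05 * (E - 3.0) ** 2, 0.0)

def stopping_MeV_per_cm(E):                    # toy stopping power in the compound
    return 800.0 / (0.3 + E)

N_t = 4.0e22                                   # target atoms per cm^3 (assumed)
integrand = N_t * sigma_barn(E) * 1e-24 / stopping_MeV_per_cm(E)
Y = np.trapz(integrand, E)                     # neutrons per 5.5 MeV alpha
alpha_activity = 1e6                           # alpha decays per second (assumed)
print(f"yield: {Y:.2e} n/alpha  ->  {Y * alpha_activity:.2f} n/s for 1e6 alpha/s")
```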
Chandran, Anil Kumar Nalini; Yoo, Yo-Han; Cao, Peijian; Sharma, Rita; Sharma, Manoj; Dardick, Christopher; Ronald, Pamela C; Jung, Ki-Hong
2016-12-01
Protein kinases catalyze the transfer of a phosphate moiety from a phosphate donor to the substrate molecule, thus playing critical roles in cell signaling and metabolism. Although plant genomes contain more than 1000 genes that encode kinases, knowledge is limited about the function of each of these kinases. A major obstacle that hinders progress towards kinase characterization is functional redundancy. To address this challenge, we previously developed the rice kinase database (RKD) that integrated omics-scale data within a phylogenetics context. An updated version of the rice kinase database (RKD 2.0) that contains metadata derived from NCBI GEO expression datasets has been developed. RKD 2.0 facilitates in-depth transcriptomic analyses of kinase-encoding genes in diverse rice tissues and in response to biotic and abiotic stresses and hormone treatments. We identified 261 kinases specifically expressed in particular tissues, 130 that are significantly up-regulated in response to biotic stress, 296 in response to abiotic stress, and 260 in response to hormones. Based on this update and Pearson correlation coefficient (PCC) analysis, we estimated that 19 out of 26 genes characterized through loss-of-function studies confer dominant functions. These were selected because they either had paralogous members with PCC values of <0.5 or had no paralog. Compared with the previous version of RKD, RKD 2.0 enables more effective estimations of functional redundancy or dominance because it uses comprehensive expression profiles rather than individual profiles. The integrated analysis of RKD with PCC establishes a single platform for researchers to select rice kinases for functional analyses.
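The paralog screen described above can be sketched as follows: a kinase is a candidate for a dominant function if its expression profile has PCC < 0.5 with every paralog, or if it has no paralog. Gene names and expression values here are invented placeholders, not RKD 2.0 data.

```python
import numpy as np
import pandas as pd

# Placeholder expression matrix (genes x tissues) and paralog assignments.
expr = pd.DataFrame(
    np.random.default_rng(3).lognormal(size=(4, 6)),
    index=["KinA", "KinA_paralog1", "KinA_paralog2", "KinB"],
    columns=[f"tissue_{i}" for i in range(6)],
)
paralogs = {"KinA": ["KinA_paralog1", "KinA_paralog2"], "KinB": []}

for gene, pars in paralogs.items():
    if not pars:
        print(f"{gene}: no paralog -> candidate for dominant function")
        continue
    pccs = [np.corrcoef(expr.loc[gene], expr.loc[p])[0, 1] for p in pars]
    dominant = all(r < 0.5 for r in pccs)
    print(f"{gene}: max PCC with paralogs = {max(pccs):.2f} -> "
          f"{'candidate for dominant function' if dominant else 'likely redundant'}")
```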
Retrospective Assessment of Cost Savings From Prevention
Grosse, Scott D.; Berry, Robert J.; Tilford, J. Mick; Kucik, James E.; Waitzman, Norman J.
2016-01-01
Introduction Although fortification of food with folic acid has been calculated to be cost saving in the U.S., updated estimates are needed. This analysis calculates new estimates from the societal perspective of net cost savings per year associated with mandatory folic acid fortification of enriched cereal grain products in the U.S. that was implemented during 1997–1998. Methods Estimates of annual numbers of live-born spina bifida cases in 1995–1996 relative to 1999–2011 based on birth defects surveillance data were combined during 2015 with published estimates of the present value of lifetime direct costs updated in 2014 U.S. dollars for a live-born infant with spina bifida to estimate avoided direct costs and net cost savings. Results The fortification mandate is estimated to have reduced the annual number of U.S. live-born spina bifida cases by 767, with a lower-bound estimate of 614. The present value of mean direct lifetime cost per infant with spina bifida is estimated to be $791,900, or $577,000 excluding caregiving costs. Using a best estimate of numbers of avoided live-born spina bifida cases, fortification is estimated to reduce the present value of total direct costs for each year's birth cohort by $603 million more than the cost of fortification. A lower-bound estimate of cost savings using conservative assumptions, including the upper-bound estimate of fortification cost, is $299 million. Conclusions The estimates of cost savings are larger than previously reported, even using conservative assumptions. The analysis can also inform assessments of folic acid fortification in other countries. PMID:26790341
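The headline arithmetic can be reproduced approximately as below. The avoided-case counts and per-case costs come from the abstract; the two fortification-cost figures are placeholders chosen only so that the sketch lands on the quoted $603 million and $299 million net savings.

```python
# Back-of-the-envelope version of the cost-savings calculation in the abstract.
# Best estimate: all avoided cases valued at the full lifetime direct cost.
gross_best = 767 * 791_900
# Lower bound: fewer avoided cases, caregiving costs excluded.
gross_low = 614 * 577_000
# Placeholder fortification costs (lower and upper bounds assumed here).
fort_cost_low, fort_cost_high = 4_000_000, 55_000_000

print(f"best-estimate net savings: ${(gross_best - fort_cost_low) / 1e6:,.0f}M per birth cohort")
print(f"lower-bound net savings:   ${(gross_low - fort_cost_high) / 1e6:,.0f}M per birth cohort")
```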
Stellar Parameters for Trappist-1
NASA Astrophysics Data System (ADS)
Van Grootel, Valérie; Fernandes, Catarina S.; Gillon, Michael; Jehin, Emmanuel; Manfroid, Jean; Scuflaire, Richard; Burgasser, Adam J.; Barkaoui, Khalid; Benkhaldoun, Zouhair; Burdanov, Artem; Delrez, Laetitia; Demory, Brice-Olivier; de Wit, Julien; Queloz, Didier; Triaud, Amaury H. M. J.
2018-01-01
TRAPPIST-1 is an ultracool dwarf star transited by seven Earth-sized planets, for which thorough characterization of atmospheric properties, surface conditions encompassing habitability, and internal compositions is possible with current and next-generation telescopes. Accurate modeling of the star is essential to achieve this goal. We aim to obtain updated stellar parameters for TRAPPIST-1 based on new measurements and evolutionary models, compared to those used in discovery studies. We present a new measurement for the parallax of TRAPPIST-1, 82.4 ± 0.8 mas, based on 188 epochs of observations with the TRAPPIST and Liverpool Telescopes from 2013 to 2016. This revised parallax yields an updated luminosity of L* = (5.22 ± 0.19) × 10^-4 L⊙, which is very close to the previous estimate but almost two times more precise. We next present an updated estimate for the TRAPPIST-1 stellar mass, based on two approaches: mass from stellar evolution modeling, and empirical mass derived from dynamical masses of equivalently classified ultracool dwarfs in astrometric binaries. We combine them using a Monte-Carlo approach to derive a semi-empirical estimate for the mass of TRAPPIST-1. We also derive an estimate for the radius by combining this mass with the stellar density inferred from transits, as well as an estimate for the effective temperature from our revised luminosity and radius. Our final results are M* = 0.089 ± 0.006 M⊙, R* = 0.121 ± 0.003 R⊙, and Teff = 2516 ± 41 K. Considering the degree to which the TRAPPIST-1 system will be scrutinized in coming years, these revised and more precise stellar parameters should be considered when assessing the properties of TRAPPIST-1 planets.
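A sketch of the Monte-Carlo combination step: sample the mass, a transit-derived stellar density, and the luminosity, then propagate to radius and effective temperature. The density value and its uncertainty below are back-solved placeholders, not the measured inputs of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
Lsun, Rsun, sigma_sb = 3.828e26, 6.957e8, 5.670e-8   # SI constants

n = 100_000
M = rng.normal(0.089, 0.006, n)            # M*/Msun (semi-empirical mass estimate)
rho = rng.normal(50.2, 2.0, n)             # rho*/rho_sun (assumed placeholder)
L = rng.normal(5.22e-4, 0.19e-4, n)        # L*/Lsun (from the revised parallax)

R = (M / rho) ** (1.0 / 3.0)               # R*/Rsun from mass and mean density
Teff = (L * Lsun / (4 * np.pi * sigma_sb * (R * Rsun) ** 2)) ** 0.25

print(f"R*   = {R.mean():.3f} +/- {R.std():.3f} Rsun")
print(f"Teff = {Teff.mean():.0f} +/- {Teff.std():.0f} K")
```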
Identification of transmissivity fields using a Bayesian strategy and perturbative approach
NASA Astrophysics Data System (ADS)
Zanini, Andrea; Tanda, Maria Giovanna; Woodbury, Allan D.
2017-10-01
The paper deals with the crucial problem of groundwater parameter estimation, which is the basis for efficient modeling and reclamation activities. A hierarchical Bayesian approach is developed: it uses Akaike's Bayesian Information Criterion in order to estimate the hyperparameters (related to the covariance model chosen) and to quantify the unknown noise variance. The transmissivity identification proceeds in two steps: the first, called empirical Bayesian interpolation, uses Y* (Y = lnT) observations to interpolate Y values on a specified grid; the second, called empirical Bayesian update, improves the previous Y estimate through the addition of hydraulic head observations. The relationship between the head and the lnT has been linearized through a perturbative solution of the flow equation. In order to test the proposed approach, synthetic aquifers from the literature have been considered. The aquifers in question contain a variety of boundary conditions (both Dirichlet and Neumann type) and scales of heterogeneities (σY² = 1.0 and σY² = 5.3). The estimated transmissivity fields were compared to the true one. The joint use of Y* and head measurements improves the estimation of Y for both degrees of heterogeneity. Although the variance of the strongly heterogeneous transmissivity field can be considered high for the application of the perturbative approach, the results show the same order of approximation as the non-linear methods proposed in the literature. The procedure allows computing the posterior probability distribution of the target quantities and quantifying the uncertainty in the model prediction. Bayesian updating has advantages relative to both Monte-Carlo (MC) and non-MC approaches. In fact, like MC methods, Bayesian updating computes the posterior probability distribution of the target quantities directly, and like non-MC methods it has computational times on the order of seconds.
Jégu, Jérémie; Belot, Aurélien; Borel, Christian; Daubisse-Marliac, Laetitia; Trétarre, Brigitte; Ganry, Olivier; Guizard, Anne-Valérie; Bara, Simona; Troussard, Xavier; Bouvier, Véronique; Woronoff, Anne-Sophie; Colonna, Marc; Velten, Michel
2015-05-01
To provide head and neck squamous cell carcinoma (HNSCC) survival estimates with respect to patients' previous history of cancer. Data from ten French population-based cancer registries were used to establish a cohort of all male patients presenting with a HNSCC diagnosed between 1989 and 2004. Vital status was updated until December 31, 2007. The 5-year overall and net survival estimates were assessed using the Kaplan-Meier and Pohar-Perme estimators, respectively. Multivariate Cox regression models were used to assess the effect of cancer history adjusted for age and year of HNSCC diagnosis. Among the cases of HNSCC, 5553 were localized in the oral cavity, 3646 in the oropharynx, 3793 in the hypopharynx and 4550 in the larynx. From 11.0% to 16.8% of patients presented with a previous history of cancer, depending on HNSCC site. Overall and net survival were closely tied to the presence, or not, of a previous cancer. For example, for carcinoma of the oral cavity, the five-year overall survival was 14.0%, 5.9% and 36.7% in cases of previous lung cancer, oesophageal cancer, or no cancer history, respectively. Multivariate analyses showed that previous history of cancer was a prognostic factor independent of age and year of diagnosis (p<.001). Previous history of cancer is strongly associated with survival among HNSCC patients. Survival estimates based on patients' previous history of cancer will enable clinicians to assess more precisely the prognosis of their patients with respect to this major comorbid condition. Copyright © 2015 Elsevier Ltd. All rights reserved.
The role of spatial memory and frames of reference in the precision of angular path integration.
Arthur, Joeanna C; Philbeck, John W; Kleene, Nicholas J; Chichka, David
2012-09-01
Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatial memory is particularly likely in spatial updating tasks in which one's self-location estimate is referenced to external space. To test this idea, we administered passive, non-visual body rotations (ranging 40°-140°) about the yaw axis and asked participants to use verbal reports or open-loop manual pointing to indicate the magnitude of the rotation. Prior to some trials, previews of the surrounding environment were given. We found that when participants adopted an egocentric frame of reference, the previously-observed benefit of previews on within-subject response precision was not manifested, regardless of whether remembered spatial frameworks were derived from vision or spatial language. We conclude that the powerful effect of spatial memory is dependent on one's frame of reference during self-motion updating. Copyright © 2012 Elsevier B.V. All rights reserved.
Valdez-Flores, Ciriaco; Sielken, Robert L; Teta, M Jane
2010-04-01
The most recent epidemiological data on individual workers in the NIOSH and updated UCC occupational studies have been used to characterize the potential excess cancer risks of environmental exposure to ethylene oxide (EO). In addition to refined analyses of the separate cohorts, power has been increased by analyzing the combined cohorts. In previous SMR analyses of the separate studies and the present analyses of the updated and pooled studies of over 19,000 workers, none of the SMRs for any combination of the 12 cancer endpoints and six sub-cohorts analyzed were statistically significantly greater than one, including those of greatest previous interest: leukemia, lymphohematopoietic tissue, lymphoid tumors, NHL, and breast cancer. In our study, no evidence of a positive cumulative exposure-response relationship was found. Fitted Cox proportional hazards models with cumulative EO exposure do not have statistically significant positive slopes. The lack of increasing trends was corroborated by categorical analyses. Cox model estimates of the concentrations corresponding to a 1-in-a-million extra environmental cancer risk are all greater than approximately 1 ppb and are more than 1500-fold greater than the 0.4 ppt estimate in the 2006 EPA draft IRIS risk assessment. The reasons for this difference are identified and discussed. Copyright 2009 Elsevier Inc. All rights reserved.
Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W
2015-10-01
Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the considered example data, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for the assessment of the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
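The core of such an update is inverse-variance pooling with one additional study appended; sweeping the hypothetical study's effect and precision traces out how far the pooled estimate can move. The log-odds-ratio inputs below are invented, not the statin/acute-kidney-injury data.

```python
import numpy as np

# Fixed-effect inverse-variance pooling, before and after adding one more study.
yi = np.array([0.10, 0.25, -0.05, 0.18])     # study effect estimates (log OR, placeholders)
vi = np.array([0.04, 0.02, 0.05, 0.03])      # their variances (placeholders)
wi = 1 / vi
pooled = np.sum(wi * yi) / np.sum(wi)
se = np.sqrt(1 / np.sum(wi))
print(f"current pooled effect: {pooled:.3f} (SE {se:.3f})")

# A hypothetical future study: sweep its effect and precision to see how much
# it could move the pooled estimate (the idea behind the augmentation contours).
for y_new, v_new in [(0.40, 0.02), (0.0, 0.02), (-0.20, 0.05)]:
    w = np.append(wi, 1 / v_new)
    y = np.append(yi, y_new)
    upd = np.sum(w * y) / np.sum(w)
    print(f"  adding y={y_new:+.2f} (var {v_new}): pooled -> {upd:.3f}")
```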
An Investigation Into the Effects of Frequency Response Function Estimators on Model Updating
NASA Astrophysics Data System (ADS)
Ratcliffe, M. J.; Lieven, N. A. J.
1999-03-01
Model updating is a very active research field, in which significant effort has been invested in recent years. Model updating methodologies are invariably successful when used on noise-free simulated data, but tend to be unpredictable when presented with real experimental data that are unavoidably corrupted with uncorrelated noise content. In the development and validation of model-updating strategies, a random zero-mean Gaussian variable is added to simulated test data to tax the updating routines more fully. This paper proposes a more sophisticated model for experimental measurement noise, and this is used in conjunction with several different frequency response function estimators, from the classical H1 and H2 to more refined estimators that purport to be unbiased. Finite-element model case studies, in conjunction with a genuine experimental test, suggest that the proposed noise model is a more realistic representation of experimental noise phenomena. The choice of estimator is shown to have a significant influence on the viability of the FRF sensitivity method. These test cases find that the use of the H2 estimator for model updating purposes is contraindicated, and that there is no advantage to be gained by using the sophisticated estimators over the classical H1 estimator.
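For reference, the classical estimators are H1 = Gxy/Gxx and H2 = Gyy/Gyx, which are biased in opposite directions by input and output measurement noise. The sketch below forms both on a simulated band-pass system with noise on both channels; the system and noise levels are assumptions, not the paper's noise model.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)

# Simulate a band-pass "structure" driven by random excitation, with
# uncorrelated noise on both the input and output channels.
fs, n = 1024, 2**16
x = rng.normal(size=n)                                   # input force
b, a = signal.butter(2, [0.05, 0.15], btype="band")      # stand-in structural FRF
y = signal.lfilter(b, a, x)                              # response
x_meas = x + 0.1 * rng.normal(size=n)                    # input measurement noise
y_meas = y + 0.1 * rng.normal(size=n)                    # output measurement noise

f, Gxx = signal.welch(x_meas, fs=fs, nperseg=2048)
_, Gyy = signal.welch(y_meas, fs=fs, nperseg=2048)
_, Gxy = signal.csd(x_meas, y_meas, fs=fs, nperseg=2048)

H1 = Gxy / Gxx                # biased down by input noise
H2 = Gyy / np.conj(Gxy)       # biased up by output noise
coh = np.abs(Gxy) ** 2 / (Gxx * Gyy)
print("peak |H1|:", np.abs(H1).max().round(3), " peak |H2|:", np.abs(H2).max().round(3))
print("coherence at the |H1| peak:", coh[np.argmax(np.abs(H1))].round(3))
```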
Contraceptive failure in the United States
Trussell, James
2013-01-01
This review provides an update of previous estimates of first-year probabilities of contraceptive failure for all methods of contraception available in the United States. Estimates are provided of probabilities of failure during typical use (which includes both incorrect and inconsistent use) and during perfect use (correct and consistent use). The difference between these two probabilities reveals the consequences of imperfect use; it depends both on how unforgiving of imperfect use a method is and on how hard it is to use that method perfectly. These revisions reflect new research on contraceptive failure both during perfect use and during typical use. PMID:21477680
Ochratoxin A in Portugal: A Review to Assess Human Exposure
Duarte, Sofia C.; Pena, Angelina; Lino, Celeste M.
2010-01-01
In Portugal, the climate, dietary habits, and food contamination levels present the characteristics for higher population susceptibility to ochratoxin A (OTA), one of the known mycotoxins with the greatest public health and agro-economic importance. In this review, following a brief historical insight on OTA research, a summary of the available data on OTA occurrence in food (cereals, bread, wine, meat) and biological fluids (blood, urine) is made. With this data, an estimation of intake is made to ascertain and update the risk exposure estimation of the Portuguese population, in comparison to previous studies and other populations. PMID:22069635
Online Updating of Statistical Inference in the Big Data Setting.
Schifano, Elizabeth D; Wu, Jing; Wang, Chun; Yan, Jun; Chen, Ming-Hui
2016-01-01
We present statistical methods for big data arising from online analytical processing, where large amounts of data arrive in streams and require fast analysis without storage/access to the historical data. In particular, we develop iterative estimating algorithms and statistical inferences for linear models and estimating equations that update as new data arrive. These algorithms are computationally efficient, minimally storage-intensive, and allow for possible rank deficiencies in the subset design matrices due to rare-event covariates. Within the linear model setting, the proposed online-updating framework leads to predictive residual tests that can be used to assess the goodness-of-fit of the hypothesized model. We also propose a new online-updating estimator under the estimating equation setting. Theoretical properties of the goodness-of-fit tests and proposed estimators are examined in detail. In simulation studies and real data applications, our estimator compares favorably with competing approaches under the estimating equation setting.
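A minimal version of online updating for a linear model keeps only the accumulated X'X and X'y, so each arriving block refreshes the estimate without touching stored history. The sketch below is that bare idea, not the paper's full framework of predictive residual tests or estimating-equation updates.

```python
import numpy as np

rng = np.random.default_rng(6)

# Online least squares: accumulate sufficient statistics block by block.
p = 3
beta_true = np.array([1.0, -2.0, 0.5])
XtX = np.zeros((p, p))
Xty = np.zeros(p)

for block in range(50):                       # 50 arriving data blocks
    X = rng.normal(size=(100, p))
    y = X @ beta_true + rng.normal(scale=0.5, size=100)
    XtX += X.T @ X                            # update sufficient statistics
    Xty += X.T @ y
    beta_hat = np.linalg.solve(XtX, Xty)      # current cumulative estimate

sigma2 = 0.25                                 # known noise variance, for illustration
cov = sigma2 * np.linalg.inv(XtX)
print("final estimate:", beta_hat.round(3))
print("standard errors:", np.sqrt(np.diag(cov)).round(4))
```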
Who will have health insurance in the future? An updated projection.
Young, Richard A; DeVoe, Jennifer E
2012-01-01
The passage of the 2010 Patient Protection and Affordable Care Act (PPACA) in the United States put the issues of health care reform and health care costs back in the national spotlight. DeVoe and colleagues previously estimated that the cost of a family health insurance premium would equal the median household income by the year 2025. A slowdown in health care spending tied to the recent economic downturn and the passage of the PPACA occurred after this model was published. In this updated model, we estimate that this threshold will be crossed in 2033, and under favorable assumptions the PPACA may extend this date only to 2037. Continuing to make incremental changes in US health policy will likely not bend the cost curve, which has eluded policy makers for the past 50 years. Private health insurance will become increasingly unaffordable to low-to-middle-income Americans unless major changes are made in the US health care system.
SOLID WASTE INTEGRATED FORECAST TECHNICAL (SWIFT) REPORT FY2005 THRU FY2035 2005.0 VOLUME 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
BARCOT, R.A.
This report provides up-to-date life cycle information about the radioactive solid waste expected to be managed by Hanford's Waste Management (WM) Project from onsite and offsite generators. It includes: (1) an overview of Hanford-wide solid waste to be managed by the WM Project; (2) multi-level and waste class-specific estimates; (3) background information on waste sources; and (4) comparisons to previous forecasts and other national data sources. The focus of this report is low-level waste (LLW), mixed low-level waste (MLLW), and transuranic waste, both non-mixed and mixed (TRU(M)). Some details on hazardous waste are also provided; however, this information is not considered comprehensive. This report includes data requested in December 2004, with updates through March 31, 2005. The data represent a life cycle forecast covering all reported activities from FY2005 through the end of each program's life cycle and are an update of the previous FY2004.1 data version.
NASA Astrophysics Data System (ADS)
Khaire, Vikram; Srianand, Raghunathan
2016-01-01
In the standard picture, the only sources of cosmic UV background are the quasars and the star forming galaxies. The hydrogen ionizing emissivity from galaxies depends on a parameter known as escape fraction (fesc). It is the ratio of the escaping hydrogen ionizing photons from galaxies to the total produced by their stellar population. Using available multi-wavelength and multi-epoch galaxy luminosity function measurements, we update the galaxy emissivity by estimating a self-consistent combination of the star formation rate density and dust attenuation. Using the recent quasar luminosity function measurements, we present an updated hydrogen ionizing emissivity from quasars which shows a factor of ~2 increase as compared to the previous estimates at z<2. We use these in a cosmological radiative transfer code developed by us to generate the UV background and show that the recently inferred high values of hydrogen photoionization rates at low redshifts can be easily obtained with reasonable values of fesc. This resolves the problem of 'photon underproduction crisis' and shows that there is no need to invoke non-standard sources of the UV background such as decaying dark matter particles. We will present the implications of this updated quasar and galaxy emissivity on the values of fesc at high redshifts and on the cosmic reionization. We will also present the effect of the updated UV background on the inferred properties of the intergalactic medium, especially on the Lyman alpha forest and the metal line absorption systems.
Spectroscopic orbits of nearby solar-type dwarfs - II.
NASA Astrophysics Data System (ADS)
Gorynya, N. A.; Tokovinin, A.
2018-03-01
Several nearby solar-type dwarfs with variable radial velocity were monitored to find their spectroscopic orbits. First orbital elements of 15 binaries (HIP 12144, 17895, 27970, 32329, 38636, 39072, 40479, 43004, 73700, 79234, 84696, 92140, 88656, 104514, and 112222) are determined. The previously known orbits of HIP 5276, 21443, 28678, and 41214 are confirmed and updated. The orbital periods range from 2 d to 4 yr. There are eight hierarchical systems with additional distant companions among those 19 stars. The outer visual orbit of the triple system HIP 17895 is updated and the masses of all its components are estimated. We provide radial velocities of another 16 Hipparcos stars without orbital solutions, some of those with long periods or false claims of variability.
Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.
2001-01-01
A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between cross sections, and can generate working maps across a broad range of scales, for any selected area, and overlayed with easily updated cultural features. Local governments are aggressively collecting very-high-accuracy elevation data for numerous reasons; this not only lowers the cost and increases accuracy of flood maps, but also inherently boosts the level of community involvement in the mapping process. These elevation data are also ideal for hydraulic modeling, should an existing model be judged inadequate.
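The core of the stage-updating step can be sketched as a rating-curve lookup: build a stage-discharge relation at each cross section from the archived hydraulic-model runs, then interpolate the stage for the recalculated flood discharge. All archived values and the updated discharge below are invented placeholders.

```python
import numpy as np

# Archived hydraulic-model output: stage at each cross section for several
# modeled discharges (placeholder values).
archived_discharge = np.array([5_000, 10_000, 20_000, 40_000])   # cfs
archived_stage = {                                               # ft, by cross section
    "XS-1": np.array([101.2, 103.5, 106.1, 109.0]),
    "XS-2": np.array([98.7, 100.4, 102.6, 105.3]),
}

q100_updated = 26_500     # recalculated 100-year discharge (placeholder)
for xs, stages in archived_stage.items():
    # Interpolate in log-discharge space, a common choice for rating curves.
    stage = np.interp(np.log(q100_updated), np.log(archived_discharge), stages)
    print(f"{xs}: updated 100-year flood stage ~ {stage:.1f} ft")
```

A GIS would then intersect these updated stages with the high-accuracy terrain model to delineate the inundated area automatically.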
Cohn, T.A.; Lane, W.L.; Baier, W.G.
1997-01-01
This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
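A simplified EMA-style iteration, fitting a normal distribution to log-flows instead of the log-Pearson type III used in the paper, conveys the structure of the algorithm: combine systematic data, measured historical peaks, and the expected moments of the censored below-threshold years, then iterate to convergence. All parameter values and record lengths are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic records: a fully observed systematic gage record plus a historical
# period in which only peaks above a perception threshold were recorded.
mu_true, sd_true, thresh = 9.0, 0.8, 10.5       # log-space parameters (assumed)
sys_logq = rng.normal(mu_true, sd_true, 60)     # 60 yr systematic record
hist = rng.normal(mu_true, sd_true, 150)        # 150 yr historical period
hist_peaks = hist[hist > thresh]                # only large floods were recorded
n_below = (hist <= thresh).sum()                # years known only to be < threshold

mu, sd = sys_logq.mean(), sys_logq.std(ddof=1)  # initial estimates from systematic data
for _ in range(100):
    a = (thresh - mu) / sd
    lam = stats.norm.pdf(a) / stats.norm.cdf(a)
    # Expected first and second moments of a below-threshold (censored) year,
    # given the current parameter estimates.
    m1 = mu - sd * lam
    var_c = sd**2 * (1 - a * lam - lam**2)
    m2 = var_c + m1**2
    # Combine systematic data, measured historical peaks and expected moments.
    n = len(sys_logq) + len(hist_peaks) + n_below
    s1 = sys_logq.sum() + hist_peaks.sum() + n_below * m1
    s2 = (sys_logq**2).sum() + (hist_peaks**2).sum() + n_below * m2
    mu_new = s1 / n
    sd_new = np.sqrt((s2 - n * mu_new**2) / (n - 1))
    if abs(mu_new - mu) < 1e-10 and abs(sd_new - sd) < 1e-10:
        break
    mu, sd = mu_new, sd_new

print(f"EMA-style estimates: mu={mu:.3f}, sd={sd:.3f} (truth {mu_true}, {sd_true})")
print(f"estimated 100-year log-flow: {stats.norm.ppf(0.99, mu, sd):.2f}")
```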
Use of indexing to update United States annual timber harvest by state
James Howard; Enrique Quevedo; Andrew Kramp
2009-01-01
This report provides an index method that can be used to update recent estimates of timber harvest by state to a common current year and to make 5-year projections. The Forest Service Forest Inventory and Analysis (FIA) program makes estimates of harvest for each state in differing years. The purpose of this updating method is to bring each state-level estimate up to a common current year.
The Site-Scale Saturated Zone Flow Model for Yucca Mountain
NASA Astrophysics Data System (ADS)
Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.
2006-12-01
This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework, which was updated throughout the model domain. In addition, faults are much better represented using the 250 × 250-m spacing (compared to the previous model's 500 × 500-m spacing). Data collected since the previous model calibration effort have been included, and they comprise all Nye County water-level data through Phase IV of their Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Vision Based Localization in Urban Environments
NASA Technical Reports Server (NTRS)
McHenry, Michael; Cheng, Yang; Matthies, Larry
2005-01-01
As part of DARPA's MARS2020 program, the Jet Propulsion Laboratory developed a vision-based system for localization in urban environments that requires neither GPS nor active sensors. System hardware consists of a pair of small FireWire cameras and a standard Pentium-based computer. The inputs to the software system consist of: 1) a crude grid-based map describing the positions of buildings, 2) an initial estimate of robot location and 3) the video streams produced by each camera. At each step during the traverse the system: captures new image data, finds image features hypothesized to lie on the outside of a building, computes the range to those features, determines an estimate of the robot's motion since the previous step and combines that data with the map to update a probabilistic representation of the robot's location. This probabilistic representation allows the system to simultaneously represent multiple possible locations. For our testing, we have derived the a priori map manually using non-orthorectified overhead imagery, although this process could be automated. The software system consists of two primary components. The first is the vision system, which uses binocular stereo ranging together with a set of heuristics to identify features likely to be part of building exteriors and to compute an estimate of the robot's motion since the previous step. The resulting visual features and the associated range measurements are then fed to the second primary software component, a particle-filter based localization system. This system uses the map and the most recent results from the vision system to update the estimate of the robot's location. This report summarizes the design of both the hardware and software and includes the results of applying the system to the global localization of a robot over an approximately half-kilometer traverse across JPL's Pasadena campus.
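As a rough illustration of the localization component described above, the following sketch implements a bare-bones 2-D particle filter with placeholder motion and range models; it is not JPL's software, and the map, noise levels, and feature geometry are invented for illustration.

```python
# Minimal particle-filter localization sketch (2-D position only; all models are placeholders).
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, d_pos, motion_noise=0.1):
    """Propagate particles by the visual-odometry motion estimate d_pos = (dx, dy)."""
    return particles + d_pos + rng.normal(0.0, motion_noise, particles.shape)

def update_weights(particles, measured_range, building_xy, range_sigma=0.5):
    """Weight particles by how well the predicted range to a mapped building
    matches the stereo range measurement."""
    pred = np.linalg.norm(particles - building_xy, axis=1)
    w = np.exp(-0.5 * ((pred - measured_range) / range_sigma) ** 2)
    return w / w.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# one filter step: motion update, measurement update, resampling
particles = rng.uniform(0, 50, size=(1000, 2))          # initial location uncertainty
particles = predict(particles, d_pos=np.array([1.0, 0.2]))
weights = update_weights(particles, measured_range=12.3, building_xy=np.array([20.0, 5.0]))
particles = resample(particles, weights)
print("location estimate:", particles.mean(axis=0))
```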
Geothermal resources and reserves in Indonesia: an updated revision
NASA Astrophysics Data System (ADS)
Fauzi, A.
2015-02-01
More than 300 high- to low-enthalpy geothermal sources have been identified throughout Indonesia. From the early 1980s until the late 1990s, the geothermal potential for power production in Indonesia was estimated to be about 20 000 MWe. The most recent estimate exceeds 29 000 MWe derived from the 300 sites (Geological Agency, December 2013). This resource estimate has been obtained by adding all of the estimated geothermal potential resources and reserves classified as "speculative", "hypothetical", "possible", "probable", and "proven" from all sites where such information is available. However, this approach to estimating the geothermal potential is flawed because it includes double counting of some reserve estimates as resource estimates, thus giving an inflated figure for the total national geothermal potential. This paper describes an updated revision of the geothermal resource estimate in Indonesia using a more realistic methodology. The methodology proposes that the preliminary "Speculative Resource" category should cover the full potential of a geothermal area and form the base reference figure for the resource of the area. Further investigation of this resource may improve the level of confidence of the category of reserves but will not necessarily increase the figure of the "preliminary resource estimate" as a whole, unless the result of the investigation is higher. A previous paper (Fauzi, 2013a, b) redefined and revised the geothermal resource estimate for Indonesia. The methodology, adopted from Fauzi (2013a, b), will be fully described in this paper. As a result of using the revised methodology, the potential geothermal resources and reserves for Indonesia are estimated to be about 24 000 MWe, some 5000 MWe less than the 2013 national estimate.
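The revised aggregation rule described above can be illustrated with a toy calculation. The sketch below assumes the interpretation that an area's speculative estimate is its base figure and is replaced by a later, higher-confidence figure only when that figure is larger; all site names and numbers are made up.

```python
# Hedged sketch of the aggregation idea: reserves are not added on top of the speculative
# resource they came from; a later estimate replaces the base figure only if it is higher.
areas = [
    {"speculative": 250.0, "investigated": 180.0},  # MWe
    {"speculative": 100.0, "investigated": None},   # no follow-up survey yet
    {"speculative": 150.0, "investigated": 210.0},
]

def area_potential(a):
    if a["investigated"] is None:
        return a["speculative"]
    return max(a["speculative"], a["investigated"])

national_total = sum(area_potential(a) for a in areas)
print(f"national potential: {national_total:.0f} MWe")  # 250 + 100 + 210 = 560 MWe
```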
Computer program for updating timber resource statistics by county, with tables for Mississippi
Roy C. Beltz; Joe F. Christopher
1970-01-01
A computer program is available for updating Forest Survey estimates of timber growth, cut, and inventory volume by species group, for sawtimber and growing stock. Changes in rate of product removal are estimated from changes in State severance tax data. Updated tables are given for Mississippi.
Updated folate data in the Dutch Food Composition Database and implications for intake estimates
Westenbrink, Susanne; Jansen-van der Vliet, Martine; van Rossum, Caroline
2012-01-01
Background and objective Nutrient values are influenced by the analytical method used. Food folate measured by high performance liquid chromatography (HPLC) or by microbiological assay (MA) yields different results, with MA in general giving higher results than HPLC. This raises the question of how to deal with different analytical methods when compiling standardised and internationally comparable food composition databases. A recent inventory on folate in European food composition databases indicated that currently MA is more widely used than HPLC. Since older Dutch values were produced by HPLC and newer values by MA, analytical methods and procedures for compiling folate data in the Dutch Food Composition Database (NEVO) were reconsidered and folate values were updated. This article describes the impact of this revision of folate values in the NEVO database as well as the expected impact on the folate intake assessment in the Dutch National Food Consumption Survey (DNFCS). Design The folate values were revised by replacing HPLC with MA values from recent Dutch analyses. Previously, MA folate values taken from foreign food composition tables had been recalculated to the HPLC level, assuming a 27% lower value from HPLC analyses. These recalculated values were replaced by the original MA values. Dutch HPLC and MA values were compared to each other. Folate intake was assessed for a subgroup within the DNFCS to estimate the impact of the update. Results In the updated NEVO database nearly all folate values were produced by MA or derived from MA values, which resulted in an average increase of 24%. The median habitual folate intake in young children was increased by 11–15% using the updated folate values. Conclusion The current approach for folate in NEVO resulted in more transparency in data production and documentation and higher comparability among European databases. Results of food consumption surveys are expected to show higher folate intakes when using the updated values. PMID:22481900
S.N. Oswalt
2017-01-01
This resource update provides an overview of forest resources in Louisiana based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The estimates presented in this update are for the...
Thomas Brandeis; Andy Hartsell; KaDonna Randolph; Sonja Oswalt; Consuelo Brandeis
2016-01-01
This resource update provides an overview of forest resources in Kentucky based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The estimates presented in this update are...
Petkewich, Matthew D.; Conrads, Paul
2013-01-01
The Everglades Depth Estimation Network is an integrated network of real-time water-level gaging stations, a ground-elevation model, and a water-surface elevation model designed to provide scientists, engineers, and water-resource managers with water-level and water-depth information (1991-2013) for the entire freshwater portion of the Greater Everglades. The U.S. Geological Survey Greater Everglades Priority Ecosystems Science provides support for the Everglades Depth Estimation Network in order for the Network to provide quality-assured monitoring data for the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan. In a previous study, water-level estimation equations were developed to fill in missing data to increase the accuracy of the daily water-surface elevation model. During this study, those equations were updated because of the addition and removal of water-level gaging stations, the consistent use of water-level data relative to the North American Vertical Datum of 1988, and availability of recent data (March 1, 2006, to September 30, 2011). Up to three linear regression equations were developed for each station by using three different input stations to minimize the occurrences of missing data for an input station. Of the 667 water-level estimation equations developed to fill missing data at 223 stations, more than 72 percent of the equations have coefficients of determination greater than 0.90, and 97 percent have coefficients of determination greater than 0.70.
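To make the gap-filling scheme concrete, the sketch below fits up to three simple linear regressions for one target station against three input stations and reports their coefficients of determination, then uses the first available input to fill a missing day; the data, station names, and preference order are synthetic assumptions based on the description above, not the study's equations.

```python
# Illustrative backup-regression gap filling for one water-level station (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
target = rng.normal(2.0, 0.3, 365)                           # daily water levels, NAVD 88 (m)
inputs = {f"G{k}": target + rng.normal(0, 0.05 * k, 365) for k in (1, 2, 3)}

models = {}
for name, x in inputs.items():
    slope, intercept = np.polyfit(x, target, 1)
    pred = slope * x + intercept
    r2 = 1 - np.sum((target - pred) ** 2) / np.sum((target - target.mean()) ** 2)
    models[name] = (slope, intercept, r2)

for name, (b, a, r2) in models.items():
    print(f"{name}: level = {b:.3f}*input + {a:.3f}, R^2 = {r2:.3f}")

def fill_missing(day_values):
    """day_values: dict of available input-station readings for one missing day."""
    for name in ("G1", "G2", "G3"):                          # preference order (assumed)
        if name in day_values:
            b, a, _ = models[name]
            return b * day_values[name] + a
    return None

print("filled value:", fill_missing({"G2": 2.1}))
```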
Hyperhidrosis: an update on prevalence and severity in the United States.
Doolittle, James; Walker, Patricia; Mills, Thomas; Thurston, Jane
2016-12-01
Current published estimates of the prevalence of hyperhidrosis in the United States are outdated and underestimate the true prevalence of the condition. The objectives of this study are to provide an updated estimate of the prevalence of hyperhidrosis in the US population and to further assess the severity and impact of sweating on those affected by the condition. For the purposes of obtaining prevalence, a nationally representative sample of 8160 individuals was selected using an online panel, and information as to whether or not they experience hyperhidrosis was obtained. The 393 individuals (210 female, 244 non-Hispanic white, 27 black, mean age 40.3, SE 0.64) who indicated that they have hyperhidrosis were asked further questions, including body areas impacted, severity of symptoms, age of onset, and socioemotional impact of the condition. Current results estimate the prevalence of hyperhidrosis at 4.8%, which represents approximately 15.3 million people in the United States. Of these, 70% report severe excessive sweating in at least one body area. In spite of this, only 51% have discussed their excessive sweating with a healthcare professional. The main reasons are a belief that hyperhidrosis is not a medical condition and that no treatment options exist. The current study's findings with regard to age of onset and prevalence by body area generally align with the previous research. However, current findings suggest that the severity and prevalence are both higher than previously thought, indicating a need for greater awareness of the condition and its associated treatment options among medical professionals.
Complaint Regarding the Use of Audit Results on a $1 Billion Missile Defense Agency Contract
2014-09-12
None of the applicable FAR/DFARS clauses in the contract required withholds, nor did the PCO determine such was in the best interest of the...which PCO defaulted back to the previous report, 1201-2006L24010001, dated June 27, 2007 which states the Estimating System and internal control...Leonard, DCAA requested an extension to March 15, 2010. The PCO acknowledged and thanked DCAA for the update. While MDA acknowledged the request
Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik
2005-01-01
This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: The life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years, and at other sites. This study employed a new benefit estimating methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. This methodology uses an air traffic simulation approach to reasonably predict the impacts from the implementation of EDP. The results of the costs and benefits analyses were then integrated into a life-cycle cost/benefit assessment.
Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.
2013-01-01
Cheney Reservoir, located in south-central Kansas, is one of the primary water supplies for the city of Wichita, Kansas. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station in Cheney Reservoir since 2001; continuously measured physicochemical properties include specific conductance, pH, water temperature, dissolved oxygen, turbidity, fluorescence (wavelength range 650 to 700 nanometers; estimate of total chlorophyll), and reservoir elevation. Discrete water-quality samples were collected during 2001 through 2009 and analyzed for sediment, nutrients, taste-and-odor compounds, cyanotoxins, phytoplankton community composition, actinomycetes bacteria, and other water-quality measures. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physicochemical properties to compute concentrations of constituents that are not easily measured in real time. The water-quality information in this report is important to the city of Wichita because it allows quantification and characterization of potential constituents of concern in Cheney Reservoir. This report updates linear regression models published in 2006 that were based on data collected during 2001 through 2003. The update uses discrete and continuous data collected during May 2001 through December 2009. Updated models to compute dissolved solids, sodium, chloride, and suspended solids were similar to previously published models. However, several other updated models changed substantially from previously published models. In addition to updating relations that were previously developed, models also were developed for four new constituents, including magnesium, dissolved phosphorus, actinomycetes bacteria, and the cyanotoxin microcystin. In addition, a conversion factor of 0.74 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the Cheney Reservoir site. Because a high percentage of geosmin and microcystin data were below analytical detection thresholds (censored data), multiple logistic regression was used to develop models that best explained the probability of geosmin and microcystin concentrations exceeding relevant thresholds. The geosmin and microcystin models are particularly important because geosmin is a taste-and-odor compound and microcystin is a cyanotoxin.
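The censored-data approach described in the last sentences can be illustrated as follows. The sketch assumes a logistic regression of exceedance (concentration above a relevant threshold) on two continuously measured properties, and shows the reported 0.74 turbidity-sensor factor applied as a simple scaling; the predictors, coefficients, and data are placeholders, not the report's models.

```python
# Sketch: model the probability that a mostly-censored constituent exceeds a threshold.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([
    rng.normal(20, 5, n),      # water temperature, deg C (placeholder predictor)
    rng.normal(10, 4, n),      # chlorophyll fluorescence proxy (placeholder predictor)
])
true_logit = -8 + 0.15 * X[:, 0] + 0.35 * X[:, 1]
exceeds = rng.random(n) < 1 / (1 + np.exp(-true_logit))   # 1 if concentration > threshold

model = LogisticRegression().fit(X, exceeds)
p_new = model.predict_proba([[24.0, 14.0]])[0, 1]
print(f"estimated exceedance probability: {p_new:.2f}")

# Sensor continuity: the cited 0.74 factor relating the older and newer turbidity sensors,
# applied as a simple scaling of the older record (hypothetical values).
turbidity_model_6026 = np.array([12.0, 30.5, 8.2])
turbidity_model_6136_equiv = 0.74 * turbidity_model_6026
```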
Qin, Fangjun; Chang, Lubin; Jiang, Sai; Zha, Feng
2018-05-03
In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms.
A Sequential Multiplicative Extended Kalman Filter for Attitude Estimation Using Vector Observations
Qin, Fangjun; Jiang, Sai; Zha, Feng
2018-01-01
In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms. PMID:29751538
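A simplified reading of the sequential update described above is sketched below: the attitude is corrected after each vector observation so that the next observation is linearized about a better estimate, while the covariance update is deferred until all observations have been processed. This is an illustrative sketch with a generic small-angle measurement model and a deferred covariance step, not the authors' exact SMEKF equations.

```python
# Illustrative sequential attitude update with deferred covariance (not the paper's equations).
import numpy as np

def skew(v):
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def quat_mult(q, p):
    """Scalar-last quaternion product with the convention A(q*p) = A(q)A(p)."""
    qv, qs = q[:3], q[3]
    pv, ps = p[:3], p[3]
    return np.concatenate([qs*pv + ps*qv - np.cross(qv, pv), [qs*ps - qv @ pv]])

def attitude_matrix(q):
    qv, qs = q[:3], q[3]
    return (qs**2 - qv @ qv) * np.eye(3) + 2*np.outer(qv, qv) - 2*qs*skew(qv)

def sequential_update(q, P, obs_pairs, sigma=1e-3):
    """obs_pairs: list of (b_meas, r_ref) unit vectors in body and reference frames."""
    Hs, gains = [], []
    for b, r in obs_pairs:
        A = attitude_matrix(q)
        H = skew(A @ r)                        # sensitivity of A(q)r to small attitude errors
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + sigma**2 * np.eye(3))
        dtheta = K @ (b - A @ r)               # small-angle attitude correction
        dq = np.concatenate([0.5 * dtheta, [1.0]])
        q = quat_mult(dq, q)
        q /= np.linalg.norm(q)
        Hs.append(H); gains.append(K)
    for H, K in zip(Hs, gains):                # covariance deferred until all obs processed
        P = (np.eye(3) - K @ H) @ P            # (illustrative; gains computed from prior P)
    return q, P

# toy usage: identity attitude prior and two made-up direction pairs
q0 = np.array([0.0, 0.0, 0.0, 1.0])
P0 = 1e-2 * np.eye(3)
obs = [(np.array([0.0, 0.01, 1.0]) / np.linalg.norm([0.0, 0.01, 1.0]), np.array([0.0, 0.0, 1.0])),
       (np.array([1.0, 0.0, 0.01]) / np.linalg.norm([1.0, 0.0, 0.01]), np.array([1.0, 0.0, 0.0]))]
q1, P1 = sequential_update(q0, P0, obs)
print("updated quaternion:", q1)
```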
Composite Intelligent Learning Control of Strict-Feedback Systems With Disturbance.
Xu, Bin; Sun, Fuchun
2018-02-01
This paper addresses the dynamic surface control of uncertain nonlinear systems on the basis of composite intelligent learning and a disturbance observer in the presence of unknown system nonlinearity and time-varying disturbance. The serial-parallel estimation model with intelligent approximation and disturbance estimation is built to obtain the prediction error, and in this way the composite law for weights updating is constructed. The nonlinear disturbance observer is developed using intelligent approximation information, while the disturbance estimation is guaranteed to converge to a bounded compact set. The highlight is that, different from previous work aimed directly at asymptotic stability, the transparency of the intelligent approximation and disturbance estimation is included in the control scheme. The uniformly ultimate boundedness stability is analyzed via the Lyapunov method. Through simulation verification, the composite intelligent learning with disturbance observer can efficiently estimate the effect caused by system nonlinearity and disturbance, while the proposed approach obtains better performance with higher accuracy.
NASA Astrophysics Data System (ADS)
Wang, Rong; Tao, Shu; Balkanski, Yves; Ciais, Philippe
2013-04-01
Black carbon (BC) is an air pollutant of particular concern in terms of air quality and climate change. Black carbon emissions are often estimated based on fuel data and emission factors. However, large variations in emission factors reported in the literature have led to a high uncertainty in previous inventories. Here, we develop a new global 0.1°×0.1° BC emission inventory for 2007 with full uncertainty analysis based on updated source and emission factor databases. Two versions of the LMDz-OR-INCA model, named INCA and INCA-zA, are run to evaluate the new emission inventory. INCA is built on a regular grid system with a resolution of 1.27° in latitude and 2.50° in longitude, while INCA-zA is zoomed to 0.51°×0.66° (latitude×longitude) over Asia. By checking against field observations, we compare our inventory with ACCMIP, which is used by IPCC in the 5th assessment report, and also evaluate the influence of model resolutions. With the newly calculated BC air concentrations and the nested model, we estimate the direct radiative forcing of BC and the premature deaths and mortality rate induced by BC exposure, with an emphasis on Asia. Global BC direct radiative forcing at TOA is estimated to be 0.41 W/m2 (0.2-0.8 interquartile range), which is 17% higher than that derived from the inventory adopted by IPCC-AR5 (0.34 W/m2). The estimated premature deaths induced by inhalation exposure to anthropogenic BC (0.36 million in 2007) and the percentage of high-risk population are higher than those previously estimated. Ninety percent of the global total anthropogenic premature deaths occur in Asia, with 0.18 and 0.08 million deaths in China and India, respectively.
Aylward, L L; Hays, S M
2015-12-01
Urinary biomonitoring data for 2,4-dichlorophenoxyacetic acid (2,4-D) reflect aggregate population exposures to trace 2,4-D residues in diet and the environment. These data can be interpreted in the context of current risk assessments by comparison to a Biomonitoring Equivalent (BE), which is an estimate of the average biomarker concentration consistent with an exposure guidance value such as the US EPA Reference Dose (RfD). BE values are updated here from previous published BE values to reflect a change in the US EPA RfD. The US EPA RfD has been updated to reflect a revised point of departure (POD) based on new information from additional toxicological studies and updated assessment of applicable uncertainty factors. In addition, new biomonitoring data from both the US National Health and Nutrition Examination Survey (NHANES) and the Canadian Health Measures Survey (CHMS) have been published. The updated US EPA chronic RfD of 0.21 mg/kg-d results in updated BE values of 10,500 and 7000 μg/L for adults and children, respectively. Comparison of the current population-representative data to these BE values shows that upper bound population biomarker concentrations are more than 5000-fold below BE values corresponding to the updated US EPA RfD. This biomonitoring-based risk assessment supports the conclusion that current use patterns in the US and Canada result in incidental exposures in the general population that can be considered negligible in the context of the current 2,4-D risk assessment. Copyright © 2015 Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... Blended Payment a. Update to the Drug Add-on to the Composite Rate Portion of the ESRD Blended Payment Rate i. Estimating Growth in Expenditures for Drugs and Biologicals in CY 2013 ii. Estimating Per Patient Growth iii. Applying the Growth Update to the Drug Add-On Adjustment iv. Update to the Drug Add-On...
Annual update of data for estimating ESALs.
DOT National Transportation Integrated Search
2006-10-01
A revised procedure for estimating equivalent single axleloads (ESALs) was developed in 1985. This procedure used weight, classification, and traffic volume data collected by the Transportation Cabinet's Division of Planning. : Annual updates of data...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Ho-Ling; Davis, Stacy Cagle
2009-12-01
This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate off-highway gasoline consumption and public sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) of the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first model update was conducted during 2002-2003) after they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components in the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent possible on the overall totals, to the current FHWA estimates. Because the NONROAD2005 model was designed for emission estimation purposes (i.e., not for measuring fuel consumption), it covers different equipment populations from those the FHWA models were based on. Thus, a direct comparison generally was not possible in most sectors. As a result, NONROAD2005 data were not used in the 2008 update of the FHWA off-highway models. The quality of fuel use estimates directly affects the data quality in many tables published in Highway Statistics. Although updates have been made to the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model, some challenges remain due to aging model equations and discontinuation of data sources.
Neal, Joseph M; Barrington, Michael J; Fettiplace, Michael R; Gitman, Marina; Memtsoudis, Stavros G; Mörwald, Eva E; Rubin, Daniel S; Weinberg, Guy
2018-02-01
The American Society of Regional Anesthesia and Pain Medicine's Third Practice Advisory on local anesthetic systemic toxicity is an interim update from its 2010 advisory. The advisory focuses on new information regarding the mechanisms of lipid resuscitation, updated frequency estimates, the preventative role of ultrasound guidance, changes to case presentation patterns, and limited information related to local infiltration anesthesia and liposomal bupivacaine. In addition to emerging information, the advisory updates recommendations pertaining to prevention, recognition, and treatment of local anesthetic systemic toxicity. WHAT'S NEW IN THIS UPDATE?: This interim update summarizes recent scientific findings that have enhanced our understanding of the mechanisms that lead to lipid emulsion reversal of LAST, including rapid partitioning, direct inotropy, and post-conditioning. Since the previous practice advisory, epidemiological data have emerged that suggest a lower frequency of LAST as reported by single institutions and some registries; nevertheless, a considerable number of events still occur within the general community. Contemporary case reports suggest a trend toward delayed presentation, which may mirror the increased use of ultrasound guidance (fewer intravascular injections), local infiltration techniques (slower systemic uptake), and continuous local anesthetic infusions. Small patient size and sarcopenia are additional factors that increase potential risk for LAST. An increasing number of reported events occur outside of the traditional hospital setting and involve non-anesthesiologists.
TSARINA: A Computer Model for Assessing Conventional and Chemical Attacks on Airbases
1990-09-01
IV, and has been updated to FORTRAN 77; it has been adapted to various computer systems, as was the widely used AIDA model and the previous versions of...conventional and chemical attacks on sortie generation. In the first version of TSARINA [1 2], several key additions were made to the AIDA model so that (1...various on-base resources, in addition to the estimates of hits and facility damage that are generated by the original AIDA model . The second version
GLEAM v3: updated land evaporation and root-zone soil moisture datasets
NASA Astrophysics Data System (ADS)
Martens, Brecht; Miralles, Diego; Lievens, Hans; van der Schalie, Robin; de Jeu, Richard; Fernández-Prieto, Diego; Verhoest, Niko
2016-04-01
Evaporation determines the availability of surface water resources and the requirements for irrigation. In addition, through its impacts on the water, carbon and energy budgets, evaporation influences the occurrence of rainfall and the dynamics of air temperature. Therefore, reliable estimates of this flux at regional to global scales are of major importance for water management and meteorological forecasting of extreme events. However, the global-scale magnitude and variability of the flux, and the sensitivity of the underlying physical process to changes in environmental factors, are still poorly understood due to the limited global coverage of in situ measurements. Remote sensing techniques can help to overcome the lack of ground data. However, evaporation is not directly observable from satellite systems. As a result, recent efforts have focussed on combining the observable drivers of evaporation within process-based models. The Global Land Evaporation Amsterdam Model (GLEAM, www.gleam.eu) estimates terrestrial evaporation based on daily satellite observations of meteorological drivers of terrestrial evaporation, vegetation characteristics and soil moisture. Since the publication of the first version of the model in 2011, GLEAM has been widely applied for the study of trends in the water cycle, interactions between land and atmosphere and hydrometeorological extreme events. A third version of the GLEAM global datasets will be available from the beginning of 2016 and will be distributed using www.gleam.eu as gateway. The updated datasets include separate estimates for the different components of the evaporative flux (i.e. transpiration, bare-soil evaporation, interception loss, open-water evaporation and snow sublimation), as well as variables like the evaporative stress, potential evaporation, root-zone soil moisture and surface soil moisture. A new dataset using SMOS-based input data of surface soil moisture and vegetation optical depth will also be distributed. The most important updates in GLEAM include the revision of the soil moisture data assimilation system, the evaporative stress functions and the infiltration of rainfall. In this presentation, we will highlight the changes of the methodology and present the new datasets, their validation against in situ observations and the comparisons against alternative datasets of terrestrial evaporation, such as GLDAS-Noah, ERA-Interim and previous GLEAM datasets. Preliminary results indicate that the magnitude and the spatio-temporal variability of the evaporation estimates have been slightly improved upon previous versions of the datasets.
Annual update of data for estimating ESALs : draft.
DOT National Transportation Integrated Search
2008-10-01
A revised procedure for estimating equivalent single axleloads (ESALs) was developed in 1985. This procedure used weight, classification, and traffic volume data collected by the Transportation Cabinet's Division of Planning. : Annual updates of data...
Stupp, Paul; Okoroh, Ekwutosi; Besera, Ghenet; Goodman, David; Danel, Isabella
2016-01-01
Objectives In 1996, the U.S. Congress passed legislation making female genital mutilation/cutting (FGM/C) illegal in the United States. CDC published the first estimates of the number of women and girls at risk for FGM/C in 1997. Since 2012, various constituencies have again raised concerns about the practice in the United States. We updated an earlier estimate of the number of women and girls in the United States who were at risk for FGM/C or its consequences. Methods We estimated the number of women and girls who were at risk for undergoing FGM/C or its consequences in 2012 by applying country-specific prevalence of FGM/C to the estimated number of women and girls living in the United States who were born in that country or who lived with a parent born in that country. Results Approximately 513,000 women and girls in the United States were at risk for FGM/C or its consequences in 2012, which was more than three times higher than the earlier estimate, based on 1990 data. The increase in the number of women and girls younger than 18 years of age at risk for FGM/C was more than four times that of previous estimates. Conclusion The estimated increase was wholly a result of rapid growth in the number of immigrants from FGM/C-practicing countries living in the United States and not from increases in FGM/C prevalence in those countries. Scientifically valid information regarding whether women or their daughters have actually undergone FGM/C and related information that can contribute to efforts to prevent the practice in the United States and provide needed health services to women who have undergone FGM/C are needed. PMID:26957669
Goldberg, Howard; Stupp, Paul; Okoroh, Ekwutosi; Besera, Ghenet; Goodman, David; Danel, Isabella
2016-01-01
In 1996, the U.S. Congress passed legislation making female genital mutilation/cutting (FGM/C) illegal in the United States. CDC published the first estimates of the number of women and girls at risk for FGM/C in 1997. Since 2012, various constituencies have again raised concerns about the practice in the United States. We updated an earlier estimate of the number of women and girls in the United States who were at risk for FGM/C or its consequences. We estimated the number of women and girls who were at risk for undergoing FGM/C or its consequences in 2012 by applying country-specific prevalence of FGM/C to the estimated number of women and girls living in the United States who were born in that country or who lived with a parent born in that country. Approximately 513,000 women and girls in the United States were at risk for FGM/C or its consequences in 2012, which was more than three times higher than the earlier estimate, based on 1990 data. The increase in the number of women and girls younger than 18 years of age at risk for FGM/C was more than four times that of previous estimates. The estimated increase was wholly a result of rapid growth in the number of immigrants from FGM/C-practicing countries living in the United States and not from increases in FGM/C prevalence in those countries. Scientifically valid information regarding whether women or their daughters have actually undergone FGM/C and related information that can contribute to efforts to prevent the practice in the United States and provide needed health services to women who have undergone FGM/C are needed.
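The estimation approach described above is essentially a prevalence-weighted population sum, as the short sketch below illustrates with placeholder country names, population counts, and prevalence values (not the study's inputs).

```python
# Sketch: multiply country-of-origin population counts by that country's FGM/C prevalence
# and sum; all values below are placeholders for illustration only.
populations = {"Country A": 120_000, "Country B": 75_000, "Country C": 40_000}
prevalence = {"Country A": 0.90, "Country B": 0.65, "Country C": 0.25}

at_risk = sum(populations[c] * prevalence[c] for c in populations)
print(f"estimated women and girls at risk: {at_risk:,.0f}")
```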
Indoor Spatial Updating With Impaired Vision
Legge, Gordon E.; Granquist, Christina; Baek, Yihwa; Gage, Rachel
2016-01-01
Purpose Spatial updating is the ability to keep track of position and orientation while moving through an environment. We asked how normally sighted and visually impaired subjects compare in spatial updating and in estimating room dimensions. Methods Groups of 32 normally sighted, 16 low-vision, and 16 blind subjects estimated the dimensions of six rectangular rooms. Updating was assessed by guiding the subjects along three-segment paths in the rooms. At the end of each path, they estimated the distance and direction to the starting location, and to a designated target. Spatial updating was tested in five conditions ranging from free viewing to full auditory and visual deprivation. Results The normally sighted and low-vision groups did not differ in their accuracy for judging room dimensions. Correlations between estimated size and physical size were high. Accuracy of low-vision performance was not correlated with acuity, contrast sensitivity, or field status. Accuracy was lower for the blind subjects. The three groups were very similar in spatial-updating performance, and exhibited only weak dependence on the nature of the viewing conditions. Conclusions People with a wide range of low-vision conditions are able to judge room dimensions as accurately as people with normal vision. Blind subjects have difficulty in judging the dimensions of quiet rooms, but some information is available from echolocation. Vision status has little impact on performance in simple spatial updating; proprioceptive and vestibular cues are sufficient. PMID:27978556
Indoor Spatial Updating With Impaired Vision.
Legge, Gordon E; Granquist, Christina; Baek, Yihwa; Gage, Rachel
2016-12-01
Spatial updating is the ability to keep track of position and orientation while moving through an environment. We asked how normally sighted and visually impaired subjects compare in spatial updating and in estimating room dimensions. Groups of 32 normally sighted, 16 low-vision, and 16 blind subjects estimated the dimensions of six rectangular rooms. Updating was assessed by guiding the subjects along three-segment paths in the rooms. At the end of each path, they estimated the distance and direction to the starting location, and to a designated target. Spatial updating was tested in five conditions ranging from free viewing to full auditory and visual deprivation. The normally sighted and low-vision groups did not differ in their accuracy for judging room dimensions. Correlations between estimated size and physical size were high. Accuracy of low-vision performance was not correlated with acuity, contrast sensitivity, or field status. Accuracy was lower for the blind subjects. The three groups were very similar in spatial-updating performance, and exhibited only weak dependence on the nature of the viewing conditions. People with a wide range of low-vision conditions are able to judge room dimensions as accurately as people with normal vision. Blind subjects have difficulty in judging the dimensions of quiet rooms, but some information is available from echolocation. Vision status has little impact on performance in simple spatial updating; proprioceptive and vestibular cues are sufficient.
Device SEE Susceptibility Update: 1996-1998
NASA Technical Reports Server (NTRS)
Coss, J. R.; Miyahira, T. F.; Swift, G. M.
1998-01-01
This eighth Compendium continues the previous work of Nichols et al. on single event effects (SEE) first published in 1985. Because the compendium has grown so voluminous, this update only presents data not published in previous compendia.
The current economic burden of illness of osteoporosis in Canada
Burke, N.; Von Keyserlingk, C.; Leslie, W. D.; Morin, S. N.; Adachi, J. D.; Papaioannou, A.; Bessette, L.; Brown, J. P.; Pericleous, L.; Tarride, J.
2016-01-01
Summary We estimate the current burden of illness of osteoporosis in Canada is double ($4.6 billion) our previous estimates ($2.3 billion) due to improved data capture of the multiple encounters and services that accompany a fracture: emergency room, admissions to acute and step-down non-acute institutions, rehabilitation, home-assisted or long-term residency support. Introduction We previously estimated the economic burden of illness of osteoporosis-attributable fractures in Canada for the year 2008 to be $2.3 billion in the base case and as much as $3.9 billion. The aim of this study is to update the estimate of the economic burden of illness for osteoporosis-attributable fractures for Canada based on newly available home care and long-term care (LTC) data. Methods Multiple national databases were used for the fiscal-year ending March 31, 2011 (FY 2010/2011) for acute institutional care, emergency visits, day surgery, secondary admissions for rehabilitation, and complex continuing care, as well as national dispensing data for osteoporosis medications. Gaps in national data were supplemented by provincial and community survey data. Osteoporosis-attributable fractures for Canadians age 50+ were identified by ICD-10-CA codes. Costs were expressed in 2014 dollars. Results In FY 2010/2011, the number of osteoporosis-attributable fractures was 131,443 resulting in 64,884 acute care admissions and 983,074 acute hospital days. Acute care costs were $1.5 billion, an 18 % increase since 2008. The cost of LTC was 33.4 times the previous estimate ($31 million versus $1.03 billion) because of improved data capture. The cost for rehabilitation and secondary admissions increased 3.4 fold, while drug costs decreased 19 %. The overall cost of osteoporosis was over $4.6 billion, an increase of 83 % from the 2008 estimate. Conclusion Since the 2008 estimate, new Canadian data on home care and LTC are available which provided a better estimate of the burden of osteoporosis in Canada. This suggests that our previous estimates were seriously underestimated. PMID:27166680
The current economic burden of illness of osteoporosis in Canada.
Hopkins, R B; Burke, N; Von Keyserlingk, C; Leslie, W D; Morin, S N; Adachi, J D; Papaioannou, A; Bessette, L; Brown, J P; Pericleous, L; Tarride, J
2016-10-01
We estimate the current burden of illness of osteoporosis in Canada is double ($4.6 billion) our previous estimates ($2.3 billion) due to improved data capture of the multiple encounters and services that accompany a fracture: emergency room, admissions to acute and step-down non-acute institutions, rehabilitation, home-assisted or long-term residency support. We previously estimated the economic burden of illness of osteoporosis-attributable fractures in Canada for the year 2008 to be $2.3 billion in the base case and as much as $3.9 billion. The aim of this study is to update the estimate of the economic burden of illness for osteoporosis-attributable fractures for Canada based on newly available home care and long-term care (LTC) data. Multiple national databases were used for the fiscal-year ending March 31, 2011 (FY 2010/2011) for acute institutional care, emergency visits, day surgery, secondary admissions for rehabilitation, and complex continuing care, as well as national dispensing data for osteoporosis medications. Gaps in national data were supplemented by provincial and community survey data. Osteoporosis-attributable fractures for Canadians age 50+ were identified by ICD-10-CA codes. Costs were expressed in 2014 dollars. In FY 2010/2011, the number of osteoporosis-attributable fractures was 131,443 resulting in 64,884 acute care admissions and 983,074 acute hospital days. Acute care costs were $1.5 billion, an 18 % increase since 2008. The cost of LTC was 33.4 times the previous estimate ($31 million versus $1.03 billion) because of improved data capture. The cost for rehabilitation and secondary admissions increased 3.4 fold, while drug costs decreased 19 %. The overall cost of osteoporosis was over $4.6 billion, an increase of 83 % from the 2008 estimate. Since the 2008 estimate, new Canadian data on home care and LTC are available which provided a better estimate of the burden of osteoporosis in Canada. This suggests that our previous estimates were seriously underestimated.
NASA Technical Reports Server (NTRS)
Dean, Bruce H. (Inventor)
2009-01-01
A method of recovering unknown aberrations in an optical system includes collecting intensity data produced by the optical system, generating an initial estimate of a phase of the optical system, iteratively performing a phase retrieval on the intensity data to generate a phase estimate using an initial diversity function corresponding to the intensity data, generating a phase map from the phase retrieval phase estimate, decomposing the phase map to generate a decomposition vector, generating an updated diversity function by combining the initial diversity function with the decomposition vector, generating an updated estimate of the phase of the optical system by removing the initial diversity function from the phase map. The method may further include repeating the process beginning with iteratively performing a phase retrieval on the intensity data using the updated estimate of the phase of the optical system in place of the initial estimate of the phase of the optical system, and using the updated diversity function in place of the initial diversity function, until a predetermined convergence is achieved.
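The claimed method is an outer loop around a phase-retrieval step. The schematic sketch below shows one way such a loop could be organized, with a user-supplied phase_retrieval solver and a generic modal basis standing in for the actual diversity functions and decomposition; it illustrates the control flow only, not a working phase-retrieval implementation, and all names are hypothetical.

```python
# Schematic outer loop: retrieve phase with the current diversity, decompose it into modes,
# fold the modes back into the diversity function, and repeat until the diversity converges.
import numpy as np

def decompose(phase_map, basis):
    """Least-squares modal coefficients of the phase map in a given basis
    (e.g., a Zernike-like basis); basis has shape (n_modes, ny, nx)."""
    B = basis.reshape(basis.shape[0], -1).T          # pixels x modes
    coeffs, *_ = np.linalg.lstsq(B, phase_map.ravel(), rcond=None)
    return coeffs

def recover_aberrations(intensity, basis, phase_retrieval, n_outer=10, tol=1e-8):
    """phase_retrieval(intensity, diversity) is a user-supplied iterative-transform solver;
    this function only organizes the diversity-update iteration around it."""
    diversity = np.zeros(intensity.shape)
    phase_estimate, coeffs = np.zeros(intensity.shape), np.zeros(basis.shape[0])
    for _ in range(n_outer):
        phase_map = phase_retrieval(intensity, diversity)
        coeffs = decompose(phase_map, basis)
        updated_diversity = diversity + np.tensordot(coeffs, basis, axes=1)
        phase_estimate = phase_map - diversity       # remove the known diversity part
        if np.linalg.norm(updated_diversity - diversity) < tol:
            break
        diversity = updated_diversity
    return phase_estimate, coeffs
```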
Improving uncertainty estimates: Inter-annual variability in Ireland
NASA Astrophysics Data System (ADS)
Pullinger, D.; Zhang, M.; Hill, N.; Crutchley, T.
2017-11-01
This paper addresses the uncertainty associated with inter-annual variability used within wind resource assessments for Ireland, in order to more accurately represent the uncertainties within wind resource and energy yield assessments. The study was undertaken using a total of 16 ground stations (Met Eireann) and corresponding reanalysis datasets, providing an update to previous work on this topic undertaken nearly 20 years ago. The results of the work demonstrate that the previously reported 5.4% wind speed inter-annual variability remains appropriate; guidance is given on how to provide a robust assessment of IAV using available sources of data, including ground stations, MERRA-2 and ERA-Interim.
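For context, an inter-annual variability figure of this kind is commonly derived as the coefficient of variation of annual mean wind speeds, with the uncertainty on a multi-year mean scaled by 1/sqrt(N). The sketch below illustrates that calculation on synthetic data; it is not the study's dataset or exact method.

```python
# Sketch: inter-annual variability (IAV) as the coefficient of variation of annual means.
import numpy as np

rng = np.random.default_rng(3)
annual_means = rng.normal(7.0, 0.38, 30)            # 30 years of annual mean wind speed, m/s

iav = annual_means.std(ddof=1) / annual_means.mean()
print(f"single-year IAV: {100 * iav:.1f}%")

for n_years in (1, 5, 10):
    # uncertainty on a mean wind speed derived from n_years of data
    print(f"uncertainty on a {n_years}-year mean: {100 * iav / np.sqrt(n_years):.1f}%")
```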
Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010.
Jacobs, David E; Nevin, Rick
2006-11-01
We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 μg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries.
Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, David E.; Nevin, Rick
2006-11-15
We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 μg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries.
Improving Estimation of Ground Casualty Risk From Reentering Space Objects
NASA Technical Reports Server (NTRS)
Ostrom, Chris L.
2017-01-01
A recent improvement to the long-term estimation of ground casualties from reentering space debris is the further refinement and update to the human population distribution. Previous human population distributions were based on global totals with simple scaling factors for future years, or a coarse grid of population counts in a subset of the world's countries, each cell having its own projected growth rate. The newest population model includes a 5-fold refinement in both latitude and longitude resolution. All areas along a single latitude are combined to form a global population distribution as a function of latitude, creating a more accurate population estimation based on non-uniform growth at the country and area levels. Previous risk probability calculations used simplifying assumptions that did not account for the ellipsoidal nature of the Earth. The new method uses first, a simple analytical method to estimate the amount of time spent above each latitude band for a debris object with a given orbit inclination and second, a more complex numerical method that incorporates the effects of a non-spherical Earth. These new results are compared with the prior models to assess the magnitude of the effects on reentry casualty risk.
Improving Estimation of Ground Casualty Risk from Reentering Space Objects
NASA Technical Reports Server (NTRS)
Ostrom, C.
2017-01-01
A recent improvement to the long-term estimation of ground casualties from reentering space debris is the further refinement and update to the human population distribution. Previous human population distributions were based on global totals with simple scaling factors for future years, or a coarse grid of population counts in a subset of the world's countries, each cell having its own projected growth rate. The newest population model includes a 5-fold refinement in both latitude and longitude resolution. All areas along a single latitude are combined to form a global population distribution as a function of latitude, creating a more accurate population estimation based on non-uniform growth at the country and area levels. Previous risk probability calculations used simplifying assumptions that did not account for the ellipsoidal nature of the earth. The new method uses first, a simple analytical method to estimate the amount of time spent above each latitude band for a debris object with a given orbit inclination, and second, a more complex numerical method that incorporates the effects of a non-spherical Earth. These new results are compared with the prior models to assess the magnitude of the effects on reentry casualty risk.
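The simple (spherical-Earth) part of the calculation described above can be approximated numerically by sampling the argument of latitude of a circular orbit, as in the sketch below; the ellipsoidal correction and the population weighting mentioned in the abstract are not shown, and the inclination and band widths are arbitrary examples.

```python
# Sketch: fraction of orbital time spent in each latitude band for a circular orbit,
# by sampling the argument of latitude uniformly (spherical-Earth approximation only).
import numpy as np

def latitude_residence_fraction(inclination_deg, band_edges_deg, n_samples=2_000_000):
    i = np.radians(inclination_deg)
    u = np.random.default_rng(4).uniform(0.0, 2*np.pi, n_samples)   # argument of latitude
    lat = np.degrees(np.arcsin(np.sin(i) * np.sin(u)))              # geocentric latitude
    counts, _ = np.histogram(lat, bins=band_edges_deg)
    return counts / n_samples

bands = np.arange(-60, 61, 10)                   # 10-degree latitude bands
frac = latitude_residence_fraction(51.6, bands)  # e.g., an ISS-like inclination
for lo, hi, f in zip(bands[:-1], bands[1:], frac):
    print(f"{lo:+3d} to {hi:+3d} deg: {100*f:5.2f}% of time")
```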
Simulation and analyses of the aeroassist flight experiment attitude update method
NASA Technical Reports Server (NTRS)
Carpenter, J. R.
1991-01-01
A method which will be used to update the alignment of the Aeroassist Flight Experiment's Inertial Measuring Unit is simulated and analyzed. This method, the Star Line Maneuver, uses measurements from the Space Shuttle Orbiter star trackers along with an extended Kalman filter to estimate a correction to the attitude quaternion maintained by an Inertial Measuring Unit in the Orbiter's payload bay. This quaternion is corrupted by on-orbit bending of the Orbiter payload bay with respect to the Orbiter navigation base, which is incorporated into the payload quaternion when it is initialized via a direct transfer of the Orbiter attitude state. The method of updating this quaternion is examined through verification of baseline cases and Monte Carlo analysis using a simplified simulation. The simulation uses nominal state dynamics and measurement models from the Kalman filter as its real-world models, and is programmed on a MicroVAX minicomputer using Matlab, an interactive matrix analysis tool. Results are presented which confirm and augment previous performance studies, thereby enhancing confidence in the Star Line Maneuver design methodology.
Rücker, Viktoria; Keil, Ulrich; Fitzgerald, Anthony P; Malzahn, Uwe; Prugger, Christof; Ertl, Georg; Heuschmann, Peter U; Neuhauser, Hannelore
2016-01-01
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology, including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1) and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, systolic blood pressure risk factor levels, sex and 5-year age-groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29%, and the estimated proportion of high-risk people (10-year risk ≥5%) was lower by 50%, compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk. PMID:27612145
Estimating the k2 Tidal Gravity Love Number of Mars
NASA Technical Reports Server (NTRS)
Smith, David E.; Zuber, Maria; Torrence, Mark; Dunn, Peter
2003-01-01
Analysis of the orbits of spacecraft can be used to infer global tidal parameters. For Mars, the Mars Global Surveyor (MGS) spacecraft has been used to estimate the second-degree Love number, k2, from DSN tracking Doppler and range data by several authors. Unfortunately, neither of the spacecraft presently in orbit is ideally suited to tidal recovery because they are in sun-synchronous orbits that vary only slightly in local time; further, the sub-solar location varies by only about 25 degrees in latitude. Nevertheless, respectable estimates of the k2 tide have been made by several authors. We present an updated solution of the degree-2 zonal Love number, compare it with previous values, and analyze the sensitivity of the solution to orbital parameters, spacecraft maneuvers, and solution methodology.
NASA Astrophysics Data System (ADS)
Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.
2018-03-01
Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
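The discretized Karhunen-Loève step described above can be sketched as follows, assuming an exponential covariance for the random field; the covariance parameters, beam discretization, and truncation order are illustrative, and in the actual method the KL coefficients would be the quantities estimated by sensitivity-based updating rather than sampled.

```python
import numpy as np

# Discrete KL expansion of a 1-D random field over a beam of length L,
# assuming an exponential covariance C(x1, x2) = sigma^2 * exp(-|x1 - x2| / lc).
L, n, sigma, lc = 1.0, 200, 0.1, 0.3
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / lc)

# Eigen-decomposition of the covariance matrix gives the discrete KL modes.
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Truncate to the m dominant modes; xi are the random coefficients that a
# sensitivity-based updating scheme would estimate from measured data.
m = 6
xi = np.random.default_rng(0).standard_normal(m)
mean_field = np.ones(n)              # nominal (homogeneous) parameter value
field = mean_field * (1.0 + eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi))

print(field[:5])
```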
Third COS FUV Lifetime Position: FUV Target Acquisition Parameter Update {LENA3}
NASA Astrophysics Data System (ADS)
Penton, Steven
2013-10-01
Verify the ability of the Cycle 22 COS FSW to place an isolated point source at the center of the PSA, using FUV dispersed light target acquisition (TA) from the object and all three FUV gratings at the Third Lifetime Position (LP3). This program is modeled from the activity summary of LENA3. This program should be executed after the LP3 HV, XD spectral positions, aperture mechanism position, and focus are determined and updated. In addition, initial estimates of the LIFETIME=ALTERNATE TA FSW parameters and subarrays should be updated prior to execution of this program. After Visit 01, the subarrays will be updated. After Visit 2, the FUV WCA-to-PSA offsets will be updated. Prior to Visit 6, LV56 will be installed and will include new values for the LP3 FUV plate scales. Visit 6 exposures use the default lifetime position (LP3). NUV imaging TAs have previously been used to determine the correct locations for FUV spectra. We follow the same procedure here. Note that the ETC runs here were made using ETC22.2 and are therefore valid for March 2014. Some TDS drop will likely have occurred before these visits execute, but we have plenty of counts to do what we need to do in this program.
Kalman filter to update forest cover estimates
Raymond L. Czaplewski
1990-01-01
The Kalman filter is a statistical estimator that combines a time-series of independent estimates, using a prediction model that describes expected changes in the state of a system over time. An expensive inventory can be updated using model predictions that are adjusted with more recent, but less expensive and precise, monitoring data. The concepts of the Kalman...
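The combination of an inventory-based prediction with cheaper, less precise monitoring data can be illustrated with a minimal scalar Kalman update; the numbers below are hypothetical, and the full method also involves a prediction model that propagates the state between inventories.

```python
def kalman_update(x_pred, var_pred, z, var_z):
    """Combine a model prediction with a newer, noisier measurement.
    Returns the updated estimate and its variance."""
    gain = var_pred / (var_pred + var_z)
    x_upd = x_pred + gain * (z - x_pred)
    var_upd = (1.0 - gain) * var_pred
    return x_upd, var_upd

# Illustrative numbers only: forest cover (thousand ha) projected from the last
# full inventory versus a recent low-cost monitoring estimate.
x_pred, var_pred = 520.0, 15.0**2
z, var_z = 545.0, 40.0**2
x_upd, var_upd = kalman_update(x_pred, var_pred, z, var_z)
print(round(x_upd, 1), round(var_upd**0.5, 1))
```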
Cost analysis of the treatment of severe acute malnutrition in West Africa.
Isanaka, Sheila; Menzies, Nicolas A; Sayyad, Jessica; Ayoola, Mudasiru; Grais, Rebecca F; Doyon, Stéphane
2017-10-01
We present an updated cost analysis to provide new estimates of the cost of providing community-based treatment for severe acute malnutrition, including expenditure shares for major cost categories. We calculated total and per child costs from a provider perspective. We categorized costs into three main activities (outpatient treatment, inpatient treatment, and management/administration) and four cost categories within each activity (personnel; therapeutic food; medical supplies; and infrastructure and logistical support). For each category, total costs were calculated by multiplying input quantities expended in the Médecins Sans Frontières nutrition program in Niger during a 12-month study period by 2015 input prices. All children received outpatient treatment, with 43% also receiving inpatient treatment. In this large, well-established program, the average cost per child treated was €148.86, with outpatient and inpatient treatment costs of €75.50 and €134.57 per child, respectively. Therapeutic food (44%, €32.98 per child) and personnel (35%, €26.70 per child) dominated outpatient costs, while personnel (56%, €75.47 per child) dominated in the cost of inpatient care. Sensitivity analyses suggested that lowering the prices of medical treatments and therapeutic food had limited effect on total costs per child, while increasing program size and decreasing use of expatriate staff support reduced total costs per child substantially. Updated estimates of severe acute malnutrition treatment cost are substantially lower than previously published values, and important cost savings may be possible with increases in coverage/program size and integration into national health programs. These updated estimates can be used to suggest approaches to improve efficiency and inform national-level resource allocation. © 2016 John Wiley & Sons Ltd.
Indoor Spatial Updating with Reduced Visual Information
Legge, Gordon E.; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M.
2016-01-01
Purpose: Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods: Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. Results: With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. Discussion: If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment. PMID:26943674
Indoor Spatial Updating with Reduced Visual Information.
Legge, Gordon E; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M
2016-01-01
Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment.
GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling
NASA Astrophysics Data System (ADS)
Howard, S.; Murray-Krezan, J.; Dao, P.; Surka, D.
From December 2009 through 2011 the NASA Wide-Field Infrared Survey Explorer (WISE) gathered radiometrically exquisite measurements of debris in near-Earth orbits, substantially augmenting the current catalog of known debris. The WISE GEO-belt debris population adds approximately 2,000 previously uncataloged objects. This paper describes characterization of the WISE GEO-belt orbital debris population in terms of location, epoch, and size. The WISE GEO-belt debris population characteristics are compared with the publicly available U.S. catalog and previous descriptions of the GEO-belt debris population. We found that our results differ from previously published debris distributions, suggesting the need for updates to collision probability models and a better measurement-based understanding of the debris population. Previous studies of collisional rate in GEO invoke the presence of a large number of debris in the regime of sizes too small to track, i.e., not in the catalog, but large enough to cause significant damage and fragmentation in a collision. A common approach is to estimate that population of small debris by assuming that it is dominated by fragments and therefore should follow trends observed in fragmentation events or laboratory fragmentation tests. In other words, the population of debris can be extrapolated from trackable sizes to small sizes using an empirically determined trend of population as a function of size. We use new information suggested by the analysis of WISE IR measurements to propose an updated relationship. Our trend is an improvement because we expect that an IR emissive signature is a more reliable indicator of physical size. Based on the revised relationship, we re-estimate the total collisional rate in the GEO belt with the inclusion of projected uncatalogued debris, applying a conjunction assessment technique. Through modeling, we evaluate the hot spots near the geopotential wells and the effects of fragmentation in the GEO graveyard on collisions with GEO objects.
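A minimal sketch of the extrapolation approach described above (fitting an empirical power-law trend of cumulative count versus size to trackable objects and extending it below the tracking threshold) is given below; the counts, sizes, and fitted exponent are hypothetical, not the WISE-derived relationship.

```python
import numpy as np

# Hypothetical cumulative counts N(>d) of cataloged GEO objects at trackable sizes d (m).
d_track = np.array([1.0, 1.5, 2.0, 3.0, 5.0])
n_track = np.array([900.0, 520.0, 350.0, 190.0, 80.0])

# Fit N(>d) = a * d**(-b) by linear regression in log-log space.
slope, log_a = np.polyfit(np.log(d_track), np.log(n_track), 1)
a, b = np.exp(log_a), -slope

# Extrapolate to sizes below the tracking threshold (e.g., 10 cm) to estimate the
# untracked-but-damaging population that feeds a collision-rate model.
d_small = 0.1
print(f"N(>{d_small} m) ~ {a * d_small**(-b):.0f} objects (slope b = {b:.2f})")
```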
SREB Teacher Salaries: Update for 1995-96 and Estimated Increases for 1997.
ERIC Educational Resources Information Center
Southern Regional Education Board, Atlanta, GA.
Updated information is provided on teacher salaries for 1995-96 and estimated increases for the 1996-97 school year in the 15 states belonging to the Southern Regional Education Board (SREB). Pay raises for 1996-97 are estimated as ranging from 1.75 percent in Virginia to 6 percent in Georgia. Especially noted are South Carolina's continuing…
The Propulsive Small Expendable Deployer System (ProSEDS)
NASA Technical Reports Server (NTRS)
Lorenzini, Enrico C.; Cosmo, Mario L.; Estes, Robert D.; Sanmartin, Juan; Pelaez, Jesus; Ruiz, Manuel
2003-01-01
This Final Report covers the following main topics: 1) Brief Description of ProSEDS; 2) Mission Analysis; 3) Dynamics Reference Mission; 4) Dynamics Stability; 5) Deployment Control; 6) Updated System Performance; 7) Updated Mission Analysis; 8) Updated Dynamics Reference Mission; 9) Updated Deployment Control Profiles and Simulations; 10) Updated Reference Mission; 11) Evaluation of Power Delivered by the Tether; 12) Deployment Control Profile Ref. #78 and Simulations; 13) Kalman Filters for Mission Estimation; 14) Analysis/Estimation of Deployment Flight Data; 15) Comparison of ED Tethers and Electrical Thrusters; 16) Dynamics Analysis for Mission Starting at a Lower Altitude; 17) Deployment Performance at a Lower Altitude; 18) Satellite Orbit after a Tether Cut; 19) Deployment with Shorter Dyneema Tether Length; 20) Interactive Software for ED Tethers.
NASA Astrophysics Data System (ADS)
Ahmed, F.; Teferle, F. N.; Bingley, R. M.
2012-04-01
Since September 2011 the University of Luxembourg, in collaboration with the University of Nottingham, has been setting up two near real-time processing systems for ground-based GNSS data for the provision of zenith total delay (ZTD) and integrated water vapour (IWV) estimates. Both systems are based on Bernese v5.0, use the double-differenced network processing strategy and operate with 1-hour (NRT1h) and 15-minute (NRT15m) update cycles. Furthermore, the systems follow the approach of the E-GVAP METO and IES2 systems in that the normal equations for the latest data are combined with those from the previous four updates during the estimation of the ZTDs. NRT1h currently takes the hourly data from over 130 GNSS stations in Europe whereas NRT15m primarily uses the real-time streams of EUREF-IP. Both networks include additional GNSS stations in Luxembourg, Belgium and France. The a priori station coordinates for all of these stem from a moving average computed over the last 20 to 50 days and are based on the precise point positioning processing strategy. In this study we present the first ZTD and IWV estimates obtained from the NRT1h and NRT15m systems in development at the University of Luxembourg. In a preliminary evaluation we compare their performance to the IES2 system at the University of Nottingham and find the IWV estimates to agree at the sub-millimetre level.
Prevalence of Individuals Experiencing the Effects of Stroke in Canada: Trends and Projections.
Krueger, Hans; Koot, Jacqueline; Hall, Ruth E; O'Callaghan, Christina; Bayley, Mark; Corbett, Dale
2015-08-01
Previous estimates of the number and prevalence of individuals experiencing the effects of stroke in Canada are out of date and exclude critical population groups. It is essential to have complete data that report on stroke disability for monitoring and planning purposes. The objective was to provide an updated estimate of the number of individuals experiencing the effects of stroke in Canada (and its regions), trending since 2000 and forecasted prevalence to 2038. The prevalence, trends, and projected number of individuals experiencing the effects of stroke were estimated using region-specific survey data and adjusted to account for children aged <12 years and individuals living in homes for the aged. In 2013, we estimate that there were 405 000 individuals experiencing the effects of stroke in Canada, yielding a prevalence of 1.15%. This value is expected to increase to between 654 000 and 726 000 by 2038. Trends in stroke data between 2000 and 2012 suggest a nonsignificant decrease in stroke prevalence, but a substantial and rising increase in the number of individuals experiencing the effects of stroke. Stroke prevalence varied considerably between regions. Previous estimates of stroke prevalence have underestimated the true number of individuals experiencing the effects of stroke in Canada. Furthermore, the projected increases that will result from population growth and demographic changes highlight the importance of maintaining up-to-date estimates. © 2015 American Heart Association, Inc.
Modelling the spatial distribution of ammonia emissions in the UK.
Hellsten, S; Dragosits, U; Place, C J; Vieno, M; Dore, A J; Misselbrook, T H; Tang, Y S; Sutton, M A
2008-08-01
Ammonia emissions (NH3) are characterised by a high spatial variability at a local scale. When modelling the spatial distribution of NH3 emissions, it is important to provide robust emission estimates, since the model output is used to assess potential environmental impacts, e.g. exceedance of critical loads. The aim of this study was to provide a new, updated spatial NH3 emission inventory for the UK for the year 2000, based on an improved modelling approach and the use of updated input datasets. The AENEID model distributes NH3 emissions from a range of agricultural activities, such as grazing and housing of livestock, storage and spreading of manures, and fertilizer application, at a 1-km grid resolution over the most suitable landcover types. The results of the emission calculation for the year 2000 are analysed and the methodology is compared with a previous spatial emission inventory for 1996.
The Extended-Image Tracking Technique Based on the Maximum Likelihood Estimation
NASA Technical Reports Server (NTRS)
Tsou, Haiping; Yan, Tsun-Yee
2000-01-01
This paper describes an extended-image tracking technique based on the maximum likelihood estimation. The target image is assumed to have a known profile covering more than one element of a focal plane detector array. It is assumed that the relative position between the imager and the target is changing with time and that the received target image has each of its pixels disturbed by an independent additive white Gaussian noise. When a rotation-invariant movement between imager and target is considered, the maximum likelihood based image tracking technique described in this paper is a closed-loop structure capable of providing iterative updates of the movement estimate by calculating the loop feedback signals from a weighted correlation between the currently received target image and the previously estimated reference image in the transform domain. The movement estimate is then used to direct the imager to closely follow the moving target. This image tracking technique has many potential applications, including free-space optical communications and astronomy where accurate and stabilized optical pointing is essential.
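A minimal sketch of one tracking update in this spirit is shown below, estimating a pure translation from the peak of the transform-domain cross-correlation between the received noisy image and the reference; it omits the paper's rotation-invariant formulation, likelihood weighting, and closed-loop feedback, and all sizes and noise levels are illustrative.

```python
import numpy as np

def estimate_shift(received, reference):
    """Estimate the (row, col) shift of `received` relative to `reference`
    from the peak of their circular cross-correlation (FFT domain)."""
    xcorr = np.fft.ifft2(np.fft.fft2(received) * np.conj(np.fft.fft2(reference))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Wrap indices to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

rng = np.random.default_rng(1)
reference = np.zeros((64, 64))
reference[28:36, 28:36] = 1.0                           # known target profile
received = np.roll(reference, (3, -5), axis=(0, 1))     # target moved by (3, -5)
received += 0.1 * rng.standard_normal(received.shape)   # additive white Gaussian noise

print(estimate_shift(received, reference))  # expected: (3, -5)
```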
Annualized earthquake loss estimates for California and their sensitivity to site amplification
Chen, Rui; Jaiswal, Kishor; Bausch, D; Seligson, H; Wills, C.J.
2016-01-01
Input datasets for annualized earthquake loss (AEL) estimation for California were updated recently by the scientific community, and include the National Seismic Hazard Model (NSHM), site‐response model, and estimates of shear‐wave velocity. Additionally, the Federal Emergency Management Agency’s loss estimation tool, Hazus, was updated to include the most recent census and economic exposure data. These enhancements necessitated a revisit to our previous AEL estimates and a study of the sensitivity of AEL estimates subjected to alternate inputs for site amplification. The NSHM ground motions for a uniform site condition are modified to account for the effect of local near‐surface geology. The site conditions are approximated in three ways: (1) by VS30 (time‐averaged shear‐wave velocity in the upper 30 m) value obtained from a geology‐ and topography‐based map consisting of 15 VS30 groups, (2) by site classes categorized according to National Earthquake Hazards Reduction Program (NEHRP) site classification, and (3) by a uniform NEHRP site class D. In case 1, ground motions are amplified using the Seyhan and Stewart (2014) semiempirical nonlinear amplification model. In cases 2 and 3, ground motions are amplified using the 2014 version of the NEHRP site amplification factors, which are also based on the Seyhan and Stewart model but are approximated to facilitate their use for building code applications. Estimated AELs are presented at multiple resolutions, starting with the state level assessment and followed by detailed assessments for counties, metropolitan statistical areas (MSAs), and cities. AEL estimate at the state level is ∼$3.7 billion, 70% of which is contributed from Los Angeles–Long Beach–Santa Ana, San Francisco–Oakland–Fremont, and Riverside–San Bernardino–Ontario MSAs. The statewide AEL estimate is insensitive to alternate assumptions of site amplification. However, we note significant differences in AEL estimates among the three sensitivity cases for smaller geographic units.
Maurer, Douglas K.; Watkins, Sharon A.; Burrowws, Robert L.
2004-01-01
Rapid population growth in Carson Valley has caused concern over the continued availability of water resources to sustain future growth. The U.S. Geological Survey, in cooperation with Douglas County, began a study to update estimates of water-budget components in Carson Valley for current climatic conditions. Data collected at 19 sites included 9 continuous records of tributary streamflows, 1 continuous record of outflow from the valley, and 408 measurements of 10 perennially flowing but ungaged drainages. These data were compiled and analyzed to provide updated computations and estimates of streamflows tributary to Carson Valley, 1990-2002. Mean monthly and annual flows were computed from continuous records for the period 1990-2002 for five streams, and for the period available, 1990-97, for four streams. Daily mean flow from ungaged drainages was estimated using multi-variate regressions of individual discharge measurements against measured flow at selected continuous gages. From the estimated daily mean flows, monthly and annual mean flows were calculated from 1990 to 2002. These values were used to compute estimates of mean monthly and annual flows for the ungaged perennial drainages. Using the computed and estimated mean annual flows, annual unit-area runoff was computed for the perennial drainages, which ranged from 0.30 to 2.02 feet. For the period 1990-2002, estimated inflow of perennial streams tributary to Carson Valley totaled about 25,900 acre-feet per year. Inflow computed from gaged perennial drainages totaled 10,300 acre-feet per year, and estimated inflow from ungaged perennial drainages totaled 15,600 acre-feet per year. The annual flow of perennial streams ranges from 4,210 acre-feet at Clear Creek to 450 acre-feet at Stutler Canyon Creek. Differences in unit-area runoff and in the seasonal timing of flow likely are caused by differences in geologic setting, altitude, slope, or aspect of the individual drainages. The remaining drainages are ephemeral and supply inflow to the valley floor only during spring runoff in wet years or during large precipitation events. Annual unit-area runoff for the perennial drainages was used to estimate inflow from ephemeral drainages totaling 11,700 acre-feet per year. The totaled estimate of perennial and ephemeral tributary inflows to Carson Valley is 37,600 acre-feet per year. Gaged perennial inflow is 27 percent of the total, ungaged perennial inflow is 42 percent, and ephemeral inflow is 31 percent. The estimate is 50 to 60 percent greater than three previous estimates, one of which was made for a larger area, and is similar to two other estimates made for larger areas. The combined uncertainty of the estimates totaled about 33 percent of the total inflow or about 12,000 acre-feet per year.
Report of the International Ice Patrol in the North Atlantic. Bulletin Number 76
1990-01-01
[Scanned text garbled in source. Recoverable content: the International Ice Patrol's iceberg-prediction current file was updated by Scobie and Schultz (1976), whose dynamic-height analysis resulted in a low estimate of the current; Figure C-2 shows Mean Dynamic Topography Relative to 1000 db (Scobie and Schultz, 1976); a following passage discusses previous changes to the 1979 current file.]
A selective-update affine projection algorithm with selective input vectors
NASA Astrophysics Data System (ADS)
Kong, NamWoong; Shin, JaeWook; Park, PooGyeon
2011-10-01
This paper proposes an affine projection algorithm (APA) with selective input vectors, which is based on the concept of selective update in order to reduce estimation errors and computations. The algorithm consists of two procedures: input-vector selection and state decision. The input-vector-selection procedure determines the number of input vectors by checking with the mean square error (MSE) whether the input vectors have enough information for an update. The state-decision procedure determines the current state of the adaptive filter by using the state-decision criterion. While the adaptive filter is in the transient state, the algorithm updates the filter coefficients with the selected input vectors. On the other hand, as soon as the adaptive filter reaches the steady state, the update procedure is not performed. Through these two procedures, the proposed algorithm achieves small steady-state estimation errors, low computational complexity and low update complexity for colored input signals.
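For reference, a standard affine projection update with a fixed number of input vectors is sketched below; the proposed algorithm would apply such an update only with the selected input vectors and only while the filter is judged to be in the transient state, and those selection and state-decision rules are not reproduced here. Parameter values are illustrative.

```python
import numpy as np

def apa_update(w, X, d, mu=0.5, delta=1e-4):
    """One affine projection update.
    X: (K, N) matrix whose rows are the K most recent input vectors,
    d: (K,) desired responses, w: (N,) current filter coefficients."""
    e = d - X @ w                                    # a priori errors
    w = w + mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(len(d)), e)
    return w, e

# Identify a short FIR system from colored (AR(1)) input data.
rng = np.random.default_rng(0)
N, K, steps = 8, 4, 2000
h = rng.standard_normal(N)                           # unknown system
u = np.zeros(steps)
for n in range(1, steps):
    u[n] = 0.9 * u[n - 1] + rng.standard_normal()    # colored input signal
w = np.zeros(N)
for n in range(N + K, steps):
    X = np.array([u[n - k - np.arange(N)] for k in range(K)])
    d = X @ h + 0.01 * rng.standard_normal(K)
    w, _ = apa_update(w, X, d)
print(np.round(w - h, 3))                            # should be near zero
```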
Updated Estimates of the Average Financial Return on Master's Degree Programs in the United States
ERIC Educational Resources Information Center
Gándara, Denisa; Toutkoushian, Robert K.
2017-01-01
In this study, we provide updated estimates of the private and social financial return on enrolling in a master's degree program in the United States. In addition to returns for all fields of study, we show estimated returns to enrolling in master's degree programs in business and education, specifically. We also conduct a sensitivity analysis to…
Parametric cost estimation for space science missions
NASA Astrophysics Data System (ADS)
Lillie, Charles F.; Thompson, Bruce E.
2008-07-01
Cost estimation for space science missions is critically important in budgeting for successful missions. The process requires consideration of a number of parameters, where many of the values are only known to a limited accuracy. The results of cost estimation are not perfect, but must be calculated and compared with the estimates that the government uses for budgeting purposes. Uncertainties in the input parameters result from evolving requirements for missions that are typically the "first of a kind" with "state-of-the-art" instruments and new spacecraft and payload technologies that make it difficult to base estimates on the cost histories of previous missions. Even the cost of heritage avionics is uncertain due to parts obsolescence and the resulting redesign work. Through experience and use of industry best practices developed in participation with the Aerospace Industries Association (AIA), Northrop Grumman has developed a parametric modeling approach that can provide a reasonably accurate cost range and most probable cost for future space missions. During the initial mission phases, the approach uses mass- and power-based cost estimating relationships (CERs) developed with historical data from previous missions. In later mission phases, when the mission requirements are better defined, these estimates are updated with vendors' bids and "bottom-up," "grass-roots" material and labor cost estimates based on detailed schedules and assigned tasks. In this paper we describe how we develop our CERs for parametric cost estimation and how they can be applied to estimate the costs for future space science missions like those presented to the Astronomy & Astrophysics Decadal Survey Study Committees.
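A hedged illustration of a mass-based power-law CER of the kind described follows: it fits cost against mass in log-log space using historical data, then applies the relationship to a new mission with a crude scatter-based range. The data points and coefficients are hypothetical and do not represent any actual proprietary model.

```python
import numpy as np

# Hypothetical historical data: payload dry mass (kg) vs. development cost ($M).
mass = np.array([150, 300, 520, 800, 1200], dtype=float)
cost = np.array([95, 160, 240, 330, 450], dtype=float)

# Fit a power-law CER, cost = a * mass**b, by regression in log-log space.
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)

new_mass = 650.0
point_estimate = a * new_mass**b
# Crude uncertainty range from the log-space residual scatter (illustrative only).
resid = np.log(cost) - (log_a + b * np.log(mass))
low, high = point_estimate * np.exp(-resid.std()), point_estimate * np.exp(resid.std())
print(f"CER: cost ~ {a:.1f} * mass^{b:.2f}; estimate ${point_estimate:.0f}M "
      f"(range ${low:.0f}M-${high:.0f}M)")
```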
Species longevity in North American fossil mammals.
Prothero, Donald R
2014-08-01
Species longevity in the fossil record is related to many paleoecological variables and is important to macroevolutionary studies, yet there are very few reliable data on average species durations in Cenozoic fossil mammals. Many of the online databases (such as the Paleobiology Database) use only genera of North American Cenozoic mammals and there are severe problems because key groups (e.g. camels, oreodonts, pronghorns and proboscideans) have no reliable updated taxonomy, with many invalid genera and species and/or many undescribed genera and species. Most of the published datasets yield species duration estimates of approximately 2.3-4.3 Myr for larger mammals, with small mammals tending to have shorter species durations. My own compilation of all the valid species durations in families with updated taxonomy (39 families, containing 431 genera and 998 species, averaging 2.3 species per genus) yields a mean duration of 3.21 Myr for larger mammals. This breaks down to 4.10-4.39 Myr for artiodactyls, 3.14-3.31 Myr for perissodactyls and 2.63-2.95 Myr for carnivorous mammals (carnivorans plus creodonts). These averages are based on a much larger, more robust dataset than most previous estimates, so they should be more reliable for any studies that need species longevity to be accurately estimated. © 2013 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.
Valence-Dependent Belief Updating: Computational Validation
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
Valence-Dependent Belief Updating: Computational Validation.
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments.
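The reinforcement-learning account with valence-dependent learning rates can be sketched as below, assuming a simple delta-rule update with a larger learning rate for better-than-expected base rates; the learning-rate values are illustrative, not the fitted estimates from the study.

```python
def update_self_risk(prior_risk, base_rate, lr_good=0.6, lr_bad=0.3):
    """Update a self-risk estimate toward the presented base rate, with a
    larger learning rate when the news is good (base rate below the prior)."""
    error = base_rate - prior_risk
    lr = lr_good if error < 0 else lr_bad   # good news: event less likely than believed
    return prior_risk + lr * error

# Good news trial: believed 40% risk, presented base rate 20%.
print(round(update_self_risk(0.40, 0.20), 3))   # larger update, toward 0.28
# Bad news trial: believed 40% risk, presented base rate 60%.
print(round(update_self_risk(0.40, 0.60), 3))   # smaller update, toward 0.46
```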
DOT National Transportation Integrated Search
The purpose of this report, "Working Paper National Costs of the Metropolitan ITS infrastructure: Updated with 2004 Deployment Data," is to update the estimates of the costs remaining to deploy Intelligent Transportation Systems (ITS) infrastructure ...
Updating CMAQ secondary organic aerosol properties relevant for aerosol water interactions
Properties of secondary organic aerosol (SOA) compounds in CMAQ are updated with state-of-the-science estimates from structure activity relationships to provide consistency among volatility, molecular weight, degree of oxygenation, and solubility/hygroscopicity. These updated pro...
Cadilhac, Dominique A; Carter, Rob; Thrift, Amanda G; Dewey, Helen M
2009-03-01
Stroke is associated with considerable societal costs. Cost-of-illness studies have been undertaken to estimate lifetime costs; most incorporating data up to 12 months after stroke. Costs of stroke, incorporating data collected up to 12 months, have previously been reported from the North East Melbourne Stroke Incidence Study (NEMESIS). NEMESIS now has patient-level resource use data for 5 years. We aimed to recalculate the long-term resource utilization of first-ever stroke patients and compare these to previous estimates obtained using data collected to 12 months. Population structure, life expectancy, and unit prices within the original cost-of-illness models were updated from 1997 to 2004. New Australian stroke survival and recurrence data up to 10 years were incorporated, as well as cross-sectional resource utilization data at 3, 4, and 5 years from NEMESIS. To enable comparisons, 1997 costs were inflated to 2004 prices and discounting was standardized. In 2004, 27 291 ischemic stroke (IS) and 4291 intracerebral hemorrhagic stroke (ICH) first-ever events were estimated. Average annual resource use after 12 months was AU$6022 for IS and AU$3977 for ICH. This is greater than the 1997 estimates for IS (AU$4848) and less than those for ICH (previously AU$10 692). The recalculated average lifetime costs per first-ever case differed for IS (AU$57 106 versus AU$52 855 [1997]), but differed more for ICH (AU$49 995 versus AU$92 308 [1997]). Basing lifetime cost estimates on short-term data overestimated the costs for ICH and underestimated those for IS. Patterns of resource use varied by stroke subtype and, overall, the societal cost impact was large.
Basis for the ICRP's updated biokinetic model for carbon inhaled as CO2
Leggett, Richard W.
2017-03-02
Here, the International Commission on Radiological Protection (ICRP) is updating its biokinetic and dosimetric models for occupational intake of radionuclides (OIR) in a series of reports called the OIR series. This paper describes the basis for the ICRP's updated biokinetic model for inhalation of radiocarbon as carbon dioxide (CO2) gas. The updated model is based on biokinetic data for carbon isotopes inhaled as carbon dioxide or injected or ingested as bicarbonate (HCO3-). The data from these studies are expected to apply equally to internally deposited (or internally produced) carbon dioxide and bicarbonate, based on comparison of excretion rates for the two administered forms and the fact that carbon dioxide and bicarbonate are largely carried in a common form (CO2-HCO3-) in blood. Compared with dose estimates based on current ICRP biokinetic models for inhaled carbon dioxide or ingested carbon, the updated model will result in a somewhat higher dose estimate for 14C inhaled as CO2 and a much lower dose estimate for 14C ingested as bicarbonate.
Basis for the ICRP's updated biokinetic model for carbon inhaled as CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leggett, Richard W.
Here, the International Commission on Radiological Protection (ICRP) is updating its biokinetic and dosimetric models for occupational intake of radionuclides (OIR) in a series of reports called the OIR series. This paper describes the basis for the ICRP's updated biokinetic model for inhalation of radiocarbon as carbon dioxide (CO2) gas. The updated model is based on biokinetic data for carbon isotopes inhaled as carbon dioxide or injected or ingested as bicarbonate (HCO3-). The data from these studies are expected to apply equally to internally deposited (or internally produced) carbon dioxide and bicarbonate, based on comparison of excretion rates for the two administered forms and the fact that carbon dioxide and bicarbonate are largely carried in a common form (CO2-HCO3-) in blood. Compared with dose estimates based on current ICRP biokinetic models for inhaled carbon dioxide or ingested carbon, the updated model will result in a somewhat higher dose estimate for 14C inhaled as CO2 and a much lower dose estimate for 14C ingested as bicarbonate.
Thiros, Susan A.
2006-01-01
This report evaluates the performance of a numerical model of the ground-water system in northern Utah Valley, Utah, that originally simulated ground-water conditions during 1947-1980 and was updated to include conditions estimated for 1981-2002. Estimates of annual recharge to the ground-water system and discharge from wells in the area were added to the original ground-water flow model of the area.The files used in the original transient-state model of the ground-water flow system in northern Utah Valley were imported into MODFLOW-96, an updated version of MODFLOW. The main model input files modified as part of this effort were the well and recharge files. Discharge from pumping wells in northern Utah Valley was estimated on an annual basis for 1981-2002. Although the amount of average annual withdrawals from wells has not changed much since the previous study, there have been changes in the distribution of well discharge in the area. Discharge estimates for flowing wells during 1981-2002 were assumed to be the same as those used in the last stress period of the original model because of a lack of new data. Variations in annual recharge were assumed to be proportional to changes in total surface-water inflow to northern Utah Valley. Recharge specified in the model during the additional stress periods varied from 255,000 acre-feet in 1986 to 137,000 acre-feet in 1992.The ability of the updated transient-state model to match hydrologic conditions determined for 1981-2002 was evaluated by comparing water-level changes measured in wells to those computed by the model. Water-level measurements made in February, March, or April were available for 39 wells in the modeled area during all or part of 1981-2003. In most cases, the magnitude and direction of annual water-level change from 1981 to 2002 simulated by the updated model reasonably matched the measured change. The greater-than-normal precipitation that occurred during 1982-84 resulted in period-of-record high water levels measured in many of the observation wells in March 1984. The model-computed water levels at the end of 1982-84 also are among the highest for the period. Both measured and computed water levels decreased during the period representing ground-water conditions from 1999 to 2002. Precipitation was less than normal during 1999-2002.The ability of the model to adequately simulate climatic extremes such as the wetter-than-normal conditions of 1982-84 and the drier-than-normal conditions of 1999-2002 indicates that the annual variation of recharge to the ground-water system based on streamflow entering the valley, which in turn is primarily dependent upon precipitation, is appropriate but can be improved. The updated transient-state model of the ground-water system in northern Utah Valley can be improved by making revisions on the basis of currently available data and information.
Adaptive tracking of a time-varying field with a quantum sensor
NASA Astrophysics Data System (ADS)
Bonato, Cristian; Berry, Dominic W.
2017-05-01
Sensors based on single spins can enable magnetic-field detection with very high sensitivity and spatial resolution. Previous work has concentrated on sensing of a constant magnetic field or a periodic signal. Here, we instead investigate the problem of estimating a field with nonperiodic variation described by a Wiener process. We propose and study, by numerical simulations, an adaptive tracking protocol based on Bayesian estimation. The tracking protocol updates the probability distribution for the magnetic field based on measurement outcomes and adapts the choice of sensing time and phase in real time. By taking the statistical properties of the signal into account, our protocol strongly reduces the required measurement time. This leads to a reduction of the error in the estimation of a time-varying signal by up to a factor of four compared with protocols that do not take this information into account.
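A grid-based sketch of the tracking idea follows: after each measurement the posterior over the field is updated by Bayes' rule and then convolved with a Gaussian kernel to account for the Wiener-process drift before the next measurement. The Ramsey-type likelihood, all parameter values, and the use of random (rather than adaptively chosen) sensing phases are simplifying assumptions for illustration.

```python
import numpy as np

# Grid over the field value B (arbitrary units) and an initially flat prior.
B = np.linspace(-1.0, 1.0, 401)
posterior = np.ones_like(B) / len(B)

def likelihood(outcome, tau, phase):
    """Assumed Ramsey-type likelihood: P(outcome=0 | B) = (1 + cos(B*tau + phase)) / 2."""
    p0 = 0.5 * (1.0 + np.cos(B * tau + phase))
    return p0 if outcome == 0 else 1.0 - p0

def bayes_update(post, outcome, tau, phase):
    post = post * likelihood(outcome, tau, phase)
    return post / post.sum()

def diffuse(post, sigma_w, dt):
    """Account for Wiener-process drift by convolving with a Gaussian kernel."""
    dB = B[1] - B[0]
    width = sigma_w * np.sqrt(dt)
    kernel = np.exp(-0.5 * (np.arange(-50, 51) * dB / width) ** 2)
    post = np.convolve(post, kernel / kernel.sum(), mode="same")
    return post / post.sum()

rng = np.random.default_rng(2)
B_true, sigma_w, tau, dt = 0.3, 0.05, 2.0, 1.0
for _ in range(30):
    B_true += sigma_w * np.sqrt(dt) * rng.standard_normal()   # field drifts
    phase = rng.uniform(0.0, 2.0 * np.pi)   # adaptive phase choice omitted for brevity
    outcome = 0 if rng.random() < 0.5 * (1.0 + np.cos(B_true * tau + phase)) else 1
    posterior = bayes_update(posterior, outcome, tau, phase)
    posterior = diffuse(posterior, sigma_w, dt)

estimate = float(np.sum(B * posterior))   # posterior-mean estimate of the field
print(round(estimate, 3), "vs true", round(B_true, 3))
```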
Updated energy budgets for neural computation in the neocortex and cerebellum
Howarth, Clare; Gleeson, Padraig; Attwell, David
2012-01-01
The brain's energy supply determines its information processing power, and generates functional imaging signals. The energy use on the different subcellular processes underlying neural information processing has been estimated previously for the grey matter of the cerebral and cerebellar cortex. However, these estimates need reevaluating following recent work demonstrating that action potentials in mammalian neurons are much more energy efficient than was previously thought. Using this new knowledge, this paper provides revised estimates for the energy expenditure on neural computation in a simple model for the cerebral cortex and a detailed model of the cerebellar cortex. In cerebral cortex, most signaling energy (50%) is used on postsynaptic glutamate receptors, 21% is used on action potentials, 20% on resting potentials, 5% on presynaptic transmitter release, and 4% on transmitter recycling. In the cerebellar cortex, excitatory neurons use 75% and inhibitory neurons 25% of the signaling energy, and most energy is used on information processing by non-principal neurons: Purkinje cells use only 15% of the signaling energy. The majority of cerebellar signaling energy use is on the maintenance of resting potentials (54%) and postsynaptic receptors (22%), while action potentials account for only 17% of the signaling energy use. PMID:22434069
DOT National Transportation Integrated Search
2006-07-01
The purpose of this report, "Working Paper National Costs of the Metropolitan ITS Infrastructure: Updated with 2005 Deployment Data," is to update the estimates of the costs remaining to fully deploy Intelligent Transportation Systems (ITS) infrastru...
Current sources of carbon tetrachloride (CCl4) in our atmosphere
NASA Astrophysics Data System (ADS)
Sherry, David; McCulloch, Archie; Liang, Qing; Reimann, Stefan; Newman, Paul A.
2018-02-01
Carbon tetrachloride (CCl4 or CTC) is an ozone-depleting substance whose emissive uses are controlled and practically banned by the Montreal Protocol (MP). Nevertheless, previous work estimated ongoing emissions of 35 Gg year-1 of CCl4 into the atmosphere from observation-based methods, in stark contrast to emissions estimates of 3 (0-8) Gg year-1 from reported numbers to UNEP under the MP. Here we combine information on sources from industrial production processes and legacy emissions from contaminated sites to provide an updated bottom-up estimate on current CTC global emissions of 15-25 Gg year-1. We now propose 13 Gg year-1 of global emissions from unreported non-feedstock emissions from chloromethane and perchloroethylene plants as the most significant CCl4 source. Additionally, 2 Gg year-1 are estimated as fugitive emissions from the usage of CTC as feedstock and possibly up to 10 Gg year-1 from legacy emissions and chlor-alkali plants.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Amino acid composition of rumen bacteria and protozoa in cattle.
Sok, M; Ouellet, D R; Firkins, J L; Pellerin, D; Lapierre, H
2017-07-01
Because microbial crude protein (MCP) constitutes more than 50% of the protein digested in cattle, its AA composition is needed to adequately estimate AA supply. Our objective was to update the AA contributions of the rumen microbial AA flowing to the duodenum using only studies from cattle, differentiating between fluid-associated bacteria (FAB), particle-associated bacteria (PAB), and protozoa, based on published literature (53, 16, and 18 treatment means were used for each type of microorganism, respectively). In addition, Cys and Met reported concentrations were retained only when an adequate protection of the sulfur groups was performed before the acid hydrolysis. The total AA (or true protein) fraction represented 82.4% of CP in bacteria. For 10 AA, including 4 essential AA, the AA composition differed between protozoa and bacteria. The most noticeable differences were a 45% lower Lys concentration and 40% higher Ala concentration in bacteria than in protozoa. Differences between FAB and PAB were less pronounced than differences between bacteria and protozoa. Assuming 33% FAB, 50% PAB, and 17% of protozoa in MCP duodenal flow, the updated concentrations of AA would decrease supply estimates of Met, Thr, and Val originating from MCP and increase those of Lys and Phe by 5 to 10% compared with those calculated using the FAB composition reported previously. Therefore, inclusion of the contribution of PAB and protozoa to the duodenal MCP flow is needed to adequately estimate AA supply from microbial origin when a factorial method is used to estimate duodenal AA flow. Furthermore, acknowledging the fact that hydrolysis of 1 kg of true microbial protein yields 1.16 kg of free AA substantially increases the estimates of AA supply from MCP. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
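The weighting step described above can be illustrated with a short calculation using the stated 33% FAB / 50% PAB / 17% protozoa mix and the 1.16 free-AA yield factor; the lysine concentrations used below are placeholders, not the paper's values.

```python
# Weighted microbial amino acid concentration using the stated duodenal-flow mix.
# Concentrations are placeholders (g AA / 100 g total AA), not the paper's values.
fractions = {"FAB": 0.33, "PAB": 0.50, "protozoa": 0.17}
lys_conc = {"FAB": 8.0, "PAB": 8.4, "protozoa": 11.9}   # hypothetical

lys_mixed = sum(fractions[m] * lys_conc[m] for m in fractions)
# Free-AA yield correction: 1 kg true protein hydrolyses to 1.16 kg free AA.
print(round(lys_mixed, 2), round(lys_mixed * 1.16, 2))
```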
Aircraft engine sensor fault diagnostics using an on-line OBEM update method.
Liu, Xiaofeng; Xue, Naiyu; Yuan, Ye
2017-01-01
This paper proposed a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) was incorporated. Generated from a rapid in-flight engine degradation, a large health condition mismatch between the engine and the OBEM can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when a rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and update are running simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on the turbojet engine Linear-Parameter Varying (LPV) model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault.
Aircraft engine sensor fault diagnostics using an on-line OBEM update method
Liu, Xiaofeng; Xue, Naiyu; Yuan, Ye
2017-01-01
This paper proposed a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) was incorporated. Generated from a rapid in-flight engine degradation, a large health condition mismatch between the engine and the OBEM can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when a rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and update are running simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on the turbojet engine Linear-Parameter Varying (LPV) model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault. PMID:28182692
Updating histological data on crown initiation and crown completion ages in southern Africans.
Reid, Donald J; Guatelli-Steinberg, Debbie
2017-04-01
To update histological data on crown initiation and completion ages in southern Africans. To evaluate implications of these data for studies that: (a) rely on these data to time linear enamel hypoplasias (LEHs), or, (b) use these data for comparison to fossil hominins. Initiation ages were calculated on 67 histological sections from southern Africans, with sample sizes ranging from one to 11 per tooth type. Crown completion ages for southern Africans were calculated in two ways. First, actual derived initiation ages were added to crown formation times for each histological section to obtain direct information on the crown completion ages of individuals. Second, average initiation ages from this study were added to average crown formation times of southern Africans from the Reid and coworkers previous studies that were based on larger samples. For earlier-initiating tooth types (all anterior teeth and first molars), there is little difference in ages of initiation and crown completion between this and previous studies. Differences increase as a function of initiation age, such that the greatest differences between this and previous studies for both initiation and crown completion ages are for the second and third molars. This study documents variation in initiation ages, particularly for later-initiating tooth types. It upholds the use of previously published histological aging charts for LEHs on anterior teeth. However, this study finds that ages of crown initiation and completion in second and third molars for this southern African sample are earlier than previously estimated. These earlier ages reduce differences between modern humans and fossil hominins for these developmental events in second and third molars. © 2017 Wiley Periodicals, Inc.
Sadatsafavi, Mohsen; Xie, Hui; Etminan, Mahyar; Johnson, Kate; FitzGerald, J Mark
2018-01-01
There is minimal evidence on the extent to which the occurrence of a severe acute exacerbation of COPD that results in hospitalization affects the subsequent disease course. Previous studies on this topic did not generate causally-interpretable estimates. Our aim was to use corrected methodology to update previously reported estimates of the associations between previous and future exacerbations in these patients. Using administrative health data in British Columbia, Canada (1997-2012), we constructed a cohort of patients with at least one severe exacerbation, defined as an episode of inpatient care with the main diagnosis of COPD based on international classification of diseases (ICD) codes. We applied a random-effects 'joint frailty' survival model that is particularly developed for the analysis of recurrent events in the presence of competing risk of death and heterogeneity among individuals in their rate of events. Previous severe exacerbations entered the model as dummy-coded time-dependent covariates, and the model was adjusted for several observable patient and disease characteristics. 35,994 individuals (mean age at baseline 73.7, 49.8% female, average follow-up 3.21 years) contributed 34,271 severe exacerbations during follow-up. The first event was associated with a hazard ratio (HR) of 1.75 (95%CI 1.69-1.82) for the risk of future severe exacerbations. This risk decreased to HR = 1.36 (95%CI 1.30-1.42) for the second event and to 1.18 (95%CI 1.12-1.25) for the third event. The first two severe exacerbations that occurred during follow-up were also significantly associated with increased risk of all-cause mortality. There was substantial heterogeneity in the individual-specific rate of severe exacerbations. Even after adjusting for observable characteristics, individuals in the 97.5th percentile of exacerbation rate had 5.6 times higher rate of severe exacerbations than those in the 2.5th percentile. Using robust statistical methodology that controlled for heterogeneity in exacerbation rates among individuals, we demonstrated potential causal associations among past and future severe exacerbations, albeit the magnitude of association was noticeably lower than previously reported. The prevention of severe exacerbations has the potential to modify the disease trajectory.
Economic Costs of Alcohol and Drug Abuse in Texas: 1997 Update.
ERIC Educational Resources Information Center
Liu, Liang Y.
This report provides an update of the costs of alcohol and drug abuse for 1997. The 1997 costs were estimated by multiplying the percent changes in various socioeconomic factors from 1989 to 1997 by the cost estimates. The adverse health and social consequences of substance abuse extensively increased costs to the state. The total economic costs…
Steenland, Kyle; Burnett, Carol; Lalich, Nina; Ward, Elizabeth; Hurrell, Joseph
2003-05-01
Deaths due to occupational disease and injury place a heavy burden on society in terms of economic costs and human suffering. We estimate the annual deaths due to selected diseases for which an occupational association is reasonably well established and quantifiable, by calculation of attributable fractions (AFs), with full documentation; the deaths due to occupational injury are then added to derive an estimated number of annual deaths due to occupation. Using 1997 US mortality data, the estimated annual burden of occupational disease mortality resulting from selected respiratory diseases, cancers, cardiovascular disease, chronic renal failure, and hepatitis is 49,000, with a range from 26,000 to 72,000. The Bureau of Labor Statistics estimates there are about 6,200 work-related injury deaths annually. Adding disease and injury data, we estimate that there are a total of 55,200 US deaths annually resulting from occupational disease or injury (range 32,200-78,200). Our estimate is in the range reported by previous investigators, although we have restricted ourselves more than others to only those diseases with well-established occupational etiology, biasing our estimates conservatively. The underlying assumptions and data used to generate the estimates are well documented, so our estimates may be updated as new data emerges on occupational risks and exposed populations, providing an advantage over previous studies. We estimate that occupational deaths are the 8th leading cause of death in the US, after diabetes (64,751) but ahead of suicide (30,575), and greater than the annual number of motor vehicle deaths per year (43,501). Copyright 2003 Wiley-Liss, Inc.
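The attributable-fraction arithmetic underlying such estimates is straightforward to reproduce; the sketch below applies Levin's formula AF = p(RR-1)/[p(RR-1)+1] to clearly hypothetical inputs (not the study's values) and scales the observed deaths by the resulting fraction.

```python
# Hedged sketch of the attributable-fraction (AF) calculation used to apportion
# deaths to occupational exposure. All inputs here are illustrative placeholders.

def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# Hypothetical example: 10% of the population exposed, RR = 1.4,
# 100,000 total deaths observed from the disease in question.
af = attributable_fraction(prevalence=0.10, relative_risk=1.4)
occupational_deaths = af * 100_000
print(f"AF = {af:.3f}, attributable deaths ~ {occupational_deaths:,.0f}")
```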
How unrealistic optimism is maintained in the face of reality.
Sharot, Tali; Korn, Christoph W; Dolan, Raymond J
2011-10-09
Unrealistic optimism is a pervasive human trait that influences domains ranging from personal relationships to politics and finance. How people maintain unrealistic optimism, despite frequently encountering information that challenges those biased beliefs, is unknown. We examined this question and found a marked asymmetry in belief updating. Participants updated their beliefs more in response to information that was better than expected than to information that was worse. This selectivity was mediated by a relative failure to code for errors that should reduce optimism. Distinct regions of the prefrontal cortex tracked estimation errors when those called for positive update, both in individuals who scored high and low on trait optimism. However, highly optimistic individuals exhibited reduced tracking of estimation errors that called for negative update in right inferior prefrontal gyrus. These findings indicate that optimism is tied to a selective update failure and diminished neural coding of undesirable information regarding the future.
Renewable Hydrogen Potential from Biogas in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saur, G.; Milbrandt, A.
This analysis updates and expands upon previous biogas studies to include total potential and net availability of methane in raw biogas with respect to competing demands and includes a resource assessment of four sources of biogas: (1) wastewater treatment plants, including domestic and a new assessment of industrial sources; (2) landfills; (3) animal manure; and (4) a new assessment of industrial, institutional, and commercial sources. The results of the biogas resource assessment are used to estimate the potential production of renewable hydrogen from biogas as well as the fuel cell electric vehicles that the produced hydrogen might support.
Calibrating random forests for probability estimation.
Dankowski, Theresa; Ziegler, Andreas
2016-09-30
Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
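For readers who want the mechanics, the sketch below illustrates the two ideas in generic form: a Bayes prior-shift correction of the kind commonly attributed to Elkan, and a logistic re-calibration fitted on the logit of the original predictions. Variable names, the single-feature recalibration, and all numbers are simplifications and assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code) of two ways to update predicted
# probabilities for a new center or time point.
import numpy as np
from sklearn.linear_model import LogisticRegression

def elkan_prior_shift(p, base_rate_old, base_rate_new):
    """Re-weight probabilities for a changed outcome prevalence (Bayes rule on
    the odds). Assumes the class-conditional densities are unchanged."""
    p = np.clip(np.asarray(p, dtype=float), 1e-12, 1 - 1e-12)
    odds = p / (1 - p)
    odds *= (base_rate_new / (1 - base_rate_new)) * ((1 - base_rate_old) / base_rate_old)
    return odds / (1 + odds)

def logistic_recalibration(p_new_center, y_new_center):
    """Fit a logistic model on the logit of the old predictions using outcomes
    observed at the new center; returns a correction function."""
    q = np.clip(np.asarray(p_new_center, dtype=float), 1e-12, 1 - 1e-12)
    lr = LogisticRegression().fit(np.log(q / (1 - q)).reshape(-1, 1), y_new_center)
    def recalibrate(p):
        p = np.clip(np.asarray(p, dtype=float), 1e-12, 1 - 1e-12)
        return lr.predict_proba(np.log(p / (1 - p)).reshape(-1, 1))[:, 1]
    return recalibrate

# Usage with hypothetical numbers: shift from 20% to 35% outcome prevalence.
p_old = np.array([0.1, 0.4, 0.7])
print(elkan_prior_shift(p_old, base_rate_old=0.20, base_rate_new=0.35))
```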
America's Children and the Environment, Third Edition ...
America's Children and the Environment is the U.S. EPA's report of children's environmental health indicators. Two editions of the report have been published, in 2000 and 2003, and a website is maintained with updated values for the indicators. The new Third Edition of America's Children and the Environment incorporates updates and revisions to previous content as well as several new indicators.
LESS: Link Estimation with Sparse Sampling in Intertidal WSNs
Ji, Xiaoyu; Chen, Yi-chao; Li, Xiaopeng; Xu, Wenyuan
2018-01-01
Deploying wireless sensor networks (WSN) in the intertidal area is an effective approach for environmental monitoring. To sustain reliable data delivery in such a dynamic environment, a link quality estimation mechanism is crucial. However, our observations in two real WSN systems deployed in the intertidal areas reveal that link update in routing protocols often suffers from energy and bandwidth waste due to the frequent link quality measurement and updates. In this paper, we carefully investigate the network dynamics using real-world sensor network data and find it feasible to achieve accurate estimation of link quality using sparse sampling. We design and implement a compressive-sensing-based link quality estimation protocol, LESS, which incorporates both spatial and temporal characteristics of the system to aid the link update in routing protocols. We evaluate LESS in both real WSN systems and a large-scale simulation, and the results show that LESS can reduce energy and bandwidth consumption by up to 50% while still achieving more than 90% link quality estimation accuracy. PMID:29494557
Thermodynamic characterization of tandem mismatches found in naturally occurring RNA
Christiansen, Martha E.; Znosko, Brent M.
2009-01-01
Although all sequence symmetric tandem mismatches and some sequence asymmetric tandem mismatches have been thermodynamically characterized and a model has been proposed to predict the stability of previously unmeasured sequence asymmetric tandem mismatches [Christiansen,M.E. and Znosko,B.M. (2008) Biochemistry, 47, 4329–4336], experimental thermodynamic data for frequently occurring tandem mismatches is lacking. Since experimental data is preferred over a predictive model, the thermodynamic parameters for 25 frequently occurring tandem mismatches were determined. These new experimental values, on average, are 1.0 kcal/mol different from the values predicted for these mismatches using the previous model. The data for the sequence asymmetric tandem mismatches reported here were then combined with the data for 72 sequence asymmetric tandem mismatches that were published previously, and the parameters used to predict the thermodynamics of previously unmeasured sequence asymmetric tandem mismatches were updated. The average absolute difference between the measured values and the values predicted using these updated parameters is 0.5 kcal/mol. This updated model improves the prediction for tandem mismatches that were predicted rather poorly by the previous model. This new experimental data and updated predictive model allow for more accurate calculations of the free energy of RNA duplexes containing tandem mismatches, and, furthermore, should allow for improved prediction of secondary structure from sequence. PMID:19509311
NASA Technical Reports Server (NTRS)
Wier, C. E.; Wobber, F. J.; Russell, O. R.; Martin, K. R. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Mined land reclamation analysis procedures developed within the Indiana portion of the Illinois Coal Basin were independently tested in Ohio utilizing 1:80,000 scale enlargements of ERTS-1 image 1029-15361-7 (dated August 21, 1972). An area in Belmont County was selected for analysis due to the extensive surface mining and the different degrees of reclamation occurring in this area. Contour mining in this area provided the opportunity to extend techniques developed for analysis of relatively flat mining areas in Indiana to areas of rolling topography in Ohio. The analysts had no previous experience in the area. Field investigations largely confirmed office analysis results although in a few areas estimates of vegetation percentages were found to be too high. In one area this error approximated 25%. These results suggest that systematic ERTS-1 analysis in combination with selective field sampling can provide reliable vegetation percentage estimates in excess of 25% accuracy with minimum equipment investment and training. The utility of ERTS-1 for practical and reasonably reliable update of mined lands information for groups with budget limitations is suggested. Many states can benefit from low cost updates using ERTS-1 imagery from public sources.
Doulamis, A; Doulamis, N; Ntalianis, K; Kollias, S
2003-01-01
In this paper, an unsupervised video object (VO) segmentation and tracking algorithm is proposed based on an adaptable neural-network architecture. The proposed scheme comprises: 1) a VO tracking module and 2) an initial VO estimation module. Object tracking is handled as a classification problem and implemented through an adaptive network classifier, which provides better results compared to conventional motion-based tracking algorithms. Network adaptation is accomplished through an efficient and cost-effective weight updating algorithm, providing minimal degradation of the previous network knowledge and taking into account the current content conditions. A retraining set is constructed and used for this purpose based on initial VO estimation results. Two different scenarios are investigated. The first concerns extraction of human entities in video conferencing applications, while the second exploits depth information to identify generic VOs in stereoscopic video sequences. Human face/body detection based on Gaussian distributions is accomplished in the first scenario, while segmentation fusion is obtained using color and depth information in the second scenario. A decision mechanism is also incorporated to detect time instances for weight updating. Experimental results and comparisons indicate the good performance of the proposed scheme even in sequences with complicated content (object bending, occlusion).
Updates Technologies of Media Change
ERIC Educational Resources Information Center
Comer, Joshua
2015-01-01
Whether as status notifications in news feeds or interactive prompts in online video services, updates punctuate the background routines of media by bringing a variety of changes to the attention of users. In this dissertation I argue that updates rationalize media change by making previously obscure actions of users and movements of technologies…
Spatial Updating of Environments Described in Texts
ERIC Educational Resources Information Center
Avraamides, Marios N.
2003-01-01
People update egocentric spatial relations in an effortless and on-line manner when they move in the environment, but not when they only imagine themselves moving. In contrast to previous studies, the present experiments examined egocentric updating with spatial scenes that were encoded linguistically instead of perceived directly. Experiment 1…
The four-dimensional data assimilation (FDDA) technique in the Weather Research and Forecasting (WRF) meteorological model has recently undergone an important update from the original version. Previous evaluation results have demonstrated that the updated FDDA approach in WRF pr...
Comprehensive Thematic T-matrix Reference Database: a 2013-2014 Update
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Zakharova, Nadezhda T.; Khlebtsov, Nikolai G.; Wriedt, Thomas; Videen, Gorden
2014-01-01
This paper is the sixth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists several earlier publications not incorporated in the original database and previous updates.
Bias Corrections for Regional Estimates of the Time-averaged Geomagnetic Field
NASA Astrophysics Data System (ADS)
Constable, C.; Johnson, C. L.
2009-05-01
We assess two sources of bias in the time-averaged geomagnetic field (TAF) and paleosecular variation (PSV): inadequate temporal sampling, and the use of unit vectors in deriving temporal averages of the regional geomagnetic field. For the first question (temporal sampling), we use statistical resampling of existing data sets to minimize and correct for bias arising from uneven temporal sampling in studies of the TAF and its PSV. The techniques are illustrated using data derived from Hawaiian lava flows for 0-5 Ma: directional observations are an updated version of a previously published compilation of paleomagnetic directional data centered on ±20° latitude by Lawrence et al. (2006); intensity data are drawn from Tauxe & Yamazaki (2007). We conclude that poor temporal sampling can produce biased estimates of TAF and PSV, and resampling to an appropriate statistical distribution of ages reduces this bias. We suggest that similar resampling should be attempted as a bias correction for all regional paleomagnetic data to be used in TAF and PSV modeling. The second potential source of bias is the use of directional data in place of full vector data to estimate the average field. This is investigated for the full vector subset of the updated Hawaiian data set. Lawrence, K.P., C.G. Constable, and C.L. Johnson, 2006, Geochem. Geophys. Geosyst., 7, Q07007, DOI 10.1029/2005GC001181. Tauxe, L., & Yamazaki, 2007, Treatise on Geophysics, 5, Geomagnetism, Elsevier, Amsterdam, Chapter 13, p. 509.
Walking Distance Estimation Using Walking Canes with Inertial Sensors
Suh, Young Soo
2018-01-01
A walking distance estimation algorithm for cane users is proposed using an inertial sensor unit attached to various positions on the cane. A standard inertial navigation algorithm using an indirect Kalman filter was applied to update the velocity and position of the cane during movement. For quadripod canes, a standard zero-velocity measurement-updating method is proposed. For standard canes, a velocity-updating method based on an inverted pendulum model is proposed. The proposed algorithms were verified by three walking experiments with two different types of canes and different positions of the sensor module. PMID:29342971
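The zero-velocity update mentioned for quadripod canes is a Kalman pseudo-measurement of v = 0, applied whenever the cane is detected to be stationary, which arrests the drift of integrated acceleration. A toy one-dimensional version (not the paper's full indirect Kalman filter or the inverted-pendulum variant) is sketched below; the noise settings and acceleration profile are assumptions.

```python
# Toy 1-D zero-velocity update (ZUPT): integrate acceleration to get velocity and
# position, and apply a Kalman update with the pseudo-measurement v = 0 whenever
# a stationary phase is detected. Illustrative values only.
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])          # state: [position, velocity]
Q = np.diag([1e-6, 1e-3])                      # process noise (assumed)
H = np.array([[0.0, 1.0]])                     # pseudo-measurement observes velocity
R = np.array([[1e-4]])                         # pseudo-measurement noise (assumed)

x, P = np.zeros(2), np.eye(2) * 1e-3

def predict(x, P, accel):
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    return x, F @ P @ F.T + Q

def zupt_update(x, P):
    y = np.zeros(1) - H @ x                    # innovation against v = 0
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(2) - K @ H) @ P

for k in range(1000):
    stationary = (k % 200) < 50                # hypothetical stance detection
    x, P = predict(x, P, 0.0 if stationary else 0.2)
    if stationary:
        x, P = zupt_update(x, P)
print("estimated distance:", x[0])
```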
Robust double gain unscented Kalman filter for small satellite attitude estimation
NASA Astrophysics Data System (ADS)
Cao, Lu; Yang, Weiwei; Li, Hengnian; Zhang, Zhidong; Shi, Jianjun
2017-08-01
Limited by the low precision of small satellite sensors, the estimation theories with high performance remains the most popular research topic for the attitude estimation. The Kalman filter (KF) and its extensions have been widely applied in the satellite attitude estimation and achieved plenty of achievements. However, most of the existing methods just take use of the current time-step's priori measurement residuals to complete the measurement update and state estimation, which always ignores the extraction and utilization of the previous time-step's posteriori measurement residuals. In addition, the uncertainty model errors always exist in the attitude dynamic system, which also put forward the higher performance requirements for the classical KF in attitude estimation problem. Therefore, the novel robust double gain unscented Kalman filter (RDG-UKF) is presented in this paper to satisfy the above requirements for the small satellite attitude estimation with the low precision sensors. It is assumed that the system state estimation errors can be exhibited in the measurement residual; therefore, the new method is to derive the second Kalman gain Kk2 for making full use of the previous time-step's measurement residual to improve the utilization efficiency of the measurement data. Moreover, the sequence orthogonal principle and unscented transform (UT) strategy are introduced to robust and enhance the performance of the novel Kalman Filter in order to reduce the influence of existing uncertainty model errors. Numerical simulations show that the proposed RDG-UKF is more effective and robustness in dealing with the model errors and low precision sensors for the attitude estimation of small satellite by comparing with the classical unscented Kalman Filter (UKF).
Source Update Capture in Information Agents
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Kulkarni, Deepak; Wang, Yao
2003-01-01
In this paper we present strategies for successfully capturing updates at Web sources. Web-based information agents provide integrated access to autonomous Web sources that can get updated. For many information agent applications we are interested in knowing when a Web source to which the application provides access, has been updated. We may also be interested in capturing all the updates at a Web source over a period of time i.e., detecting the updates and, for each update retrieving and storing the new version of data. Previous work on update and change detection by polling does not adequately address this problem. We present strategies for intelligently polling a Web source for efficiently capturing changes at the source.
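One inexpensive way to poll a source and capture each new version is to use HTTP conditional requests, so that unchanged pages return 304 Not Modified and cost little. The sketch below illustrates that generic idea (it is not the strategy proposed in the paper); the URL and polling settings are placeholders.

```python
# Minimal polling sketch using HTTP conditional GET (ETag / 304 Not Modified)
# to detect and capture updates at a Web source. Illustrative only.
import time
import requests

def capture_updates(url: str, interval_s: float = 600.0, polls: int = 10):
    etag = None
    versions = []                                  # each captured version of the data
    for _ in range(polls):
        headers = {"If-None-Match": etag} if etag else {}
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 200:                # changed (or first poll): store it
            etag = resp.headers.get("ETag")
            versions.append((time.time(), resp.text))
        elif resp.status_code == 304:              # unchanged since last poll
            pass
        time.sleep(interval_s)
    return versions

# versions = capture_updates("https://example.org/data", interval_s=3600)
```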
Landform partitioning and estimates of deep storage of soil organic matter in Zackenberg, Greenland
NASA Astrophysics Data System (ADS)
Palmtag, Juri; Cable, Stefanie; Christiansen, Hanne H.; Hugelius, Gustaf; Kuhry, Peter
2018-05-01
Soils in the northern high latitudes are a key component in the global carbon cycle, with potential feedback on climate. This study aims to improve the previous soil organic carbon (SOC) and total nitrogen (TN) storage estimates for the Zackenberg area (NE Greenland) that were based on a land cover classification (LCC) approach, by using geomorphological upscaling. In addition, novel organic carbon (OC) estimates for deeper alluvial and deltaic deposits (down to 300 cm depth) are presented. We hypothesise that landforms will better represent the long-term slope and depositional processes that result in deep SOC burial in this type of mountain permafrost environment. The updated mean SOC storage for the 0-100 cm soil depth is 4.8 kg C m⁻², which is 42% lower than the previous estimate of 8.3 kg C m⁻² based on land cover upscaling. Similarly, the mean soil TN storage in the 0-100 cm depth decreased by 44%, from 0.50 kg (±0.1 CI) to 0.28 (±0.1 CI) kg TN m⁻². We ascribe the differences to a previous areal overestimate of SOC- and TN-rich vegetated land cover classes. The landform-based approach more correctly constrains the depositional areas in alluvial fans and deltas with high SOC and TN storage. These are also areas of deep carbon storage with an additional 2.4 kg C m⁻² in the 100-300 cm depth interval. This research emphasises the need to consider geomorphology when assessing SOC pools in mountain permafrost landscapes.
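Geomorphological (landform-based) upscaling reduces to an area-weighted average of per-landform storage densities; the sketch below shows the arithmetic with hypothetical landform fractions and SOC densities, not the study's values.

```python
# Area-weighted (landform-based) upscaling of soil organic carbon storage.
# Landform classes, areal fractions and per-class SOC densities are hypothetical.
landforms = {
    # class: (fraction of study area, mean SOC 0-100 cm in kg C m^-2)
    "alluvial fan":        (0.10, 9.5),
    "delta":               (0.05, 11.0),
    "solifluction slope":  (0.35, 4.0),
    "bedrock/ridge":       (0.50, 2.0),
}

mean_soc = sum(frac * soc for frac, soc in landforms.values())
print(f"Landscape mean SOC (0-100 cm): {mean_soc:.1f} kg C m^-2")
```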
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-25
... dead discards. We will adjust the quotas in the final rule based on updated data, including dead... quota, the sum of updated landings data (from late reports) and dead discard estimates would need to reach or exceed 475 mt dw. In 2011, dead discards were estimated to equal 101.5 mt dw and late reports...
Kurtz, Steven M; Ong, Kevin L; Lau, Edmund; Bozic, Kevin J
2014-04-16
Few studies have explored the role of the National Health Expenditure and macroeconomics on the utilization of total joint replacement. The economic downturn has raised questions about the sustainability of growth for total joint replacement in the future. Previous projections of total joint replacement demand in the United States were based on data up to 2003 using a statistical methodology that neglected macroeconomic factors, such as the National Health Expenditure. Data from the Nationwide Inpatient Sample (1993 to 2010) were used with United States Census and National Health Expenditure data to quantify historical trends in total joint replacement rates, including the two economic downturns in the 2000s. Primary and revision hip and knee arthroplasty were identified using codes from the International Classification of Diseases, Ninth Revision, Clinical Modification. Projections in total joint replacement were estimated using a regression model incorporating the growth in population and rate of arthroplasties from 1993 to 2010 as a function of age, sex, race, and census region using the National Health Expenditure as the independent variable. The regression model was used in conjunction with government projections of National Health Expenditure from 2011 to 2021 to estimate future arthroplasty rates in subpopulations of the United States and to derive national estimates. The growth trend for the incidence of joint arthroplasty, for the overall United States population as well as for the United States workforce, was insensitive to economic downturns. From 2009 to 2010, the total number of procedures increased by 6.0% for primary total hip arthroplasty, 6.1% for primary total knee arthroplasty, 10.8% for revision total hip arthroplasty, and 13.5% for revision total knee arthroplasty. The National Health Expenditure model projections for primary hip replacement in 2020 were higher than a previously projected model, whereas the current model estimates for total knee arthroplasty were lower. Economic downturns in the 2000s did not substantially influence the national growth trends for hip and knee arthroplasty in the United States. These latest updated projections provide a basis for surgeons, hospitals, payers, and policy makers to plan for the future demand for total joint replacement surgery.
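As a rough illustration of projecting procedure counts from the National Health Expenditure (NHE), the sketch below fits a Poisson regression with a population offset and applies it to a projected NHE value. The model form and every number are assumptions for illustration, not the authors' specification or data.

```python
# Hedged sketch (not the authors' model): Poisson regression of procedure counts
# on National Health Expenditure (NHE) with a population offset, then projection.
import numpy as np
import statsmodels.api as sm

years = np.arange(1993, 2011)
nhe = np.linspace(0.9, 2.6, years.size)             # NHE, trillion $ (hypothetical)
population = np.linspace(258.0, 309.0, years.size)  # millions (hypothetical)
counts = np.random.default_rng(1).poisson(population * (50 + 120 * nhe))

X = sm.add_constant(nhe)
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(population)).fit()

X_2021 = np.array([[1.0, 3.8]])                     # hypothetical 2021 NHE
projection = fit.predict(X_2021, offset=np.log([333.0]))  # hypothetical 2021 population
print(f"Projected procedures in 2021: {projection[0]:,.0f}")
```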
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dirian, Yves; Foffa, Stefano; Kunz, Martin
We present a comprehensive and updated comparison with cosmological observations of two non-local modifications of gravity previously introduced by our group, the so-called RR and RT models. We implement the background evolution and the cosmological perturbations of the models in a modified Boltzmann code, using CLASS. We then test the non-local models against the Planck 2015 TT, TE, EE and Cosmic Microwave Background (CMB) lensing data, isotropic and anisotropic Baryonic Acoustic Oscillations (BAO) data, JLA supernovae, H₀ measurements and growth rate data, and we perform Bayesian parameter estimation. We then compare the RR, RT and ΛCDM models, using the Savage-Dickey method. We find that the RT model and ΛCDM perform equally well, while the performance of the RR model with respect to ΛCDM depends on whether or not we include a prior on H₀ based on local measurements.
An ocean data assimilation system and reanalysis of the World Ocean hydrophysical fields
NASA Astrophysics Data System (ADS)
Zelenko, A. A.; Vil'fand, R. M.; Resnyanskii, Yu. D.; Strukov, B. S.; Tsyrulnikov, M. D.; Svirenko, P. I.
2016-07-01
A new version of the ocean data assimilation system (ODAS) developed at the Hydrometcentre of Russia is presented. The assimilation is performed following the sequential scheme analysis-forecast-analysis. The main components of the ODAS are procedures for operational observation data processing, a variational analysis scheme, and an ocean general circulation model used to estimate the first guess fields involved in the analysis. In situ observations of temperature and salinity in the upper 1400-m ocean layer obtained from various observational platforms are used as input data. In the new ODAS version, the horizontal resolution of the assimilating model and of the output products is increased, the previous 2D-Var analysis scheme is replaced by a more general 3D-Var scheme, and a more flexible incremental analysis updating procedure is introduced to correct the model calculations. A reanalysis of the main World Ocean hydrophysical fields over the 2005-2015 period has been performed using the updated ODAS. The reanalysis results are compared with data from independent sources.
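Incremental analysis updating spreads the analysis increment over the model integration rather than inserting it in one step, which suppresses initialization shocks. A schematic of the idea (not the Hydrometcentre implementation) follows; the dynamics and window length are placeholders.

```python
# Schematic incremental analysis updating (IAU): instead of adding the full
# analysis increment (x_analysis - x_background) at a single time step, the model
# adds it in equal fractions as a forcing term over an assimilation window.
import numpy as np

def model_step(x, dt=1.0):
    return x * (1.0 - 0.01 * dt)              # placeholder dynamics (simple decay)

def iau_window(x_background, x_analysis, n_steps=24):
    increment = x_analysis - x_background
    x = x_background.copy()
    for _ in range(n_steps):
        x = model_step(x) + increment / n_steps   # gradual, shock-free correction
    return x

x_b = np.array([10.0, 5.0])                    # background (first-guess) state
x_a = np.array([11.0, 4.5])                    # variational analysis state
print(iau_window(x_b, x_a))
```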
Energy data sourcebook for the US residential sector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, T.P.; Koomey, J.G.; Sanchez, M.
Analysts assessing policies and programs to improve energy efficiency in the residential sector require disparate input data from a variety of sources. This sourcebook, which updates a previous report, compiles these input data into a single location. The data provided include information on end-use unit energy consumption (UEC) values of appliances and equipment efficiency; historical and current appliance and equipment market shares; appliances and equipment efficiency and sales trends; appliance and equipment efficiency standards; cost vs. efficiency data for appliances and equipment; product lifetime estimates; thermal shell characteristics of buildings; heating and cooling loads; shell measure cost data for new and retrofit buildings; baseline housing stocks; forecasts of housing starts; and forecasts of energy prices and other economic drivers. This report is the essential sourcebook for policy analysts interested in residential sector energy use. The report can be downloaded from the Web at http://enduse.lbl.gov/Projects/RED.html. Future updates to the report, errata, and related links, will also be posted at this address.
Unthank, Michael D.
2013-01-01
The Ohio River alluvial aquifer near Carrollton, Ky., is an important water resource for the cities of Carrollton and Ghent, as well as for several industries in the area. The groundwater of the aquifer is the primary source of drinking water in the region and a highly valued natural resource that attracts various water-dependent industries because of its quantity and quality. This report evaluates the performance of a numerical model of the groundwater-flow system in the Ohio River alluvial aquifer near Carrollton, Ky., published by the U.S. Geological Survey in 1999. The original model simulated conditions in November 1995 and was updated to simulate groundwater conditions estimated for September 2010. The files from the calibrated steady-state model of November 1995 conditions were imported into MODFLOW-2005 to update the model to conditions in September 2010. The model input files modified as part of this update were the well and recharge files. The design of the updated model and other input files are the same as the original model. The ability of the updated model to match hydrologic conditions for September 2010 was evaluated by comparing water levels measured in wells to those computed by the model. Water-level measurements were available for 48 wells in September 2010. Overall, the updated model underestimated the water levels at 36 of the 48 measured wells. The average difference between measured water levels and model-computed water levels was 3.4 feet and the maximum difference was 10.9 feet. The root-mean-square error of the simulation was 4.45 for all 48 measured water levels. The updated steady-state model could be improved by introducing more accurate and site-specific estimates of selected field parameters, refined model geometry, and additional numerical methods. Collection of field data to better estimate hydraulic parameters, together with continued review of available data and information from area well operators, could provide the model with revised estimates of conductance values for the riverbed and valley wall, hydraulic conductivities for the model layer, and target water levels for future simulations. Additional model layers, a redesigned model grid, and revised boundary conditions could provide a better framework for more accurate simulations. Additional numerical methods would identify possible parameter estimates and determine parameter sensitivities.
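The fit statistics quoted above (mean difference, maximum difference, and root-mean-square error between measured and model-computed water levels) are simple to compute; a short sketch with hypothetical values follows.

```python
# Compare measured and model-computed water levels (hypothetical values, in feet).
import numpy as np

measured = np.array([420.1, 418.7, 422.3, 419.5])
simulated = np.array([417.0, 416.2, 420.0, 418.8])

residuals = measured - simulated
print("mean difference (ft):", residuals.mean().round(2))
print("max difference (ft):", np.abs(residuals).max().round(2))
print("RMSE (ft):", np.sqrt((residuals ** 2).mean()).round(2))
```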
Neural basis for dynamic updating of object representation in visual working memory.
Takahama, Sachiko; Miyauchi, Satoru; Saiki, Jun
2010-02-15
In the real world, objects have multiple features and change dynamically. Thus, object representations must satisfy dynamic updating and feature binding. Previous studies have investigated the neural activity of dynamic updating or feature binding alone, but not both simultaneously. We investigated the neural basis of feature-bound object representation in a dynamically updating situation by conducting a multiple object permanence tracking task, which required observers to simultaneously process both the maintenance and dynamic updating of feature-bound objects. Using an event-related design, we separated activities during memory maintenance and change detection. In the search for regions showing selective activation in dynamic updating of feature-bound objects, we identified a network during memory maintenance that was comprised of the inferior precentral sulcus, superior parietal lobule, and middle frontal gyrus. In the change detection period, various prefrontal regions, including the anterior prefrontal cortex, were activated. In updating object representation of dynamically moving objects, the inferior precentral sulcus closely cooperates with a so-called "frontoparietal network", and subregions of the frontoparietal network can be decomposed into those sensitive to spatial updating and feature binding. The anterior prefrontal cortex identifies changes in object representation by comparing memory and perceptual representations rather than maintaining object representations per se, as previously suggested. Copyright 2009 Elsevier Inc. All rights reserved.
View Estimation Based on Value System
NASA Astrophysics Data System (ADS)
Takahashi, Yasutake; Shimada, Kouki; Asada, Minoru
Estimation of a caregiver's view is one of the most important capabilities for a child to understand the behavior demonstrated by the caregiver, that is, to infer the intention of the behavior and/or to learn the observed behavior efficiently. We hypothesize that the child develops this ability in the same way as behavior learning motivated by an intrinsic reward: he/she updates his/her own view-estimation model while imitating the behavior observed from the caregiver, based on minimizing the estimation error of the reward during the behavior. From this view, this paper shows a method for acquiring such a capability based on a value system from which values can be obtained by reinforcement learning. The parameters of the view estimation are updated based on the temporal difference error (hereafter TD error: the estimation error of the state value), analogous to the way the parameters of the state value of the behavior are updated based on the TD error. Experiments with simple humanoid robots show the validity of the method, and the developmental process, parallel to young children's estimation of their own view during imitation of the observed behavior of the caregiver, is discussed.
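The update rule referred to is the standard temporal-difference (TD) rule; a minimal sketch of a TD(0)-style value update, whose error term would analogously drive the view-estimation parameters, is given below (learning rate and discount factor are assumptions).

```python
# Minimal TD(0)-style update: the same TD error that updates the state value is
# described above as driving the view-estimation parameters.
def td_update(value, reward, next_value, alpha=0.1, gamma=0.95):
    td_error = reward + gamma * next_value - value
    return value + alpha * td_error, td_error

v_s = 0.0
v_s, delta = td_update(v_s, reward=1.0, next_value=0.5)
print(v_s, delta)   # view-model parameters would be nudged in proportion to delta
```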
19 CFR 10.21 - Updating cost data and other information.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 1 2014-04-01 2014-04-01 false Updating cost data and other information. 10.21... Articles Assembled Abroad with United States Components § 10.21 Updating cost data and other information. When a claim for the exemption is predicated on estimated cost data furnished either in advance of or...
19 CFR 10.21 - Updating cost data and other information.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 1 2013-04-01 2013-04-01 false Updating cost data and other information. 10.21... Articles Assembled Abroad with United States Components § 10.21 Updating cost data and other information. When a claim for the exemption is predicated on estimated cost data furnished either in advance of or...
19 CFR 10.21 - Updating cost data and other information.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Updating cost data and other information. 10.21... Articles Assembled Abroad with United States Components § 10.21 Updating cost data and other information. When a claim for the exemption is predicated on estimated cost data furnished either in advance of or...
19 CFR 10.21 - Updating cost data and other information.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 1 2011-04-01 2011-04-01 false Updating cost data and other information. 10.21... Articles Assembled Abroad with United States Components § 10.21 Updating cost data and other information. When a claim for the exemption is predicated on estimated cost data furnished either in advance of or...
Trends in College Spending: 2001-2011. A Delta Data Update
ERIC Educational Resources Information Center
Desrochers, Donna M.; Hurlburt, Steven
2014-01-01
This "Trends in College Spending" update presents national-level estimates for the "Delta Cost Project" data metrics during the period 2001-11. To accelerate the release of more current trend data, however, this update includes only a brief summary of the financial patterns and trends observed during the decade 2001-11, with…
Key algorithms used in GR02: A computer simulation model for predicting tree and stand growth
Garrett A. Hughes; Paul E. Sendak
1985-01-01
GR02 is an individual tree, distance-independent simulation model for predicting tree and stand growth over time. It performs five major functions during each run: (1) updates diameter at breast height, (2) updates total height, (3) estimates mortality, (4) determines regeneration, and (5) updates crown class.
Quaternion normalization in spacecraft attitude determination
NASA Technical Reports Server (NTRS)
Deutschmann, J.; Markley, F. L.; Bar-Itzhack, Itzhack Y.
1993-01-01
Attitude determination of spacecraft usually utilizes vector measurements such as Sun, center of Earth, star, and magnetic field direction to update the quaternion which determines the spacecraft orientation with respect to some reference coordinates in three-dimensional space. These measurements are usually processed by an extended Kalman filter (EKF) which yields an estimate of the attitude quaternion. Two EKF versions for quaternion estimation were presented in the literature; namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). In the multiplicative EKF, it is assumed that the error between the correct quaternion and its a-priori estimate is, by itself, a quaternion that represents the rotation necessary to bring the attitude which corresponds to the a-priori estimate of the quaternion into coincidence with the correct attitude. The EKF basically estimates this quotient quaternion and then the updated quaternion estimate is obtained by the product of the a-priori quaternion estimate and the estimate of the difference quaternion. In the additive EKF, it is assumed that the error between the a-priori quaternion estimate and the correct one is an algebraic difference between two four-tuple elements and thus the EKF is set to estimate this difference. The updated quaternion is then computed by adding the estimate of the difference to the a-priori quaternion estimate. If the quaternion estimate converges to the correct quaternion, then, naturally, the quaternion estimate has unity norm. This fact was utilized in the past to obtain superior filter performance by applying normalization to the filter measurement update of the quaternion. It was observed for the AEKF that when the attitude changed very slowly between measurements, normalization merely resulted in a faster convergence; however, when the attitude changed considerably between measurements, without filter tuning or normalization, the quaternion estimate diverged. However, when the quaternion estimate was normalized, the estimate converged faster and to a lower error than with tuning only. In last year's symposium, we presented three new AEKF normalization techniques and compared them to the brute-force method presented in the literature. The present paper addresses the issue of normalization of the MEKF and examines several MEKF normalization techniques.
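The normalization step itself is the brute-force projection of the updated quaternion back onto the unit sphere; a minimal numpy sketch is shown below (quaternion step only, with hypothetical numbers; the covariance treatments compared in the paper are omitted).

```python
# Brute-force quaternion normalization after an additive-EKF measurement update.
# Only the quaternion projection is shown; covariance handling is omitted.
import numpy as np

def normalize_quaternion(q):
    """Project the updated quaternion estimate back onto the unit sphere."""
    return q / np.linalg.norm(q)

q_apriori = np.array([0.71, 0.02, -0.01, 0.70])        # hypothetical a-priori estimate
dq_estimate = np.array([0.015, -0.004, 0.002, 0.010])  # hypothetical AEKF additive correction
q_updated = normalize_quaternion(q_apriori + dq_estimate)
print(q_updated, np.linalg.norm(q_updated))            # norm is exactly 1 after projection
```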
NASA Astrophysics Data System (ADS)
Bonato, M.; Negrello, M.; Cai, Z.-Y.; De Zotti, G.; Bressan, A.; Lapi, A.; Pozzi, F.; Gruppioni, C.; Danese, L.
2014-11-01
We present new estimates of redshift-dependent luminosity functions of IR lines detectable by SPICA/SAFARI (SPace InfraRed telescope for Cosmology and Astrophysics/SpicA FAR infrared Instrument) and excited both by star formation and by AGN activity. The new estimates improve over previous work by using updated evolutionary models and dealing in a self-consistent way with emission of galaxies as a whole, including both the starburst and the AGN component. New relationships between line and AGN bolometric luminosity have been derived and those between line and IR luminosities of the starburst component have been updated. These ingredients were used to work out predictions for the source counts in 11 mid-/far-IR emission lines partially or entirely excited by AGN activity. We find that the statistics of the emission line detection of galaxies as a whole is mainly determined by the star formation rate, because of the rarity of bright AGNs. We also find that the slope of the line integral number counts is flatter than two, implying that the number of detections at fixed observing time increases more by extending the survey area than by going deeper. We thus propose a wide spectroscopic survey of 1 h integration per field of view over an area of 5 deg² to detect (at 5σ) ∼760 AGNs in [O IV] 25.89 μm - the brightest AGN mid-infrared line - out to z ∼ 2. Pointed observations of strongly lensed or hyperluminous galaxies previously detected by large area surveys such as those by Herschel and by the South Pole Telescope can provide key information on the galaxy-AGN co-evolution out to higher redshifts.
A surprising dynamical mass for V773 Tau B
Boden, Andrew F.; Torres, Guillermo; Duchene, Gaspard; ...
2012-02-10
Here, we report on new high-resolution imaging and spectroscopy on the multiple T Tauri star system V773 Tau over the 2003-2009 period. With these data we derive relative astrometry, photometry between the A and B components, and radial velocity (RV) of the A-subsystem components. Combining these new data with previously published astrometry and RVs, we update the relative A-B orbit model. This updated orbit model, the known system distance, and A-subsystem parameters yield a dynamical mass for the B component for the first time. Remarkably, the derived B dynamical mass is in the range 1.7-3.0 M⊙. This is much higher than previous estimates and suggests that like A, B is also a multiple stellar system. Among these data, spatially resolved spectroscopy provides new insight into the nature of the B component. Similar to A, these near-IR spectra indicate that the dominant source in B is of mid-K spectral type. If B is in fact a multiple star system as suggested by the dynamical mass estimate, the simplest assumption is that B is composed of similar ~1.2 M⊙ pre-main-sequence stars in a close (<1 AU) binary system. This inference is supported by line-shape changes in near-IR spectroscopy of B, tentatively interpreted as changing RV among components in V773 Tau B. Relative photometry indicates that B is highly variable in the near-IR. The most likely explanation for this variability is circum-B material resulting in variable line-of-sight extinction. The distribution of this material must be significantly affected by both the putative B multiplicity and the A-B orbit.
Giorda, Carlo B; Carnà, Paolo; Romeo, Francesco; Costa, Giuseppe; Tartaglino, Barbara; Gnavi, Roberto
2017-05-01
Estimates of the prevalence of hypothyroidism in unselected populations date from the late 1990s. We present an update on the prevalence and incidence of overt hypothyroidism in Piedmont, northwest Italy and examine the association between hypothyroidism and multiple chronic comorbidities. Data were obtained from drug prescription and hospital discharge databases. Individuals who had received at least two levothyroxine prescriptions in 2012 were defined as having hypothyroidism; those who had undergone thyroidectomy or I-131 irradiation in the previous 5 years were defined as having iatrogenic hypothyroidism and those who had either obtained exemption from treatment co-payment or had been discharged from hospital with a chronic comorbidity (diabetes and connective tissue diseases) were identified as having one of these conditions. The overall crude prevalence was 31.1/1000 (2.3/1000 for iatrogenic hypothyroidism) and the overall crude incidence was 7/1000. The average daily dose of thyroxine (122 µg) roughly corresponded to 1.7 µg/kg. There was a strong association between hypothyroidism and diabetes (type 1, type 2 or gestational) and with autoimmune diseases, with the odds ratio ranging from 1.43 (1.02-1.99) for psoriatic arthritis to 4.99 (3.06-8.15) for lupus erythematosus. As compared with previous estimates, the prevalence of hypothyroidism rose by about 35%, driven mainly by non-iatrogenic forms. The increase may be due to either population aging or improved diagnostic capability or both. The frequent co-occurrence of hypothyroidism with other multiple chronic conditions characterizes it more as a comorbidity rather than an isolated chronic disease. © 2017 European Society of Endocrinology.
Connecticut's Children: Still at Risk. 1995 Data Update.
ERIC Educational Resources Information Center
Cunningham, Michelle Doucette
This 1995 update to "Connecticut's Children: Still at Risk" is the second annual report examining how children in the state are faring. The title indicates that Connecticut's children are at tremendous risk of failing to become productive adults. The update does not repeat much of the general information from the previous year's…
NASA Technical Reports Server (NTRS)
Tikidjian, Raffi; Mackey, Ryan
2008-01-01
The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software (see figure) is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in spacecraft tracked and communication demand changes for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and date volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.
Kaiser, Kathryn A.; Shikany, James M.; Keating, Karen D.; Allison, David B.
2014-01-01
We provide arguments to the debate question and update a previous meta-analysis with recently published studies on effects of sugar-sweetened beverages (SSBs) on body weight/composition indices (BWIs). We abstracted data from randomized controlled trials examining effects of consumption of SSBs on BWIs. Six new studies met these criteria: 1) human trials, 2) 3 weeks duration, 3) random assignment to conditions differing only in consumption of SSBs, and 4) including a BWI outcome. Updated meta-analysis of a total of seven studies that added SSBs to persons’ diets showed dose-dependent increases in weight. Updated meta-analysis of eight studies attempting to reduce SSB consumption showed an equivocal effect on BWIs in all randomized subjects. When limited to subjects overweight at baseline, meta-analysis showed a significant effect of roughly 0.25 standard deviations (more weight loss/less weight gain) relative to controls. Evidence to date is equivocal in showing that decreasing SSB consumption will reduce the prevalence of obesity. Although new evidence suggests that an effect may yet be demonstrable in some populations, the integrated effect size estimate remains very small and of equivocal statistical significance. Problems in this research area and suggestions for future research are highlighted. PMID:23742715
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Karion, A.; Mueller, K.; Gourdji, S.; Martin, C.; Whetstone, J. R.
2017-12-01
The National Institute of Standards and Technology (NIST) supports the North-East Corridor Baltimore Washington (NEC-B/W) project and Indianapolis Flux Experiment (INFLUX) aiming to quantify sources of Greenhouse Gas (GHG) emissions as well as their uncertainties. These projects employ different flux estimation methods including top-down inversion approaches. The traditional Bayesian inversion method estimates emission distributions by updating prior information using atmospheric observations of GHGs coupled to an atmospheric transport and dispersion model. The magnitude of the update is dependent upon the observed enhancement along with the assumed errors such as those associated with prior information and the atmospheric transport and dispersion model. These errors are specified within the inversion covariance matrices. The assumed structure and magnitude of the specified errors can have a large impact on the emission estimates from the inversion. The main objective of this work is to build a data-adaptive model for these covariance matrices. We construct a synthetic data experiment using a Kalman Filter inversion framework (Lopez et al., 2017) employing different configurations of the transport and dispersion model and an assumed prior. Unlike previous traditional Bayesian approaches, we estimate posterior emissions using regularized sample covariance matrices associated with prior errors to investigate whether the structure of the matrices helps to better recover our hypothetical true emissions. To incorporate transport model error, we use an ensemble of transport models combined with space-time analytical covariance to construct a covariance that accounts for errors in space and time. A Kalman Filter is then run using these covariances along with Maximum Likelihood Estimates (MLE) of the involved parameters. Preliminary results indicate that specifying spatio-temporally varying errors in the error covariances can improve the flux estimates and uncertainties. We also demonstrate that differences between the modeled and observed meteorology can be used to predict uncertainties associated with atmospheric transport and dispersion modeling, which can help improve the skill of an inversion at urban scales.
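The covariance matrices enter through the standard Bayesian/Kalman update, in which the gain weighs the prior flux error covariance against the model-data mismatch covariance. A generic update with hypothetical dimensions is sketched below; it is illustrative only and is not the NEC-B/W or INFLUX inversion code.

```python
# Generic Bayesian/Kalman flux update (illustrative): x are fluxes, H maps fluxes
# to modeled enhancements, Q is the prior flux error covariance, and R is the
# model-data mismatch (transport + measurement) covariance. All values synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_flux, n_obs = 20, 50
x_prior = np.ones(n_flux)
Q = 0.25 * np.eye(n_flux)                           # prior error covariance (assumed)
H = rng.uniform(0, 0.1, (n_obs, n_flux))            # synthetic sensitivity matrix
R = 0.5 * np.eye(n_obs)                             # transport + measurement error (assumed)
y = H @ (1.3 * np.ones(n_flux)) + rng.normal(0, 0.7, n_obs)   # synthetic observations

K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)        # Kalman gain
x_post = x_prior + K @ (y - H @ x_prior)            # posterior flux estimate
P_post = (np.eye(n_flux) - K @ H) @ Q               # posterior error covariance
print(x_post.mean(), np.trace(P_post) / n_flux)
```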
Adaptive bearing estimation and tracking of multiple targets in a realistic passive sonar scenario
NASA Astrophysics Data System (ADS)
Rajagopal, R.; Challa, Subhash; Faruqi, Farhan A.; Rao, P. R.
1997-06-01
In a realistic passive sonar environment, the received signal consists of multipath arrivals from closely separated moving targets. The signals are contaminated by spatially correlated noise. Differential MUSIC has been proposed to estimate the DOAs in such a scenario. This method estimates the 'noise subspace' in order to estimate the DOAs. However, the 'noise subspace' estimate has to be updated as and when new data become available. To save computational cost, a new adaptive noise subspace estimation algorithm is proposed in this paper. The salient features of the proposed algorithm are: (1) Noise subspace estimation is done by QR decomposition of the difference matrix which is formed from the data covariance matrix. Thus, as compared to standard eigen-decomposition based methods which require O(N³) computations, the proposed method requires only O(N²) computations. (2) The noise subspace is updated by updating the QR decomposition. (3) The proposed algorithm works in a realistic sonar environment. In the second part of the paper, the estimated bearing values are used to track multiple targets. To achieve this, the proposed nonlinear-system/linear-measurement extended Kalman filter is applied. Computer simulation results are also presented to support the theory.
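The QR-based step can be illustrated compactly: an orthonormal basis is obtained from the QR factorization of a (rank-deficient) matrix whose column space is the signal subspace, and the remaining columns of Q span the noise subspace used for bearing estimation. The sketch below uses a noise-free sample covariance as a stand-in for the difference matrix and performs a one-shot factorization rather than the paper's O(N²) incremental update.

```python
# Illustrative QR-based noise-subspace extraction (not the adaptive algorithm):
# the columns of Q beyond the source rank span the orthogonal complement of the
# signal subspace, which a MUSIC-style spectrum would then scan over bearings.
import numpy as np

N, n_sources, snapshots = 8, 2, 500
steer = lambda theta: np.exp(1j * np.pi * np.arange(N) * np.sin(theta))

rng = np.random.default_rng(0)
A = np.column_stack([steer(t) for t in (0.3, -0.5)])      # true bearings (rad)
S = rng.normal(size=(n_sources, snapshots)) + 1j * rng.normal(size=(n_sources, snapshots))
D = (A @ S) @ (A @ S).conj().T / snapshots                # stand-in for the difference matrix

Qmat, _ = np.linalg.qr(D)                                 # full QR here; the paper updates it incrementally
E_noise = Qmat[:, n_sources:]                             # orthonormal noise-subspace basis
print(np.linalg.norm(E_noise.conj().T @ A))               # ~0: steering vectors are orthogonal to it
```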
Impact of the time scale of model sensitivity response on coupled model parameter estimation
NASA Astrophysics Data System (ADS)
Liu, Chang; Zhang, Shaoqing; Li, Shan; Liu, Zhengyu
2017-11-01
That a model has sensitivity responses to parameter uncertainties is a key concept in implementing model parameter estimation using filtering theory and methodology. Depending on the nature of associated physics and characteristic variability of the fluid in a coupled system, the response time scales of a model to parameters can be different, from hourly to decadal. Unlike state estimation, where the update frequency is usually linked with observational frequency, the update frequency for parameter estimation must be associated with the time scale of the model sensitivity response to the parameter being estimated. Here, with a simple coupled model, the impact of model sensitivity response time scales on coupled model parameter estimation is studied. The model includes characteristic synoptic to decadal scales by coupling a long-term varying deep ocean with a slow-varying upper ocean forced by a chaotic atmosphere. Results show that, using the update frequency determined by the model sensitivity response time scale, both the reliability and quality of parameter estimation can be improved significantly, and thus the estimated parameters make the model more consistent with the observation. These simple model results provide a guideline for when real observations are used to optimize the parameters in a coupled general circulation model for improving climate analysis and prediction initialization.
Economic burden of seasonal influenza in the United States.
Putri, Wayan C W S; Muscatello, David J; Stockwell, Melissa S; Newall, Anthony T
2018-05-22
Seasonal influenza is responsible for a large disease and economic burden. Despite the expanding recommendation of influenza vaccination, influenza has continued to be a major public health concern in the United States (U.S.). To evaluate influenza prevention strategies, it is important that policy makers have current estimates of the economic burden of influenza. Our aim was to provide an updated estimate of the average annual economic burden of seasonal influenza in the U.S. population in the presence of vaccination efforts. We evaluated estimates of age-specific influenza-attributable outcomes (ill but not medically attended, office-based outpatient visits, emergency department visits, hospitalizations, and deaths) and associated productivity loss. Health outcome rates were applied to the 2015 U.S. population and multiplied by the relevant estimated unit costs for each outcome. We evaluated both direct healthcare costs and indirect costs (absenteeism from paid employment), reporting results from both a healthcare system and societal perspective. Results were presented in five age groups (<5 years, 5-17 years, 18-49 years, 50-64 years and ≥65 years of age). The estimated average annual total economic burden of influenza to the healthcare system and society was $11.2 billion ($6.3-$25.3 billion). Direct medical costs were estimated to be $3.2 billion ($1.5-$11.7 billion) and indirect costs $8.0 billion ($4.8-$13.6 billion). These total costs were based on the estimated average numbers of (1) ill, non-medically attended patients (21.6 million), (2) office-based outpatient visits (3.7 million), (3) emergency department visits (0.65 million), (4) hospitalizations (247.0 thousand), (5) deaths (36.3 thousand) and (6) days of productivity lost (20.1 million). This study provides an updated estimate of the total economic burden of influenza in the U.S. Although we found a lower total cost than previously estimated, our results confirm that influenza is responsible for a substantial economic burden in the U.S. Copyright © 2018. Published by Elsevier Ltd.
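The burden calculation is essentially outcome counts multiplied by unit costs, plus valued productivity losses; the sketch below reproduces that arithmetic using the outcome counts quoted above together with clearly hypothetical unit costs (the study's cost inputs are not given here).

```python
# Burden arithmetic: outcome counts (from the abstract) times unit costs.
# The unit costs below are placeholders for illustration, NOT the study's inputs.
outcome_counts = {                     # average annual counts
    "ill, non-medically attended": 21.6e6,
    "outpatient visits":            3.7e6,
    "emergency department visits":  0.65e6,
    "hospitalizations":             247.0e3,
    "deaths":                       36.3e3,
}
unit_cost = {                          # hypothetical direct cost per event ($)
    "ill, non-medically attended":     5,
    "outpatient visits":             150,
    "emergency department visits":   750,
    "hospitalizations":            15000,
    "deaths":                      20000,
}
direct = sum(outcome_counts[k] * unit_cost[k] for k in outcome_counts)
indirect = 20.1e6 * 400                # lost days x hypothetical daily productivity value
print(f"direct ~ ${direct/1e9:.1f}B, indirect ~ ${indirect/1e9:.1f}B, "
      f"total ~ ${(direct + indirect)/1e9:.1f}B")
```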
NASA Astrophysics Data System (ADS)
Jansen, Jan T. M.; Shrimpton, Paul C.
2016-07-01
The ImPACT (imaging performance assessment of CT scanners) CT patient dosimetry calculator is still used world-wide to estimate organ and effective doses (E) for computed tomography (CT) examinations, although the tool is based on Monte Carlo calculations reflecting practice in the early 1990’s. Subsequent developments in CT scanners, definitions of E, anthropomorphic phantoms, computers and radiation transport codes, have all fuelled an urgent need for updated organ dose conversion factors for contemporary CT. A new system for such simulations has been developed and satisfactorily tested. Benchmark comparisons of normalised organ doses presently derived for three old scanners (General Electric 9800, Philips Tomoscan LX and Siemens Somatom DRH) are within 5% of published values. Moreover, calculated normalised values of CT Dose Index for these scanners are in reasonable agreement (within measurement and computational uncertainties of ±6% and ±1%, respectively) with reported standard measurements. Organ dose coefficients calculated for a contemporary CT scanner (Siemens Somatom Sensation 16) demonstrate potential deviations by up to around 30% from the surrogate values presently assumed (through a scanner matching process) when using the ImPACT CT Dosimetry tool for newer scanners. Also, illustrative estimates of E for some typical examinations and a range of anthropomorphic phantoms demonstrate the significant differences (by some 10’s of percent) that can arise when changing from the previously adopted stylised mathematical phantom to the voxel phantoms presently recommended by the International Commission on Radiological Protection (ICRP), and when following the 2007 ICRP recommendations (updated from 1990) concerning tissue weighting factors. Further simulations with the validated dosimetry system will provide updated series of dose coefficients for a wide range of contemporary scanners.
Astrometric exoplanet detection with Gaia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perryman, Michael; Hartman, Joel; Bakos, Gáspár Á.
2014-12-10
We provide a revised assessment of the number of exoplanets that should be discovered by Gaia astrometry, extending previous studies to a broader range of spectral types, distances, and magnitudes. Our assessment is based on a large representative sample of host stars from the TRILEGAL Galaxy population synthesis model, recent estimates of the exoplanet frequency distributions as a function of stellar type, and detailed simulation of the Gaia observations using the updated instrument performance and scanning law. We use two approaches to estimate detectable planetary systems: one based on the signal-to-noise ratio of the astrometric signature per field crossing, easily reproducible and allowing comparisons with previous estimates, and a new and more robust metric based on orbit fitting to the simulated satellite data. With some plausible assumptions on planet occurrences, we find that some 21,000 (±6000) high-mass (∼1-15 M_J) long-period planets should be discovered out to distances of ∼500 pc for the nominal 5 yr mission (including at least 1000-1500 around M dwarfs out to 100 pc), rising to some 70,000 (±20,000) for a 10 yr mission. We indicate some of the expected features of this exoplanet population, amongst them ∼25-50 intermediate-period (P ∼ 2-3 yr) transiting systems.
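As a rough illustration of the first approach described above (signal-to-noise of the astrometric signature per field crossing), the Python sketch below evaluates the standard astrometric signature formula and a crude accumulated S/N proxy. The per-field-crossing accuracy, the number of crossings, and the function names are assumptions for illustration only, not values taken from the paper.

def astrometric_signature_uas(m_planet_mjup, m_star_msun, a_au, d_pc):
    """Astrometric signature of the host star in microarcseconds:
    alpha = (m_p / M_*) * a / d, with a in AU and d in pc giving arcseconds."""
    M_JUP_IN_MSUN = 0.0009543  # approximate Jupiter mass in solar masses
    alpha_arcsec = (m_planet_mjup * M_JUP_IN_MSUN / m_star_msun) * a_au / d_pc
    return alpha_arcsec * 1.0e6

def snr_accumulated(alpha_uas, sigma_fov_uas, n_field_crossings):
    """Crude S/N proxy: signature over single-field-crossing accuracy,
    accumulated over the number of field crossings."""
    return (alpha_uas / sigma_fov_uas) * n_field_crossings ** 0.5

# Illustrative case: a 1 M_J planet at 3 AU around a 1 M_Sun star at 100 pc,
# with an assumed 34 uas per-field-crossing accuracy and 70 crossings.
alpha = astrometric_signature_uas(1.0, 1.0, 3.0, 100.0)
print(alpha, snr_accumulated(alpha, 34.0, 70))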
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moerk, Anna-Karin, E-mail: anna-karin.mork@ki.s; Jonsson, Fredrik; Pharsight, a Certara company, St. Louis, MO
2009-11-01
The aim of this study was to derive improved estimates of population variability and uncertainty of physiologically based pharmacokinetic (PBPK) model parameters, especially of those related to the washin-washout behavior of polar volatile substances. This was done by optimizing a previously published washin-washout PBPK model for acetone in a Bayesian framework using Markov chain Monte Carlo simulation. The sensitivity of the model parameters was investigated by creating four different prior sets, where the uncertainty surrounding the population variability of the physiological model parameters was given values corresponding to coefficients of variation of 1%, 25%, 50%, and 100%, respectively. The PBPK model was calibrated to toxicokinetic data from 2 previous studies where 18 volunteers were exposed to 250-550 ppm of acetone at various levels of workload. The updated PBPK model provided a good description of the concentrations in arterial, venous, and exhaled air. The precision of most of the model parameter estimates was improved. New information was particularly gained on the population distribution of the parameters governing the washin-washout effect. The results presented herein provide a good starting point to estimate the target dose of acetone in the working and general populations for risk assessment purposes.
Hróbjartsson, A; Gøtzsche, P C
2004-08-01
It is widely believed that placebo interventions induce powerful effects. We could not confirm this in a systematic review of 114 randomized trials that compared placebo-treated with untreated patients. To study whether a new sample of trials would reproduce our earlier findings, and to update the review. Systematic review of trials that were published since our last search (or not previously identified), and of all available trials. Data was available in 42 out of 52 new trials (3212 patients). The results were similar to our previous findings. The updated review summarizes data from 156 trials (11 737 patients). We found no statistically significant pooled effect in 38 trials with binary outcomes, relative risk 0.95 (95% confidence interval 0.89-1.01). The effect on continuous outcomes decreased with increasing sample size, and there was considerable variation in effect also between large trials; the effect estimates should therefore be interpreted cautiously. If this bias is disregarded, the pooled standardized mean difference in 118 trials with continuous outcomes was -0.24 (-0.31 to -0.17). For trials with patient-reported outcomes the effect was -0.30 (-0.38 to -0.21), but only -0.10 (-0.20 to 0.01) for trials with observer-reported outcomes. Of 10 clinical conditions investigated in three trials or more, placebo had a statistically significant pooled effect only on pain or phobia on continuous scales. We found no evidence of a generally large effect of placebo interventions. A possible small effect on patient-reported continuous outcomes, especially pain, could not be clearly distinguished from bias.
Houseknecht, D.W.; Bird, K.J.; Schuenemeyer, J.H.; Attanasi, E.D.; Garrity, C.P.; Schenk, C.J.; Charpentier, R.R.; Pollastro, R.M.; Cook, T.A.; and Klett, T.R.
2010-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of 896 million barrels of oil (MMBO) and about 53 trillion cubic feet (TCFG) of nonassociated natural gas in conventional, undiscovered accumulations within the National Petroleum Reserve in Alaska and adjacent State waters. The estimated volume of undiscovered oil is significantly lower than estimates released in 2002, owing primarily to recent exploration drilling that revealed an abrupt transition from oil to gas and reduced reservoir quality in the Alpine sandstone 15-20 miles west of the giant Alpine oil field. The National Petroleum Reserve in Alaska (NPRA) has been the focus of oil exploration during the past decade, stimulated by the mid-1990s discovery of the adjacent Alpine field, the largest onshore oil discovery in the United States during the past 25 years. Recent activities in NPRA, including extensive 3-D seismic surveys, six Federal lease sales totaling more than $250 million in bonus bids, and completion of more than 30 exploration wells on Federal and Native lands, indicate more gas than oil and poorer reservoir quality than anticipated in key formations. In the absence of a gas pipeline from northern Alaska, exploration has waned and several petroleum companies have relinquished assets in the NPRA. This fact sheet updates U.S. Geological Survey (USGS) estimates of undiscovered oil and gas in NPRA, based on publicly released information from exploration wells completed during the past decade and on the results of research that documents significant Cenozoic uplift and erosion in NPRA. The results included in this fact sheet, released in October 2010, supersede those of a previous assessment completed by the USGS in 2002.
Haroldson, Mark A.; Schwartz, Charles C.; Thompson, Daniel J.; Bjornlie, Daniel D.; Gunther, Kerry A.; Cain, Steven L.; Tyers, Daniel B.; Frey, Kevin L.; Aber, Bryan C.
2014-01-01
The distribution of the Greater Yellowstone Ecosystem grizzly bear (Ursus arctos) population has expanded into areas unoccupied since the early 20th century. Up-to-date information on the area and extent of this distribution is crucial for federal, state, and tribal wildlife and land managers to make informed decisions regarding grizzly bear management. The most recent estimate of grizzly bear distribution (2004) utilized fixed-kernel density estimators to describe distribution. This method was complex and computationally time consuming and excluded observations of unmarked bears. Our objective was to develop a technique to estimate grizzly bear distribution that would allow for the use of all verified grizzly bear location data, as well as provide the simplicity to be updated more frequently. We placed all verified grizzly bear locations from all sources from 1990 to 2004 and 1990 to 2010 onto a 3-km × 3-km grid and used zonal analysis and ordinary kriging to develop a predicted surface of grizzly bear distribution. We compared the area and extent of the 2004 kriging surface with the previous 2004 effort and evaluated changes in grizzly bear distribution from 2004 to 2010. The 2004 kriging surface was 2.4% smaller than the previous fixed-kernel estimate, but more closely represented the data. Grizzly bear distribution increased 38.3% from 2004 to 2010, with most expansion in the northern and southern regions of the range. This technique can be used to provide a current estimate of grizzly bear distribution for management and conservation applications.
This revised draft document was prepared for U.S. EPA's Office of Research and Development, and describes the data analysis undertaken to update the Municipal Solid Waste (MSW) Landfill section of AP-42. This 2008 update includes the addition of data from 62 landfill gas emission...
The use of remote sensing for updating extensive forest inventories
John F. Kelly
1990-01-01
The Forest Inventory and Analysis unit of the USDA Forest Service Southern Forest Experiment Station (SO-FIA) has the research task of devising an inventory updating system that can be used to provide reliable estimates of forest area, volume, growth, and removals at the State level. These updated inventories must be accomplished within current budgetary restraints....
Apparatus for sensor failure detection and correction in a gas turbine engine control system
NASA Technical Reports Server (NTRS)
Spang, H. A., III; Wanger, R. P. (Inventor)
1981-01-01
A gas turbine engine control system maintains a selected level of engine performance despite the failure or abnormal operation of one or more engine parameter sensors. The control system employs a continuously updated engine model which simulates engine performance and generates signals representing real time estimates of the engine parameter sensor signals. The estimate signals are transmitted to a control computational unit which utilizes them in lieu of the actual engine parameter sensor signals to control the operation of the engine. The estimate signals are also compared with the corresponding actual engine parameter sensor signals and the resulting difference signals are utilized to update the engine model. If a particular difference signal exceeds specific tolerance limits, the difference signal is inhibited from updating the model and a sensor failure indication is provided to the engine operator.
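The failure-detection and substitution logic described in this abstract can be summarized in a short sketch. The Python fragment below is illustrative only: the parameter names, tolerance values, and per-cycle dictionary interface are hypothetical and not taken from the patent, which operates on continuous signals in dedicated hardware.

def process_cycle(measured, estimated, tolerance):
    """One control cycle of the scheme described above: the control unit
    always uses the model estimates; each residual (sensor minus estimate)
    is allowed to update the engine model only if it stays within tolerance,
    otherwise the sensor is flagged as failed and its residual is inhibited."""
    control_inputs = dict(estimated)        # estimate signals drive the control laws
    model_update_residuals = {}
    failed_sensors = []
    for name, z in measured.items():
        residual = z - estimated[name]
        if abs(residual) > tolerance[name]:
            failed_sensors.append(name)     # sensor failure indication to the operator
        else:
            model_update_residuals[name] = residual
    return control_inputs, model_update_residuals, failed_sensors

# Hypothetical example values:
measured  = {"fan_speed": 9120.0, "turbine_temp": 1185.0}
estimated = {"fan_speed": 9100.0, "turbine_temp": 1120.0}
tolerance = {"fan_speed": 50.0,   "turbine_temp": 40.0}
print(process_cycle(measured, estimated, tolerance))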
Morgantown people mover : updated description.
DOT National Transportation Integrated Search
2005-01-01
The Morgantown People Mover is a five-station Automated Group Rapid Transit System (AGRT). This : paper reviews history and operating principles, providing an updated description. Compared to previous : papers, new contributions include: depiction of...
A Robust Adaptive Unscented Kalman Filter for Nonlinear Estimation with Uncertain Noise Covariance.
Zheng, Binqi; Fu, Pengcheng; Li, Baoqing; Yuan, Xiaobing
2018-03-07
The Unscented Kalman filter (UKF) may suffer from performance degradation and even divergence when there is a mismatch between the noise distributions assumed a priori by users and the actual ones in a real nonlinear system. To resolve this problem, this paper proposes a robust adaptive UKF (RAUKF) to improve the accuracy and robustness of state estimation with uncertain noise covariance. More specifically, at each timestep, a standard UKF is implemented first to obtain the state estimates using the newly acquired measurement data. Then an online fault-detection mechanism is adopted to judge whether it is necessary to update the current noise covariance. If necessary, an innovation-based method and a residual-based method are used to calculate the estimates of the current noise covariance of the process and the measurement, respectively. By utilizing a weighting factor, the filter combines the last noise covariance matrices with the estimates as the new noise covariance matrices. Finally, the state estimates are corrected according to the new noise covariance matrices and previous state estimates. Compared with the standard UKF and other adaptive UKF algorithms, RAUKF converges faster to the actual noise covariance and thus achieves a better performance in terms of robustness, accuracy, and computation for nonlinear estimation with uncertain noise covariance, which is demonstrated by the simulation results.
A Robust Adaptive Unscented Kalman Filter for Nonlinear Estimation with Uncertain Noise Covariance
Zheng, Binqi; Yuan, Xiaobing
2018-01-01
The Unscented Kalman filter (UKF) may suffer from performance degradation and even divergence when there is a mismatch between the noise distributions assumed a priori by users and the actual ones in a real nonlinear system. To resolve this problem, this paper proposes a robust adaptive UKF (RAUKF) to improve the accuracy and robustness of state estimation with uncertain noise covariance. More specifically, at each timestep, a standard UKF is implemented first to obtain the state estimates using the newly acquired measurement data. Then an online fault-detection mechanism is adopted to judge whether it is necessary to update the current noise covariance. If necessary, an innovation-based method and a residual-based method are used to calculate the estimates of the current noise covariance of the process and the measurement, respectively. By utilizing a weighting factor, the filter combines the last noise covariance matrices with the estimates as the new noise covariance matrices. Finally, the state estimates are corrected according to the new noise covariance matrices and previous state estimates. Compared with the standard UKF and other adaptive UKF algorithms, RAUKF converges faster to the actual noise covariance and thus achieves a better performance in terms of robustness, accuracy, and computation for nonlinear estimation with uncertain noise covariance, which is demonstrated by the simulation results. PMID:29518960
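To make the covariance-adaptation step of the RAUKF abstract above more concrete, the sketch below blends the previous noise covariance with a windowed, sample-based estimate using a weighting factor. The window-based estimator, the weight value, and the function names are assumptions for illustration; the exact innovation- and residual-based estimators are defined in the paper itself.

import numpy as np

def sample_covariance(vectors):
    """Sample covariance of a window of recent innovations (or residuals)."""
    v = np.atleast_2d(np.asarray(vectors, dtype=float))
    return v.T @ v / v.shape[0]

def blend_covariance(cov_prev, cov_est, weight):
    """Weighted combination of the last noise covariance with the new estimate,
    as described in the abstract (the paper's weighting scheme may differ)."""
    return (1.0 - weight) * cov_prev + weight * cov_est

# Hypothetical example: adapt the measurement-noise covariance R.
R_prev = np.diag([0.04, 0.04])
recent_innovations = [[0.31, -0.22], [0.27, 0.35], [-0.40, 0.29]]
R_new = blend_covariance(R_prev, sample_covariance(recent_innovations), weight=0.3)
print(R_new)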
47 CFR 36.612 - Updating information submitted to the National Exchange Carrier Association.
Code of Federal Regulations, 2013 CFR
2013-10-01
... update the information submitted to the National Exchange Carrier Association (NECA) on July 31st... non-rural telephone company must update the information submitted to NECA on July 31st pursuant to § 36.611 (h) according to the schedule. (1) Submit data covering the last nine months of the previous...
47 CFR 36.612 - Updating information submitted to the National Exchange Carrier Association.
Code of Federal Regulations, 2012 CFR
2012-10-01
... update the information submitted to the National Exchange Carrier Association (NECA) on July 31st... non-rural telephone company must update the information submitted to NECA on July 31st pursuant to § 36.611 (h) according to the schedule. (1) Submit data covering the last nine months of the previous...
Updated Estimates of Glacier Mass Change for Western North America
NASA Astrophysics Data System (ADS)
Menounos, B.; Gardner, A. S.; Howat, I.; Berthier, E.; Dehecq, A.; Noh, M. J.; Pelto, B. M.
2017-12-01
Alpine glaciers are critical components in Western North America's hydrologic cycle. We use varied remotely-sensed datasets to provide updated mass change estimates for Region 2 of the Randolph Glacier Inventory (RGI-02 - all North American glaciers outside of Alaska). Our datasets include: i) aerial laser altimetry surveys completed over many thousands of square kilometers; and ii) multiple terabytes of high resolution optical stereo imagery (WorldView-1 to -3 and Pleiades). Our data from the period 2014-2017 include the majority of glaciers in RGI-02, specifically those ice masses in the Rocky Mountains (US and Canada), Interior Ranges in British Columbia and the Cascade Mountains (Washington). We co-registered and bias corrected the recent surface models to the Shuttle Radar Topography Mission (SRTM) data acquired in February 2000. In British Columbia, our estimates of mass change are within the uncertainty estimates obtained for the period 1985-2000, but estimates from some regions indicate accelerated mass loss. Work is also underway to update glacier mass change estimates for glaciers in Washington and Montana. Finally, we use re-analysis data (ERA-Interim and ERA5) to evaluate the meteorological drivers that explain the temporal and spatial variability of mass change evident in our analysis.
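The geodetic approach sketched above (recent surface models differenced against SRTM) converts elevation change to mass change through a volume-to-mass density assumption. The minimal sketch below uses 850 kg/m^3, a commonly assumed conversion density, and hypothetical cell values; it is not the study's processing chain.

def geodetic_mass_balance_gt_per_yr(dh_m, cell_area_m2, years, density_kg_m3=850.0):
    """Convert per-cell elevation changes (recent DEM minus SRTM, in m) into a
    mass-change rate in gigatonnes per year."""
    volume_change_m3 = sum(dh * cell_area_m2 for dh in dh_m)
    mass_change_kg = volume_change_m3 * density_kg_m3
    return mass_change_kg / years / 1.0e12

# Illustrative three-cell example with 30 m x 30 m cells over 17 years:
print(geodetic_mass_balance_gt_per_yr([-8.5, -12.0, -3.2], 30.0 * 30.0, 17.0))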
A Population Study of Wide-Separation Brown Dwarf Companions to Main Sequence Stars
NASA Technical Reports Server (NTRS)
Smith, Jeffrey J.
2005-01-01
Increased interest in infrared astronomy has opened the frontier to study cooler objects that shed significant light on the formation of planetary systems. Brown dwarf research provides a wealth of information useful for sorting through a myriad of proposed formation theories. Our study combines observational data from 2MASS with rigorous computer simulations to estimate the true population of long-range (greater than 1000 AU) brown dwarf companions in the solar neighborhood (less than 25 pc from Earth). Expanding on Gizis et al. (2001), we have found the margin of error in previous estimates to be significantly underestimated after we included orbit eccentricity, longitude of pericenter, angle of inclination, field star density, and primary and secondary luminosities as parameters influencing the companion systems in observational studies. We apply our simulation results to current L- and T-dwarf catalogs to provide updated estimates on the frequency of wide-separation brown dwarf companions to main sequence stars.
2017 update of the discoveries of nuclides
NASA Astrophysics Data System (ADS)
Thoennessen, M.
The 2017 update of the discovery of nuclide project is presented. 34 new nuclides were observed for the first time in 2017. However, the assignment of six previously identified nuclides had to be retracted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barmin, V. V.; Asratyan, A. E.; Borisov, V. S.
2010-07-15
The data on the charge-exchange reaction K⁺Xe → K⁰pXe', obtained with the bubble chamber DIANA, are reanalyzed using increased statistics and updated selections. Our previous evidence for formation of a narrow pK⁰ resonance with mass near 1538 MeV is confirmed. The statistical significance of the signal reaches some 8σ (6σ) standard deviations when estimated as S/√B (S/√(B+S)). The mass and intrinsic width of the Θ⁺ baryon are measured as m = 1538 ± 2 MeV and Γ = 0.39 ± 0.10 MeV.
The costs of nurse turnover: part 1: an economic perspective.
Jones, Cheryl Bland
2004-12-01
Nurse turnover is costly for healthcare organizations. Administrators and nurse executives need a reliable estimate of nurse turnover costs and the origins of those costs if they are to develop effective measures of reducing nurse turnover and its costs. However, determining how to best capture and quantify nurse turnover costs can be challenging. Part 1 of this series conceptualizes nurse turnover via human capital theory and presents an update of a previously developed method for determining the costs of nurse turnover, the Nursing Turnover Cost Calculation Method. Part 2 (January 2005) presents a recent application of the methodology in an acute care hospital.
In 2006, EPA published an inventory of sources and environmental releases of dioxin-like compounds in the United States. This draft report presents an update and revision to that dioxin source inventory. It also presents updated estimates of environmental releases of dioxin-like...
Update on the epidemiology of gastro-oesophageal reflux disease: a systematic review
El-Serag, Hashem B; Sweet, Stephen; Winchester, Christopher C; Dent, John
2014-01-01
Objective To update the findings of the 2005 systematic review of population-based studies assessing the epidemiology of gastro-oesophageal reflux disease (GERD). Design PubMed and Embase were screened for new references using the original search strings. Studies were required to be population-based, to include ≥200 individuals, to have response rates ≥50% and recall periods <12 months. GERD was defined as heartburn and/or regurgitation on at least 1 day a week, or according to the Montreal definition, or diagnosed by a clinician. Temporal and geographic trends in disease prevalence were examined using a Poisson regression model. Results 16 studies of GERD epidemiology published since the original review were found to be suitable for inclusion (15 reporting prevalence and one reporting incidence), and were added to the 13 prevalence and two incidence studies found previously. The range of GERD prevalence estimates was 18.1%–27.8% in North America, 8.8%–25.9% in Europe, 2.5%–7.8% in East Asia, 8.7%–33.1% in the Middle East, 11.6% in Australia and 23.0% in South America. Incidence per 1000 person-years was approximately 5 in the overall UK and US populations, and 0.84 in paediatric patients aged 1–17 years in the UK. Evidence suggests an increase in GERD prevalence since 1995 (p<0.0001), particularly in North America and East Asia. Conclusions GERD is prevalent worldwide, and disease burden may be increasing. Prevalence estimates show considerable geographic variation, but only East Asia shows estimates consistently lower than 10%. PMID:23853213
Serial and Parallel Processes in Working Memory after Practice
ERIC Educational Resources Information Center
Oberauer, Klaus; Bialkova, Svetlana
2011-01-01
Six young adults practiced for 36 sessions on a working-memory updating task in which 2 digits and 2 spatial positions were continuously updated. Participants either did 1 updating operation at a time, or attempted 1 numerical and 1 spatial operation at the same time. In contrast to previous research using the same paradigm with a single digit and…
Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant
Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa
2013-09-17
System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
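A minimal sketch of the two processing stages named in this abstract (preemptive constraining followed by measurement correction) is given below. The clipping-style constraint, the matrices, and the bounds are illustrative assumptions; the actual constraining logic of the patented system is not reproduced here.

import numpy as np

def preemptive_constrain(x, P, lower, upper):
    """Project the state estimate onto simple bound constraints before the
    measurement step (a crude stand-in for the constraining processor)."""
    return np.clip(x, lower, upper), P

def measurement_correct(x, P, z, H, R):
    """Standard Kalman/EKF measurement update used to correct the constrained
    estimate and covariance with sensed plant output variables."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_corr = x + K @ (z - H @ x)
    P_corr = (np.eye(len(x)) - K @ H) @ P
    return x_corr, P_corr

# Hypothetical two-state example (e.g. a temperature and a flow variable):
x, P = np.array([510.0, -0.2]), np.diag([25.0, 4.0])
x, P = preemptive_constrain(x, P, lower=[0.0, 0.0], upper=[2000.0, 50.0])
x, P = measurement_correct(x, P, z=np.array([498.0]),
                           H=np.array([[1.0, 0.0]]), R=np.array([[9.0]]))
print(x, P)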
Al-Herz, Waleed; Bousfiha, Aziz; Casanova, Jean-Laurent; Chapel, Helen; Conley, Mary Ellen; Cunningham-Rundles, Charlotte; Etzioni, Amos; Fischer, Alain; Franco, Jose Luis; Geha, Raif S.; Hammarström, Lennart; Nonoyama, Shigeaki; Notarangelo, Luigi Daniele; Ochs, Hans Dieter; Puck, Jennifer M.; Roifman, Chaim M.; Seger, Reinhard; Tang, Mimi L. K.
2011-01-01
We report the updated classification of primary immunodeficiency diseases, compiled by the ad hoc Expert Committee of the International Union of Immunological Societies. As compared to the previous edition, more than 15 novel disease entities have been added in the updated version. For each disorders, the key clinical and laboratory features are provided. This updated classification is meant to help in the diagnostic approach to patients with these diseases. PMID:22566844
NASA Astrophysics Data System (ADS)
Baatz, D.; Kurtz, W.; Hendricks Franssen, H. J.; Vereecken, H.; Kollet, S. J.
2017-12-01
Parameter estimation for physically based, distributed hydrological models becomes increasingly challenging with increasing model complexity. The number of parameters is usually large and the number of observations relatively small, which results in large uncertainties. Catchment tomography presents a moving transmitter-receiver concept to estimate spatially distributed hydrological parameters. In this concept, precipitation, highly variable in time and space, serves as a moving transmitter. In response to precipitation, runoff and stream discharge are generated along different paths and time scales, depending on surface and subsurface flow properties. Stream water levels are thus an integrated signal of upstream parameters, measured by stream gauges which serve as the receivers. These stream water level observations are assimilated into a distributed hydrological model, which is forced with high resolution, radar-based precipitation estimates. Applying a joint state-parameter update with the Ensemble Kalman Filter, the spatially distributed Manning's roughness coefficient and saturated hydraulic conductivity are estimated jointly. The sequential data assimilation continuously integrates new information into the parameter estimation problem, especially during precipitation events. Every precipitation event constrains the possible parameter space. In the approach, forward simulations are performed with ParFlow, a variably saturated subsurface and overland flow model. ParFlow is coupled to the Parallel Data Assimilation Framework for the data assimilation and the joint state-parameter update. In synthetic, 3-dimensional experiments including surface and subsurface flow, hydraulic conductivity and the Manning's coefficient are efficiently estimated with the catchment tomography approach. A joint update of the Manning's coefficient and hydraulic conductivity tends to improve the parameter estimation compared to a single parameter update, especially in cases of biased initial parameter ensembles. The computational experiments additionally show to which degrees of spatial heterogeneity and of subsurface flow parameter uncertainty the Manning's coefficient and hydraulic conductivity can still be estimated efficiently.
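To make the joint state-parameter update concrete, the sketch below applies a stochastic ensemble Kalman analysis to an augmented vector of model states and parameters. Array shapes, the perturbed-observation variant, and the function name are illustrative assumptions; the study itself couples ParFlow to the Parallel Data Assimilation Framework rather than using standalone code like this.

import numpy as np

def enkf_joint_update(states, params, sim_obs, obs, obs_err_std, rng=np.random):
    """Stochastic EnKF analysis on the augmented ensemble [states; params].
    states: (n_state, N), params: (n_param, N), sim_obs: (n_obs, N)."""
    obs_err_std = np.asarray(obs_err_std, dtype=float)
    A = np.vstack([states, params])                        # augmented ensemble
    N = A.shape[1]
    A_anom = A - A.mean(axis=1, keepdims=True)
    Y_anom = sim_obs - sim_obs.mean(axis=1, keepdims=True)
    P_ay = A_anom @ Y_anom.T / (N - 1)                     # cross-covariance
    P_yy = Y_anom @ Y_anom.T / (N - 1) + np.diag(obs_err_std ** 2)
    K = P_ay @ np.linalg.inv(P_yy)                         # Kalman gain
    obs_pert = obs[:, None] + obs_err_std[:, None] * rng.standard_normal(sim_obs.shape)
    A_new = A + K @ (obs_pert - sim_obs)                   # joint state-parameter update
    return A_new[: states.shape[0]], A_new[states.shape[0]:]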
An Update of Sea Level Rise in the northwestern part of the Arabian Gulf
NASA Astrophysics Data System (ADS)
Alothman, Abdulaziz; Bos, Machiel; Fernandes, Rui
2017-04-01
Relative sea level variations in the northwestern part of the Arabian Gulf have been estimated in the past using no more than 10 to 15 years of observations. In Alothman et al. (2014), we have almost doubled the period to 28.7 years by examining all available tide gauge data in the area and constructing a mean gauge time-series from seven coastal tide gauges. We found for the period 1979-2007 a relative sea level rise of about 2 mm/yr, which corresponds to an absolute sea level rise of about 1.5 mm/yr based on the vertical displacement of GNSS stations in the region. By taking into account the temporal correlations we concluded that previously published results underestimate the true sea level rate error in this area by a factor of 5-10. In this work, we discuss and update the methodology and results from Alothman et al. (2014), particularly by checking and extending the GNSS solutions. Since 3 of the 6 GPS stations used only started observing at the end of 2011, the longer time series have now significantly lower uncertainties in the estimated vertical rate. In addition, we compare our results with GRACE-derived ocean bottom pressure time series, which are a good proxy of the changes in water mass in this area over time.
Breivik, Knut; Sweetman, Andy; Pacyna, Jozef M; Jones, Kevin C
2007-05-15
Previously published estimates of the global production, consumption and atmospheric emissions of 22 individual PCB congeners [Breivik K, Sweetman A, Pacyna JM, Jones KC. Towards a global historical emission inventory for selected PCB congeners - a mass balance approach. 1. Global production and consumption. Sci Total Environ 2002a; 290: 181-198; Breivik K, Sweetman A, Pacyna JM, Jones KC. Towards a global historical emission inventory for selected PCB congeners - a mass balance approach. 2. Emissions. Sci Total Environ 2002b; 290: 181-198.] have provided useful information for later studies attempting to interpret contaminant levels in remote areas as well as in the global environment. As a result of the need for more contemporary emission data (following the year 2000), an update of this emission database is presented. This exercise takes into account new information on PCB production in Poland, as well as new data on the chemical composition of various technical mixtures for which less information had been available. The methodology to estimate temporal trends of PCB emissions associated with various types of PCB usage is improved. Projected emissions up to year 2100 are presented to facilitate predictions of future environmental exposure. The national emission data for each of the 114 countries considered is spatially resolved on a 1° × 1° grid for each congener and year, using population density as a surrogate.
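The final step described above, spatial disaggregation of national totals using population density as a surrogate, reduces to a proportional allocation. A minimal sketch with hypothetical cell identifiers and values:

def grid_emissions(national_emission, population_by_cell):
    """Allocate a national emission total to 1-degree grid cells in
    proportion to each cell's share of the national population."""
    total_pop = float(sum(population_by_cell.values()))
    return {cell: national_emission * pop / total_pop
            for cell, pop in population_by_cell.items()}

# Hypothetical example for one congener, one year and one country (kg):
print(grid_emissions(1200.0, {"cell_A": 5.0e6, "cell_B": 1.5e6, "cell_C": 0.5e6}))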
Updating of working memory: lingering bindings.
Oberauer, Klaus; Vockenberg, Kerstin
2009-05-01
Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them by new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.
Chemical Data Reporting - Previously Collected Data
EPA now refers to the Inventory Update Reporting (IUR) rule as the Chemical Data Reporting (CDR) Rule. This change was effective with the publication of the Inventory Update Reporting Modifications; Chemical Data Reporting Final Rule in August 2011.
Liu, Yan-Jun; Tang, Li; Tong, Shaocheng; Chen, C L Philip; Li, Dong-Juan
2015-01-01
Based on the neural network (NN) approximator, an online reinforcement learning algorithm is proposed for a class of affine multiple input and multiple output (MIMO) nonlinear discrete-time systems with unknown functions and disturbances. In the design procedure, two networks are provided where one is an action network to generate an optimal control signal and the other is a critic network to approximate the cost function. An optimal control signal and adaptation laws can be generated based on two NNs. In the previous approaches, the weights of critic and action networks are updated based on the gradient descent rule and the estimations of optimal weight vectors are directly adjusted in the design. Consequently, compared with the existing results, the main contributions of this paper are: 1) only two parameters are needed to be adjusted, and thus the number of the adaptation laws is smaller than the previous results and 2) the updating parameters do not depend on the number of the subsystems for MIMO systems and the tuning rules are replaced by adjusting the norms on optimal weight vectors in both action and critic networks. It is proven that the tracking errors, the adaptation laws, and the control inputs are uniformly bounded using Lyapunov analysis method. The simulation examples are employed to illustrate the effectiveness of the proposed algorithm.
Garwood, Candice L; Korkis, Bianca; Grande, Domenico; Hanni, Claudia; Morin, Amy; Moser, Lynette R
2017-06-01
In 2011 we reviewed clinical updates and controversies surrounding anticoagulation bridge therapy in patients with atrial fibrillation (AF). Since then, options for oral anticoagulation have expanded with the addition of four direct oral anticoagulant (DOAC) agents available in the United States. Nonetheless, vitamin K antagonist (VKA) therapy continues to be the treatment of choice for patients who are poor candidates for a DOAC and for whom bridge therapy remains a therapeutic dilemma. This literature review identifies evidence and guideline and consensus statements from the last 5 years to provide updated recommendations and insight into bridge therapy for patients using a VKA for AF. Since our last review, at least four major international guidelines have been updated plus a new consensus document addressing bridge therapy was released. Prospective trials and one randomized controlled trial have provided guidance for perioperative bridge therapy. The clinical trial data showed that bridging with heparin is associated with a significant bleeding risk compared with not bridging; furthermore, data suggested that actual perioperative thromboembolic risk may be lower than previously estimated. Notably, patients at high risk for stroke have not been adequately represented. These findings highlight the importance of assessing thrombosis and bleeding risk before making bridging decisions. Thrombosis and bleeding risk tools have emerged to facilitate this assessment and have been incorporated into guideline recommendations. Results from ongoing trials are expected to provide more guidance on safe and effective perioperative management approaches for patients at high risk for stroke. © 2017 Pharmacotherapy Publications, Inc.
An updated geospatial liquefaction model for global application
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.
2017-01-01
We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
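Models of this kind are logistic regressions on geospatial proxies, so evaluating a liquefaction probability at a location is a single weighted sum passed through a logistic function. The sketch below shows only the general form; the coefficients and the exact predictor transforms are placeholders, not the published model, which should be taken from the paper itself.

import math

# Placeholder coefficients -- NOT the published values.
COEF = {"intercept": 8.8, "ln_pgv": 0.33, "ln_vs30": -1.9,
        "wtd": -0.02, "dist_water": -0.0005, "precip": 0.0005}

def liquefaction_probability(pgv_cm_s, vs30_m_s, wtd_m, dist_water_km, precip_mm):
    """Logistic model of liquefaction probability from geospatial proxies."""
    x = (COEF["intercept"]
         + COEF["ln_pgv"] * math.log(pgv_cm_s)
         + COEF["ln_vs30"] * math.log(vs30_m_s)
         + COEF["wtd"] * wtd_m
         + COEF["dist_water"] * dist_water_km
         + COEF["precip"] * precip_mm)
    return 1.0 / (1.0 + math.exp(-x))

print(liquefaction_probability(30.0, 250.0, 3.0, 1.5, 900.0))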
Rodriguez-Ortiz, Carlos J.; De la Cruz, Vanesa; Gutiérrez, Ranier; Bermudez-Rattoni, Federico
2005-01-01
Consolidation theory proposes that through the synthesis of new proteins recently acquired memories are strengthened over time into a stable long-term memory trace. However, evidence has accumulated suggesting that retrieved memory is susceptible to disruption, seeming to consolidate again (reconsolidate) to be retained in long-term storage. Here we show that intracortical blockade of protein synthesis in the gustatory cortex after retrieval of taste-recognition memory disrupts previously consolidated memory to a restricted degree only if the experience is updated. Our results suggest that retrieved memory can be modified as part of a mechanism for incorporating updated information into previously consolidated memory. PMID:16166395
Depth estimation and camera calibration of a focused plenoptic camera for visual odometry
NASA Astrophysics Data System (ADS)
Zeller, Niclas; Quint, Franz; Stilla, Uwe
2016-08-01
This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model-based methods, which define the relation of the virtual depth, which has been estimated based on the light-field image, and the metric object distance. These two methods are compared to a well-known curve-fitting approach. Both model-based methods show significant advantages compared to the curve-fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized totally focused, which enhances the search for stereo correspondences. In contrast to monocular visual odometry approaches, due to the calibration of the individual depth maps, the scale of the scene can be observed. Furthermore, due to the light-field information better tracking capabilities compared to the monocular case can be expected. As a result, the depth information gained by the plenoptic camera based visual odometry algorithm proposed in this paper has superior accuracy and reliability compared to the depth estimated from a single light-field image.
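The "Kalman-like" refinement of the per-pixel virtual depth described above amounts to inverse-variance fusion of successive depth hypotheses. A minimal sketch (function name assumed):

def fuse_depth(depth_a, var_a, depth_b, var_b):
    """Fuse two virtual-depth hypotheses for the same pixel by inverse-variance
    weighting; returns the updated depth and its reduced variance."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    depth = var * (depth_a / var_a + depth_b / var_b)
    return depth, var

# Example: a second micro-image observation tightens the estimate.
print(fuse_depth(4.0, 0.25, 4.4, 0.50))   # -> (approximately 4.13, 0.167)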
Discrete-time state estimation for stochastic polynomial systems over polynomial observations
NASA Astrophysics Data System (ADS)
Hernandez-Gonzalez, M.; Basin, M.; Stepanov, O.
2018-07-01
This paper presents a solution to the mean-square state estimation problem for stochastic nonlinear polynomial systems over polynomial observations corrupted by additive white Gaussian noise. The solution is given in two steps: (a) computing the time-update equations and (b) computing the measurement-update equations for the state estimate and error covariance matrix. A closed form of this filter is obtained by expressing conditional expectations of polynomial terms as functions of the state estimate and error covariance. As a particular case, the mean-square filtering equations are derived for a third-degree polynomial system with second-degree polynomial measurements. Numerical simulations show the effectiveness of the proposed filter compared to the extended Kalman filter.
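To illustrate the two-step structure only (not the closed-form mean-square filter derived in the paper), the sketch below applies a first-order linearization to a scalar third-degree polynomial state equation with a second-degree polynomial observation; all coefficients are assumed.

# Assumed scalar model: x_{k+1} = a*x + b*x^3 + w,   z_k = c*x + d*x^2 + v
a, b, c, d = 0.9, -0.05, 1.0, 0.3
q, r = 0.01, 0.04                  # process and measurement noise variances

def time_update(x, p):
    x_pred = a * x + b * x ** 3
    F = a + 3.0 * b * x ** 2       # derivative of the state polynomial
    return x_pred, F * p * F + q

def measurement_update(x, p, z):
    z_pred = c * x + d * x ** 2
    H = c + 2.0 * d * x            # derivative of the observation polynomial
    k = p * H / (H * p * H + r)
    return x + k * (z - z_pred), (1.0 - k * H) * p

x, p = 0.5, 1.0
x, p = time_update(x, p)
x, p = measurement_update(x, p, z=0.55)
print(x, p)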
Vinyl Chloride: A Case Study of Data Suppression and Misrepresentation
Sass, Jennifer Beth; Castleman, Barry; Wallinga, David
2005-01-01
When the U.S. Environmental Protection Agency (EPA) finalized its 2000 update of the toxicological effects of vinyl chloride (VC), it was concerned with two issues: the classification of VC as a carcinogen and the numerical estimate of its potency. In this commentary we describe how the U.S. EPA review of VC toxicology, which was drafted with substantial input from the chemical industry, weakened safeguards on both points. First, the assessment downplays risks from all cancer sites other than the liver. Second, the estimate of cancer potency was reduced 10-fold from values previously used for environmental decision making, a finding that reduces the cost and extent of pollution reduction and cleanup measures. We suggest that this assessment reflects discredited scientific practices and recommend that the U.S. EPA reverse its trend toward ever-increasing collaborations with the regulated industries when generating scientific reviews and risk assessments. PMID:16002366
Model Package Report: Hanford Soil Inventory Model SIM v.2 Build 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Will E.; Zaher, U.; Mehta, S.
The Hanford Soil Inventory Model (SIM) is a tool for the estimation of inventory of contaminants that were released to soil from liquid discharges during the U.S. Department of Energy’s Hanford Site operations. This model package report documents the construction and development of a second version of SIM (SIM-v2) to support the needs of Hanford Site Composite Analysis. The SIM-v2 is implemented using GoldSim Pro® software with a new model architecture that preserves the uncertainty in inventory estimates while reducing the computational burden (compared to the previous version) and allowing more traceability and transparency in calculation methodology. The calculation architecture is designed in such a manner that future updates to the waste stream composition along with addition or deletion of waste sites can be performed with relative ease. In addition, the new computational platform allows for continued hardware upgrade.
FORCARB2: An updated version of the U.S. Forest Carbon Budget Model
Linda S. Heath; Michael C. Nichols; James E. Smith; John R. Mills
2010-01-01
FORCARB2, an updated version of the U.S. FORest CARBon Budget Model (FORCARB), produces estimates of carbon stocks and stock changes for forest ecosystems and forest products at 5-year intervals. FORCARB2 includes a new methodology for carbon in harvested wood products, updated initial inventory data, a revised algorithm for dead wood, and now includes public forest...
Lee, A J; Cunningham, A P; Kuchenbaecker, K B; Mavaddat, N; Easton, D F; Antoniou, A C
2014-01-01
Background: The Breast and Ovarian Analysis of Disease Incidence and Carrier Estimation Algorithm (BOADICEA) is a risk prediction model that is used to compute probabilities of carrying mutations in the high-risk breast and ovarian cancer susceptibility genes BRCA1 and BRCA2, and to estimate the future risks of developing breast or ovarian cancer. In this paper, we describe updates to the BOADICEA model that extend its capabilities, make it easier to use in a clinical setting and yield more accurate predictions. Methods: We describe: (1) updates to the statistical model to include cancer incidences from multiple populations; (2) updates to the distributions of tumour pathology characteristics using new data on BRCA1 and BRCA2 mutation carriers and women with breast cancer from the general population; (3) improvements to the computational efficiency of the algorithm so that risk calculations now run substantially faster; and (4) updates to the model's web interface to accommodate these new features and to make it easier to use in a clinical setting. Results: We present results derived using the updated model, and demonstrate that the changes have a significant impact on risk predictions. Conclusion: All updates have been implemented in a new version of the BOADICEA web interface that is now available for general use: http://ccge.medschl.cam.ac.uk/boadicea/. PMID:24346285
NASA Technical Reports Server (NTRS)
Knox, C. E.
1978-01-01
Navigation error data from these flights are presented in a format utilizing three independent axes - horizontal, vertical, and time. The navigation position estimate error term and the autopilot flight technical error term are combined to form the total navigation error in each axis. This method of error presentation allows comparisons to be made between other 2-, 3-, or 4-D navigation systems and allows experimental or theoretical determination of the navigation error terms. Position estimate error data are presented with the navigation system position estimate based on dual DME radio updates that are smoothed with inertial velocities, dual DME radio updates that are smoothed with true airspeed and magnetic heading, and inertial velocity updates only. The normal mode of navigation with dual DME updates that are smoothed with inertial velocities resulted in a mean error of 390 m with a standard deviation of 150 m in the horizontal axis; a mean error of 1.5 m low with a standard deviation of less than 11 m in the vertical axis; and a mean error as low as 252 m with a standard deviation of 123 m in the time axis.
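A small sketch of how the per-axis error terms can be combined and summarized is given below. Summing the signed position-estimate error and flight technical error is one common convention for the total navigation error; the report's exact definition may differ, and the statistics shown are simply the mean and standard deviation quoted in the text. The sample values are hypothetical.

import numpy as np

def total_navigation_error(position_estimate_error, flight_technical_error):
    """Combine the two signed error terms for one axis, sample by sample."""
    return np.asarray(position_estimate_error) + np.asarray(flight_technical_error)

def summarize(errors):
    """Mean and standard deviation, as tabulated per axis."""
    e = np.asarray(errors, dtype=float)
    return e.mean(), e.std(ddof=1)

# Hypothetical horizontal-axis samples in meters:
total = total_navigation_error([350.0, 420.0, 390.0], [45.0, -30.0, 10.0])
print(summarize(total))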
NASA Astrophysics Data System (ADS)
Itahashi, S.; Uno, I.; Irie, H.; Kurokawa, J.; Ohara, T.
2013-04-01
Satellite observations of the tropospheric NO2 vertical column density (VCD) are closely correlated to surface NOx emissions and can thus be used to estimate the latter. In this study, the NO2 VCDs simulated by a regional chemical transport model with data from the updated Regional Emission inventory in ASia (REAS) version 2.1 were validated by comparison with multi-satellite observations (GOME, SCIAMACHY, GOME-2, and OMI) between 2000 and 2010. Rapid growth in NO2 VCD driven by expansion of anthropogenic NOx emissions was revealed above the central eastern China region, except during the economic downturn. In contrast, slightly decreasing trends were captured above Japan. The modeled NO2 VCDs using the updated REAS emissions reasonably reproduced the annual trends observed by the multiple satellites, suggesting that the NOx emissions growth rate estimated by the updated inventory is robust. On the basis of the close linear relationship of modeled NO2 VCD, observed NO2 VCD, and anthropogenic NOx emissions, the NOx emissions in 2009 and 2010 were estimated. It was estimated that the NOx emissions from anthropogenic sources in China more than doubled between 2000 and 2010, reflecting the strong growth of anthropogenic emissions in China with the rapid recovery from the economic downturn of late 2008 to mid-2009.
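The top-down estimation step described in the last two sentences can be sketched as a proportional adjustment of the bottom-up emission total by the ratio of observed to modeled NO2 column, under the stated assumption of a linear relationship; the function name and numbers below are illustrative only.

def estimate_emissions_top_down(bottom_up_emission, vcd_modeled, vcd_observed):
    """Scale the inventory-based emission estimate by the observed/modeled
    NO2 vertical column density ratio (assumes a proportional response)."""
    return bottom_up_emission * (vcd_observed / vcd_modeled)

# Illustrative numbers (Tg N/yr and 10^15 molecules/cm^2), not from the paper:
print(estimate_emissions_top_down(8.0, 5.2, 6.0))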
Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices
NASA Astrophysics Data System (ADS)
Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.
2017-12-01
The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to more mechanistically and consistently represent the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates to account for factors like nitrite (NO2-) accumulation not addressed in EPIC, will also be made. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions found to be most probable under high NO2- availability will be based on observed ratios of HONO to NO emissions under different soil moistures, pH and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed. Our updated framework will help to predict the diurnal and daily variability of different reactive N emissions (NO, HONO, N2O) with soil temperature, moisture and N inputs.
Utilizing Flight Data to Update Aeroelastic Stability Estimates
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
1997-01-01
Stability analysis of high performance aircraft must account for errors in the system model. A method for computing flutter margins that incorporates flight data has been developed using robust stability theory. This paper considers applying this method to update flutter margins during a post-flight or on-line analysis. Areas of modeling uncertainty that arise when using flight data with this method are investigated. The amount of conservatism in the resulting flutter margins depends on the flight data sets used to update the model. Post-flight updates of flutter margins for an F/A-18 are presented along with a simulation of on-line updates during a flight test.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rockhold, Mark L.; Zhang, Z. F.; Meyer, Philip D.
2015-02-28
Current plans for treatment and disposal of immobilized low-activity waste (ILAW) from Hanford’s underground waste storage tanks include vitrification and storage of the glass waste form in a near-surface disposal facility. This Integrated Disposal Facility (IDF) is located in the 200 East Area of the Hanford Central Plateau. Performance assessment (PA) of the IDF requires numerical modeling of subsurface flow and reactive transport processes over very long periods (thousands of years). The models used to predict facility performance require parameters describing various physical, hydraulic, and transport properties. This report provides updated estimates of physical, hydraulic, and transport properties and parameters for both near- and far-field materials, intended for use in future IDF PA modeling efforts. Previous work on physical and hydraulic property characterization for earlier IDF PA analyses is reviewed and summarized. For near-field materials, portions of this document and parameter estimates are taken from an earlier data package. For far-field materials, a critical review is provided of methodologies used in previous data packages. Alternative methods are described and associated parameters are provided.
Evaluation of Individuals With Pulmonary Nodules: When Is It Lung Cancer?
Donington, Jessica; Lynch, William R.; Mazzone, Peter J.; Midthun, David E.; Naidich, David P.; Wiener, Renda Soylemez
2013-01-01
Objectives: The objective of this article is to update previous evidence-based recommendations for evaluation and management of individuals with solid pulmonary nodules and to generate new recommendations for those with nonsolid nodules. Methods: We updated prior literature reviews, synthesized evidence, and formulated recommendations by using the methods described in the “Methodology for Development of Guidelines for Lung Cancer” in the American College of Chest Physicians Lung Cancer Guidelines, 3rd ed. Results: We formulated recommendations for evaluating solid pulmonary nodules that measure > 8 mm in diameter, solid nodules that measure ≤ 8 mm in diameter, and subsolid nodules. The recommendations stress the value of assessing the probability of malignancy, the utility of imaging tests, the need to weigh the benefits and harms of different management strategies (nonsurgical biopsy, surgical resection, and surveillance with chest CT imaging), and the importance of eliciting patient preferences. Conclusions: Individuals with pulmonary nodules should be evaluated and managed by estimating the probability of malignancy, performing imaging tests to better characterize the lesions, evaluating the risks associated with various management alternatives, and eliciting their preferences for management. PMID:23649456
Improved meteorology from an updated WRF/CMAQ modeling ...
Realistic vegetation characteristics and phenology from the Moderate Resolution Imaging Spectroradiometer (MODIS) products improve the simulation for the meteorology and air quality modeling system WRF/CMAQ (Weather Research and Forecasting model and Community Multiscale Air Quality model) that employs the Pleim-Xiu land surface model (PX LSM). Recently, PX LSM WRF/CMAQ has been updated in vegetation, soil, and boundary layer processes resulting in improved 2 m temperature (T) and mixing ratio (Q), 10 m wind speed, and surface ozone simulations across the domain compared to the previous version for a period around August 2006. Yearlong meteorology simulations with the updated system demonstrate that MODIS input helps reduce bias of the 2 m Q estimation during the growing season from April to September. Improvements follow the green-up in the southeast from April and move toward the west and north through August. From October to March, MODIS input does not have much influence on the system because vegetation is not as active. The greatest effects of MODIS input include more accurate phenology, better representation of leaf area index (LAI) for various forest ecosystems and agricultural areas, and realistically sparse vegetation coverage in the western drylands. Despite the improved meteorology, MODIS input causes higher bias for the surface O3 simulation in April, August, and October in areas where MODIS LAI is much less than the base LAI. Thus, improvement
Planck 2013 results. XXXII. The updated Planck catalogue of Sunyaev-Zeldovich sources
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Aussel, H.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Barrena, R.; Bartelmann, M.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bikmaev, I.; Bobin, J.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chen, X.; Chiang, H. C.; Chiang, L.-Y.; Chon, G.; Christensen, P. R.; Churazov, E.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Dahle, H.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Démoclès, J.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Feroz, F.; Ferragamo, A.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Fromenteau, S.; Galeotta, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; Giardino, G.; Gilfanov, M.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Grainge, K. J. B.; Gratton, S.; Gregorio, A.; Groeneboom, N., E.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D.; Hempel, A.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Hurley-Walker, N.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Khamitov, I.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Li, C.; Liddle, A.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; MacTavish, C. J.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Mei, S.; Meinhold, P. R.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nastasi, A.; Nati, F.; Natoli, P.; Nesvadba, N. P. H.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Olamaie, M.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrott, Y. C.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rumsey, C.; Rusholme, B.; Sandri, M.; Santos, D.; Saunders, R. D. E.; Savini, G.; Schammel, M. P.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Shimwell, T. W.; Spencer, L. 
D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Streblyanska, A.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tramonte, D.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.
2015-09-01
We update the all-sky Planck catalogue of 1227 clusters and cluster candidates (PSZ1) published in March 2013, derived from detections of the Sunyaev-Zeldovich (SZ) effect using the first 15.5 months of Planck satellite observations. As an addendum, we deliver an updated version of the PSZ1 catalogue, reporting the further confirmation of 86 Planck-discovered clusters. In total, the PSZ1 now contains 947 confirmed clusters, of which 214 were confirmed as newly discovered clusters through follow-up observations undertaken by the Planck Collaboration. The updated PSZ1 contains redshifts for 913 systems, of which 736 (~ 80.6%) are spectroscopic, and associated mass estimates derived from the Yz mass proxy. We also provide a new SZ quality flag for the remaining 280 candidates. This flag was derived from a novel artificial neural-network classification of the SZ signal. Based on this assessment, the purity of the updated PSZ1 catalogue is estimated to be 94%. In this release, we provide the full updated catalogue and an additional readme file with further information on the Planck SZ detections. The catalogue is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/581/A14
Metabolic Power in Team Sports - Part 2: Aerobic and Anaerobic Energy Yields.
Osgnach, Cristian; di Prampero, Pietro Enrico
2018-06-14
A previous approach to estimating the time course of instantaneous metabolic power and O2 consumption in team sports has been updated to also assess energy expenditure against air resistance and to identify walking and running separately. Whole-match energy expenditure turned out ≈14% smaller than previously obtained, with the fraction against air resistance amounting to ≈2% of the total. Estimated net O2 consumption and overall energy expenditure are fairly close to those measured by means of a portable metabolic cart; the average difference, after a 45 min exercise period of variable intensity and mode, amounted to ≈10%. Aerobic and anaerobic energy yields, metabolic power, energy expenditure and duration of High (HI) and Low (LI) intensity bouts can also be estimated. Indeed, data on 497 soccer players during the 2014/2015 Italian "Serie A" show that the number of HI efforts decreased from the first to the last 15-min periods of the match, without substantial changes in mean metabolic power (≈22 W·kg^-1) and duration (≈6.5 s). On the contrary, mean metabolic power of the LI bouts decreased (5.8 to 4.8 W·kg^-1), mainly because of a longer duration thereof, thus underscoring the need for longer recovery periods between HI efforts. © Georg Thieme Verlag KG Stuttgart · New York.
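The instantaneous-power calculation that underlies this kind of approach can be sketched as follows, assuming the equivalent-slope formulation from the earlier metabolic-power literature: forward acceleration is converted into an equivalent uphill gradient and an equivalent body-mass factor, and an energy-cost-of-running polynomial gives power per unit body mass. The polynomial coefficients, the toy speed trace, and the omission of the air-resistance and walking/running terms added in the updated method are all assumptions of this sketch, not the paper's implementation.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def cost_of_running(es):
    """Energy cost of running on an 'equivalent slope' es (J/kg/m).
    Polynomial approximation quoted from the earlier metabolic-power
    literature; treat the coefficients as an assumption to be checked
    against the original sources."""
    return (155.4 * es**5 - 30.4 * es**4 - 43.3 * es**3
            + 46.3 * es**2 + 19.5 * es + 3.6)

def metabolic_power(speed, dt):
    """Instantaneous metabolic power (W/kg) from a 1-D speed trace (m/s)
    sampled every dt seconds. Air resistance and the walking/running
    distinction of the updated method are ignored here."""
    speed = np.asarray(speed, dtype=float)
    accel = np.gradient(speed, dt)           # forward acceleration, m/s^2
    es = accel / G                           # equivalent slope
    em = np.sqrt(es**2 + 1.0)                # equivalent body-mass factor
    return cost_of_running(es) * em * speed  # (J/kg/m) * (m/s) = W/kg

# Toy example: a short accelerate-then-decelerate bout sampled at 10 Hz.
t = np.arange(0, 6, 0.1)
v = np.clip(8 * (1 - np.exp(-t / 1.5)), 0, None) * (t < 4) + \
    np.clip(8 - 4 * (t - 4), 0, None) * (t >= 4)
p = metabolic_power(v, dt=0.1)
print(f"peak metabolic power ≈ {p.max():.1f} W/kg, mean ≈ {p.mean():.1f} W/kg")
```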
The pH-dependent surface charging and points of zero charge: V. Update.
Kosmulski, Marek
2011-01-01
The points of zero charge (PZC) and isoelectric points (IEP) from the recent literature are discussed. This study is an update of the previous compilation [M. Kosmulski, Surface Charging and Points of Zero Charge, CRC, Boca Raton, FL, 2009] and of its previous update [J. Colloid Interface Sci. 337 (2009) 439]. In several recent publications, the terms PZC/IEP have been used outside their usual meaning. Only the PZC/IEP obtained according to the methods recommended by the present author are reported in this paper, and the other results are ignored. PZC/IEP of albite, sepiolite, and sericite, which have not been studied before, became available over the past 2 years. Copyright © 2010 Elsevier Inc. All rights reserved.
75 FR 44 - Temporary Suspension of the Population Estimates and Income Estimates Challenge Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
..., conduct research to enhance the estimates and challenge programs, and to integrate the updates from the... local governments would increase the administrative and evaluative complexity of this program for the... comparison with the population estimates, conducting research to enhance the estimates and challenge programs...
A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks
Hammad, Karim; El Bakly, Ahmed M.
2018-01-01
A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem—subject to various Quality-of-Service (QoS) constraints—represents a major challenge. Unlike previously proposed solutions, in this paper, we propose a memetic algorithm (MA) employing an adaptive mutation parameter, to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms. PMID:29509760
A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks.
Ramadan, Rahab M; Gasser, Safa M; El-Mahallawy, Mohamed S; Hammad, Karim; El Bakly, Ahmed M
2018-01-01
A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem, subject to various Quality-of-Service (QoS) constraints, represents a major challenge. Unlike previously proposed solutions, in this paper we propose a memetic algorithm (MA) employing an adaptive mutation parameter to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms.
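As a rough illustration of the class of algorithm described (not the authors' implementation), the sketch below combines a genetic search with a greedy local-search step and a mutation rate that adapts to population diversity. The bit-string encoding and the OneMax fitness are stand-ins for the QoS-constrained multicast routing problem, and the paper's statistical parameter-update scheme is not reproduced.

```python
import random

def memetic_optimize(fitness, n_bits=30, pop_size=40, generations=100, seed=0):
    """Toy memetic algorithm: genetic search plus a greedy bit-flip local
    search, with a mutation rate that adapts to population diversity."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def local_search(ind):
        # Greedy single-bit-flip improvement: the "memetic" refinement step.
        best, best_fit = ind[:], fitness(ind)
        for i in range(n_bits):
            cand = best[:]
            cand[i] ^= 1
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
        return best

    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        # Adaptive mutation: mutate more aggressively as diversity collapses.
        diversity = len({tuple(ind) for ind in pop}) / pop_size
        p_mut = 0.01 + 0.2 * (1.0 - diversity)

        def tournament():
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[a] if fits[a] >= fits[b] else pop[b]

        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)
            child = [bit ^ (rng.random() < p_mut) for bit in p1[:cut] + p2[cut:]]
            children.append(local_search(child))
        pop = children

    best = max(pop, key=fitness)
    return best, fitness(best)

# Example: maximize the number of ones (OneMax) as a stand-in for route quality.
best, score = memetic_optimize(lambda ind: sum(ind))
print(score, "ones out of 30")
```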
Real-time antenna fault diagnosis experiments at DSS 13
NASA Technical Reports Server (NTRS)
Mellstrom, J.; Pierson, C.; Smyth, P.
1992-01-01
Experimental results obtained when a previously described fault diagnosis system was run online in real time at the 34-m beam waveguide antenna at Deep Space Station (DSS) 13 are described. Experimental conditions and the quality of results are described. A neural network model and a maximum-likelihood Gaussian classifier are compared with and without a Markov component to model temporal context. At the rate of a state update every 6.4 seconds, over a period of roughly 1 hour, the neural-Markov system had zero errors (incorrect state estimates) while monitoring both faulty and normal operations. The overall results indicate that the neural-Markov combination is the most accurate model and has significant practical potential.
Trasande, Leonardo; Liu, Yinghua
2011-05-01
A 2002 analysis documented $54.9 billion in annual costs of environmentally mediated diseases in US children. However, few important changes in federal policy have been implemented to prevent exposures to toxic chemicals. We therefore updated and expanded the previous analysis and found that the costs of lead poisoning, prenatal methylmercury exposure, childhood cancer, asthma, intellectual disability, autism, and attention deficit hyperactivity disorder were $76.6 billion in 2008. To prevent further increases in these costs, efforts are needed to institute premarket testing of new chemicals; conduct toxicity testing on chemicals already in use; reduce lead-based paint hazards; and curb mercury emissions from coal-fired power plants.
Useful Life | Energy Analysis | NREL
The table below gives ranges on useful life for utility-scale technologies. For utility-scale technology cost and performance estimates, please visit the Transparent Cost Database (Cost and Operations & Maintenance, February 2016 updates; Utility-Scale Capacity).
Wireless data collection system for travel time estimation and traffic performance evaluation.
DOT National Transportation Integrated Search
2010-09-01
Having accurate and continually updated travel time and other performance data for the road and highway system has many benefits. From the perspective of the road users, having real-time updates on travel times will permit better travel and route pla...
EOS Terra Terra Constellation Exit/Future Maneuver Plans Update
NASA Technical Reports Server (NTRS)
Mantziaras, Dimitrios
2016-01-01
This EOS Terra Constellation Exit/Future Maneuver Plans Update presentation will discuss brief history of Terra EOM work; lifetime fuel estimates; baseline vs. proposed plan origin; resultant exit orbit; baseline vs. proposed exit plan; long term orbit altitude; revised lifetime proposal and fallback options.
A linear recurrent kernel online learning algorithm with sparse updates.
Fan, Haijin; Song, Qing
2014-02-01
In this paper, we propose a recurrent kernel algorithm with selectively sparse updates for online learning. The algorithm introduces a linear recurrent term in the estimation of the current output. This makes the past information reusable for updating of the algorithm in the form of a recurrent gradient term. To ensure that the reuse of this recurrent gradient indeed accelerates the convergence speed, a novel hybrid recurrent training is proposed to switch on or off learning the recurrent information according to the magnitude of the current training error. Furthermore, the algorithm includes a data-dependent adaptive learning rate which can provide guaranteed system weight convergence at each training iteration. The learning rate is set as zero when the training violates the derived convergence conditions, which makes the algorithm updating process sparse. Theoretical analyses of the weight convergence are presented and experimental results show the good performance of the proposed algorithm in terms of convergence speed and estimation accuracy. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ait-El-Fquih, Boujemaa; El Gharamti, Mohamad; Hoteit, Ibrahim
2016-08-01
Ensemble Kalman filtering (EnKF) is an efficient approach to addressing uncertainties in subsurface groundwater models. The EnKF sequentially integrates field data into simulation models to obtain a better characterization of the model's state and parameters. These are generally estimated following joint and dual filtering strategies, in which, at each assimilation cycle, a forecast step by the model is followed by an update step with incoming observations. The joint EnKF directly updates the augmented state-parameter vector, whereas the dual EnKF empirically employs two separate filters, first estimating the parameters and then estimating the state based on the updated parameters. To develop a Bayesian consistent dual approach and improve the state-parameter estimates and their consistency, we propose in this paper a one-step-ahead (OSA) smoothing formulation of the state-parameter Bayesian filtering problem from which we derive a new dual-type EnKF, the dual EnKF-OSA. Compared with the standard dual EnKF, it imposes a new update step to the state, which is shown to enhance the performance of the dual approach with almost no increase in the computational cost. Numerical experiments are conducted with a two-dimensional (2-D) synthetic groundwater aquifer model to investigate the performance and robustness of the proposed dual EnKF-OSA, and to evaluate its results against those of the joint and dual EnKFs. The proposed scheme is able to successfully recover both the hydraulic head and the aquifer conductivity, providing further reliable estimates of their uncertainties. Furthermore, it is found to be more robust to different assimilation settings, such as the spatial and temporal distribution of the observations, and the level of noise in the data. Based on our experimental setups, it yields up to 25% more accurate state and parameter estimations than the joint and dual approaches.
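For orientation, a minimal sketch of the standard dual EnKF baseline that the proposed EnKF-OSA scheme builds on is given below, using a one-dimensional toy model in place of the groundwater simulator. The one-step-ahead smoothing step that distinguishes the authors' method is not implemented here, and the model, noise levels, and ensemble size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x, k):
    """Toy nonlinear surrogate: state x evolves under an unknown parameter k
    (a stand-in for aquifer conductivity)."""
    return x + 0.1 * (k * np.sin(x) - 0.5 * x)

# Synthetic truth and noisy observations of the state.
k_true, x = 2.0, 1.0
obs = []
for _ in range(60):
    x = model(x, k_true) + rng.normal(0, 0.01)
    obs.append(x + rng.normal(0, 0.05))

# Dual EnKF: at each cycle, a parameter update is followed by a state update.
N = 100
x_ens = rng.normal(1.0, 0.5, N)   # state ensemble
k_ens = rng.normal(1.0, 1.0, N)   # parameter ensemble
R = 0.05**2                       # observation-error variance

def enkf_update(ens, pred_obs, y):
    """Stochastic EnKF update of ensemble 'ens' given predicted observations."""
    C = np.cov(ens, pred_obs)[0, 1]
    S = np.var(pred_obs, ddof=1) + R
    K = C / S
    perturbed = y + rng.normal(0, np.sqrt(R), len(ens))
    return ens + K * (perturbed - pred_obs)

for y in obs:
    # 1) Parameter step: forecast with each parameter sample, update parameters.
    x_fc_par = model(x_ens, k_ens)
    k_ens = enkf_update(k_ens, x_fc_par, y)
    # 2) State step: re-forecast with updated parameters, update the state.
    x_fc = model(x_ens, k_ens) + rng.normal(0, 0.01, N)
    x_ens = enkf_update(x_fc, x_fc, y)

print(f"estimated parameter ≈ {k_ens.mean():.2f} (truth {k_true})")
```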
Trasande, L; Zoeller, R T; Hass, U; Kortenkamp, A; Grandjean, P; Myers, J P; DiGangi, J; Hunt, P M; Rudel, R; Sathyanarayana, S; Bellanger, M; Hauser, R; Legler, J; Skakkebaek, N E; Heindel, J J
2016-07-01
A previous report documented that endocrine disrupting chemicals contribute substantially to certain forms of disease and disability. In the present analysis, our main objective was to update a range of health and economic costs that can be reasonably attributed to endocrine disrupting chemical exposures in the European Union, leveraging new burden and disease cost estimates of female reproductive conditions from an accompanying report. Expert panels evaluated the epidemiologic evidence, using adapted criteria from the WHO Grading of Recommendations Assessment, Development and Evaluation Working Group, and evaluated laboratory and animal evidence of endocrine disruption using definitions recently promulgated by the Danish Environmental Protection Agency. The Delphi method was used to make decisions on the strength of the data. Expert panel consensus was achieved for probable (>20%) endocrine disrupting chemical causation for IQ loss and associated intellectual disability; autism; attention deficit hyperactivity disorder; endometriosis; fibroids; childhood obesity; adult obesity; adult diabetes; cryptorchidism; male infertility; and mortality associated with reduced testosterone. Accounting for probability of causation, and using the midpoint of each range for probability of causation, Monte Carlo simulations produced a median annual cost of €163 billion (1.28% of EU Gross Domestic Product) across 1000 simulations. We conclude that endocrine disrupting chemical exposures in the EU are likely to contribute substantially to disease and dysfunction across the life course with costs in the hundreds of billions of Euros per year. These estimates represent only those endocrine disrupting chemicals with the highest probability of causation; a broader analysis would have produced greater estimates of burden of disease and costs. © 2016 American Society of Andrology and European Academy of Andrology.
Szklo, André Salem; Yuan, Zhe; Levy, David
2017-12-18
A previous application of the Brazil SimSmoke tobacco control policy simulation model was used to show the effect of policies implemented between 1989 and 2010 on smoking-attributable deaths (SADs). In this study, we updated and further validated the Brazil SimSmoke model to incorporate policies implemented since 2011 (e.g., a new tax structure with the purpose of increasing revenues/real prices). In addition, we extended the model to estimate smoking-attributable maternal and child health outcomes (MCHOs), such as placenta praevia, placental abruption, preterm birth, low birth weight, and sudden infant death syndrome, to show the role of tobacco control in achieving the Millennium Development Goals. Using data on population, births, smoking, policies, and prevalence of MCHOs, the model is used to assess the effect on both premature deaths and MCHOs of tobacco control policies implemented in Brazil in the last 25 years relative to a counterfactual of policies kept at 1989 levels. Smoking prevalence in Brazil has fallen by an additional 17% for males (16%-19%) and 19% for females (14%-24%) between 2011 and 2015. As a result of the policies implemented since 1989, 7.5 million (6.4-8.5) deaths among adults aged 18 years or older are projected to be averted by 2050. Current policies are also estimated to reduce a cumulative total of 0.9 million (0.4-2.4) adverse MCHOs by 2050. Our findings show the benefits of tobacco control in reducing both SADs and smoking-attributable MCHOs at population level. These benefits may be used to better inform policy makers in low and middle income countries about allocating resources towards tobacco control policies in this important area.
Campbell, Helen; Andrews, Nick; Borrow, Ray; Trotter, Caroline; Miller, Elizabeth
2010-05-01
Meningococcal serogroup C conjugate (MCC) vaccines were licensed in the United Kingdom more than 10 years ago based on correlates of protection that had previously been established for serogroup C-containing polysaccharide vaccines by using the serum bactericidal antibody (SBA) assay. These correlates of protection were subsequently validated against postlicensure estimates of observed vaccine effectiveness up to 7 to 9 months after the administration of the MCC vaccine. Vaccine effectiveness was, however, shown to fall significantly more than 1 year after the administration of a 3-dose course in infancy. Despite this finding, the marked impact on serogroup C disease has been sustained, with the lowest recorded incidence (0.02 case per 100,000 population) in the 2008-2009 epidemiological year, mainly due to the indirect herd immunity effect of the vaccine in reducing carriage. Updated estimates of vaccine effectiveness through 30 June 2009 confirmed high short-term protection after vaccination in infancy, at 97% (95% confidence interval [CI], 91% to 99%), falling to 68% (95% CI, -63% to 90%) more than a year after vaccination. The observed vaccine effectiveness more than 12 months postvaccination was consistent with measured declining SBA levels, but confidence intervals were imprecise; vaccine effectiveness estimates were consistent with SBA titers of 1:4 or 1:8 as correlates of long-term protection after a primary course in infants. Modeling suggested that protection against carriage persists for at least 3 years and predicted the stabilization of serogroup C disease at low levels (fewer than 50 cases per year) up to 2015-2016.
Campbell, Helen; Andrews, Nick; Borrow, Ray; Trotter, Caroline; Miller, Elizabeth
2010-01-01
Meningococcal serogroup C conjugate (MCC) vaccines were licensed in the United Kingdom more than 10 years ago based on correlates of protection that had previously been established for serogroup C-containing polysaccharide vaccines by using the serum bactericidal antibody (SBA) assay. These correlates of protection were subsequently validated against postlicensure estimates of observed vaccine effectiveness up to 7 to 9 months after the administration of the MCC vaccine. Vaccine effectiveness was, however, shown to fall significantly more than 1 year after the administration of a 3-dose course in infancy. Despite this finding, the marked impact on serogroup C disease has been sustained, with the lowest recorded incidence (0.02 case per 100,000 population) in the 2008-2009 epidemiological year, mainly due to the indirect herd immunity effect of the vaccine in reducing carriage. Updated estimates of vaccine effectiveness through 30 June 2009 confirmed high short-term protection after vaccination in infancy, at 97% (95% confidence interval [CI], 91% to 99%), falling to 68% (95% CI, −63% to 90%) more than a year after vaccination. The observed vaccine effectiveness more than 12 months postvaccination was consistent with measured declining SBA levels, but confidence intervals were imprecise; vaccine effectiveness estimates were consistent with SBA titers of 1:4 or 1:8 as correlates of long-term protection after a primary course in infants. Modeling suggested that protection against carriage persists for at least 3 years and predicted the stabilization of serogroup C disease at low levels (fewer than 50 cases per year) up to 2015-2016. PMID:20219881
Reynolds, Richard J.; Calef, F.J.
2011-01-01
The hydrogeology of the stratified-drift aquifer in the Sprout Creek and Fishkill Creek valleys in southern Dutchess County, New York, previously investigated by the U.S. Geological Survey (USGS) in 1982, was updated through the use of new well data made available through the New York State Department of Environmental Conservation's Water Well Program. Additional well data related to U.S. Environmental Protection Agency (USEPA) remedial investigations of two groundwater contamination sites near the villages of Hopewell Junction and Shenandoah, New York, were also used in this study. The boundary of the stratified-drift aquifer described in a previous USGS report was extended slightly eastward and southward to include adjacent tributary valleys and the USEPA groundwater contamination site at Shenandoah, New York. The updated report consists of maps showing well locations, surficial geology, altitude of the water table, and saturated thickness of the aquifer. Geographic information system coverages of these four maps were created as part of the update process.
Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System
Kim, Han-Sol; Seo, Woojin; Baek, Kwang-Ryul
2017-01-01
In the indoor environment, variation of the magnetic field is caused by building structures, and magnetic field map navigation is based on this feature. In order to estimate position using this navigation, a three-axis magnetic field must be measured at every point to build a magnetic field map. After the magnetic field map is obtained, the position of the mobile robot can be estimated with a likelihood function whereby the measured magnetic field data and the magnetic field map are used. However, if only magnetic field map navigation is used, the estimated position can have large errors. In order to improve performance, we propose a particle filter system that integrates magnetic field map navigation and an encoder system. In this paper, multiple magnetic sensors and three magnetic field maps (a horizontal intensity map, a vertical intensity map, and a direction information map) are used to update the weights of particles. As a result, the proposed system estimates the position and orientation of a mobile robot more accurately than previous systems. Also, when the number of magnetic sensors increases, this paper shows that system performance improves. Finally, experiment results are shown from the proposed system that was implemented and evaluated. PMID:28327513
Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System.
Kim, Han-Sol; Seo, Woojin; Baek, Kwang-Ryul
2017-03-22
In the indoor environment, variation of the magnetic field is caused by building structures, and magnetic field map navigation is based on this feature. In order to estimate position using this navigation, a three-axis magnetic field must be measured at every point to build a magnetic field map. After the magnetic field map is obtained, the position of the mobile robot can be estimated with a likelihood function whereby the measured magnetic field data and the magnetic field map are used. However, if only magnetic field map navigation is used, the estimated position can have large errors. In order to improve performance, we propose a particle filter system that integrates magnetic field map navigation and an encoder system. In this paper, multiple magnetic sensors and three magnetic field maps (a horizontal intensity map, a vertical intensity map, and a direction information map) are used to update the weights of particles. As a result, the proposed system estimates the position and orientation of a mobile robot more accurately than previous systems. Also, when the number of magnetic sensors increases, this paper shows that system performance improves. Finally, experiment results are shown from the proposed system that was implemented and evaluated.
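A stripped-down illustration of the fusion idea, under simplifying assumptions: a one-dimensional corridor, a single scalar intensity map instead of the paper's three maps, one magnetometer instead of several, and encoder increments as the motion model. The particle weights are updated with the map likelihood and resampled when the effective sample size drops, which is the generic particle-filter machinery rather than the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 1-D magnetic-intensity "map" along a 20 m corridor (0.1 m grid).
grid = np.arange(0, 20, 0.1)
mag_map = 50 + 5 * np.sin(0.7 * grid) + 3 * np.cos(2.3 * grid)  # µT

def map_lookup(pos):
    return np.interp(pos, grid, mag_map)

# Truth: the robot moves 0.2 m per step; the encoder reports it with noise.
true_pos, n_steps = 2.0, 80
n_p = 500
particles = rng.uniform(0, 20, n_p)       # unknown initial position
weights = np.full(n_p, 1.0 / n_p)
sigma_enc, sigma_mag = 0.05, 1.0

for _ in range(n_steps):
    true_pos += 0.2
    enc = 0.2 + rng.normal(0, sigma_enc)                  # encoder odometry
    z = map_lookup(true_pos) + rng.normal(0, sigma_mag)   # magnetometer reading

    # Predict: propagate particles with the (noisy) encoder increment.
    particles += enc + rng.normal(0, sigma_enc, n_p)
    # Update: weight particles by the magnetic-map likelihood.
    residual = z - map_lookup(particles)
    weights *= np.exp(-0.5 * (residual / sigma_mag) ** 2)
    weights += 1e-300
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights**2) < n_p / 2:
        idx = rng.choice(n_p, n_p, p=weights)
        particles, weights = particles[idx], np.full(n_p, 1.0 / n_p)

estimate = np.sum(weights * particles)
print(f"true position {true_pos:.2f} m, PF estimate {estimate:.2f} m")
```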
Users manual for the improved NASA Lewis ice accretion code LEWICE 1.6
NASA Technical Reports Server (NTRS)
Wright, William B.
1995-01-01
This report is intended as an update/replacement to NASA CR 185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE)' and as an update to NASA CR 195387 'Update to the NASA Lewis Ice Accretion Code LEWICE'. In addition to describing the changes specifically made for this version, information from previous manuals will be duplicated so that the user will not need three manuals to use this code.
Navigation with Electromagnetic Tracking for Interventional Radiology Procedures
Wood, Bradford J.; Zhang, Hui; Durrani, Amir; Glossop, Neil; Ranjan, Sohan; Lindisch, David; Levy, Eliott; Banovac, Filip; Borgert, Joern; Krueger, Sascha; Kruecker, Jochen; Viswanathan, Anand; Cleary, Kevin
2008-01-01
PURPOSE To assess the feasibility of the use of preprocedural imaging for guide wire, catheter, and needle navigation with electromagnetic tracking in phantom and animal models. MATERIALS AND METHODS An image-guided intervention software system was developed based on open-source software components. Catheters, needles, and guide wires were constructed with small position and orientation sensors in the tips. A tetrahedral-shaped weak electromagnetic field generator was placed in proximity to an abdominal vascular phantom or three pigs on the angiography table. Preprocedural computed tomographic (CT) images of the phantom or pig were loaded into custom-developed tracking, registration, navigation, and rendering software. Devices were manipulated within the phantom or pig with guidance from the previously acquired CT scan and simultaneous real-time angiography. Navigation within positron emission tomography (PET) and magnetic resonance (MR) volumetric datasets was also performed. External and endovascular fiducials were used for registration in the phantom, and registration error and tracking error were estimated. RESULTS The CT scan position of the devices within phantoms and pigs was accurately determined during angiography and biopsy procedures, with manageable error for some applications. Preprocedural CT depicted the anatomy in the region of the devices with real-time position updating and minimal registration error and tracking error (<5 mm). PET can also be used with this system to guide percutaneous biopsies to the most metabolically active region of a tumor. CONCLUSIONS Previously acquired CT, MR, or PET data can be accurately codisplayed during procedures with reconstructed imaging based on the position and orientation of catheters, guide wires, or needles. Multimodality interventions are feasible by allowing the real-time updated display of previously acquired functional or morphologic imaging during angiography, biopsy, and ablation. PMID:15802449
On the timing properties of SAX J1808.4-3658 during its 2015 outburst
NASA Astrophysics Data System (ADS)
Sanna, A.; Di Salvo, T.; Burderi, L.; Riggio, A.; Pintore, F.; Gambino, A. F.; Iaria, R.; Tailo, M.; Scarano, F.; Papitto, A.
2017-10-01
We present a timing analysis of the 2015 outburst of the accreting millisecond X-ray pulsar SAX J1808.4-3658, using non-simultaneous XMM-Newton and NuSTAR observations. We estimate the pulsar spin frequency and update the system orbital solution. Combining the average spin frequency with those from the previously observed outbursts, we confirm the long-term spin-down at an average rate \dot{ν}_{SD} = 1.5(2) × 10^{-15} Hz s^{-1}. We also discuss possible corrections to the spin-down rate accounting for mass accretion onto the compact object when the system is X-ray active. Finally, combining the updated ephemerides with those of the previous outbursts, we find a long-term orbital evolution compatible with a binary expansion at a mean rate \dot{P}_{orb} = 3.6(4) × 10^{-12} s s^{-1}, in agreement with previously reported values. This fast evolution is incompatible with an evolution driven by angular momentum losses caused by gravitational radiation under the hypothesis of conservative mass transfer. We discuss the observed orbital expansion in terms of non-conservative mass transfer and a gravitational quadrupole coupling mechanism. We find that the latter can explain, under certain conditions, small fluctuations (of the order of a few seconds) of the orbital period around a global parabolic trend. At the same time, a non-conservative mass transfer is required to explain the observed fast orbital evolution, which likely reflects the ejection of a large fraction of mass from the inner Lagrangian point caused by the irradiation of the donor by the magnetodipole rotator during quiescence (radio-ejection model). This strong outflow may power tidal dissipation in the companion star and be responsible for the gravitational quadrupole change oscillations.
Factors influencing infants’ ability to update object representations in memory
Moher, Mariko; Feigenson, Lisa
2013-01-01
Remembering persisting objects over occlusion is critical to representing a stable environment. Infants remember hidden objects at multiple locations and can update their representation of a hidden array when an object is added or subtracted. However, the factors influencing these updating abilities have received little systematic exploration. Here we examined the flexibility of infants’ ability to update object representations. We tested 11-month-olds in a looking-time task in which objects were added to or subtracted from two hidden arrays. Across five experiments, infants successfully updated their representations of hidden arrays when the updating occurred successively at one array before beginning at the other. But when updating required alternating between two arrays, infants failed. However, simply connecting the two arrays with a thin strip of foam-core led infants to succeed. Our results suggest that infants’ construal of an event strongly affects their ability to update memory representations of hidden objects. When construing an event as containing multiple updates to the same array, infants succeed, but when construing the event as requiring the revisiting and updating of previously attended arrays, infants fail. PMID:24049245
Torres, Sergio N; Pezoa, Jorge E; Hayat, Majeed M
2003-10-10
What is to our knowledge a new scene-based algorithm for nonuniformity correction in infrared focal-plane array sensors has been developed. The technique is based on the inverse covariance form of the Kalman filter (KF), which has been reported previously and used in estimating the gain and bias of each detector in the array from scene data. The gain and the bias of each detector in the focal-plane array are assumed constant within a given sequence of frames, corresponding to a certain time and operational conditions, but they are allowed to randomly drift from one sequence to another following a discrete-time Gauss-Markov process. The inverse covariance form filter estimates the gain and the bias of each detector in the focal-plane array and optimally updates them as they drift in time. The estimation is performed with considerably higher computational efficiency than the equivalent KF. The ability of the algorithm in compensating for fixed-pattern noise in infrared imagery and in reducing the computational complexity is demonstrated by use of both simulated and real data.
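The inverse-covariance (information-form) recursion at the heart of such estimators can be illustrated for a single detector as below, assuming a surrogate for the true irradiance is available at each frame. In the actual algorithm that surrogate comes from scene statistics, and the gain and bias are additionally allowed to drift between frame sequences via a Gauss-Markov model, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# One detector with unknown gain g and bias b: y = g * x + b + noise.
g_true, b_true, sigma = 1.3, 8.0, 0.5

# Information-form (inverse-covariance) recursion for theta = [g, b].
Lambda = np.eye(2) * 1e-3   # prior information matrix (nearly flat prior)
eta = np.zeros(2)           # information vector

for _ in range(200):
    x = rng.uniform(20, 80)                 # surrogate "true" irradiance (assumed given)
    y = g_true * x + b_true + rng.normal(0, sigma)
    H = np.array([x, 1.0])                  # measurement row for [g, b]
    Lambda += np.outer(H, H) / sigma**2     # information-matrix update
    eta += H * y / sigma**2                 # information-vector update

theta = np.linalg.solve(Lambda, eta)        # recover the estimate only when needed
print(f"estimated gain {theta[0]:.3f} (true {g_true}), "
      f"bias {theta[1]:.2f} (true {b_true})")
```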
Quantifying fall migration of Ross's gulls (Rhodostethia rosea) past Point Barrow, Alaska
Uher-Koch, Brian D.; Davis, Shanti E.; Maftei, Mark; Gesmundo, Callie; Suydam, R.S.; Mallory, Mark L.
2014-01-01
The Ross's gull (Rhodostethia rosea) is a poorly known seabird of the circumpolar Arctic. The only place in the world where Ross's gulls are known to congregate is in the near-shore waters around Point Barrow, Alaska where they undertake an annual passage in late fall. Ross's gulls seen at Point Barrow are presumed to originate from nesting colonies in Siberia, but neither their origin nor their destination has been confirmed. Current estimates of the global population of Ross's gulls are based largely on expert opinion, and the only reliable population estimate is derived from extrapolations from previous counts conducted at Point Barrow, but these data are now over 25 years old. In order to update and clarify the status of this species in Alaska, our study quantified the timing, number, and flight direction of Ross's gulls passing Point Barrow in 2011. We recorded up to two-thirds of the estimated global population of Ross's gulls (≥ 27,000 individuals) over 39 days with numbers peaking on 16 October when we observed over 7,000 birds during a three-hour period.
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin
Data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among these techniques, sequential Monte Carlo (SMC) filters, known as "particle filters," provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a medium-sized Japanese catchment. We also compare performance results of the DUS combined with various SMC methods, such as SIR, ASIR and RPF.
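A sketch of a dual state-parameter particle filter of this general kind is shown below, with a toy linear reservoir standing in for the storage function model and Liu-West-style kernel smoothing of the parameter particles; the authors' DUS and their SIR/ASIR/RPF variants may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(3)

def storage_model(s, k, rain):
    """Toy linear reservoir: storage s drains at rate k and is fed by rainfall."""
    return s + rain - k * s

# Synthetic truth.
k_true, s = 0.3, 5.0
rains = rng.exponential(1.0, 100)
obs = []
for r in rains:
    s = storage_model(s, k_true, r)
    obs.append(s + rng.normal(0, 0.2))

# SIR particle filter with kernel smoothing of the parameter particles.
n_p, delta = 1000, 0.98
a = (3 * delta - 1) / (2 * delta)          # Liu-West shrinkage factor
s_p = rng.uniform(0, 10, n_p)              # state particles
k_p = rng.uniform(0.05, 0.9, n_p)          # parameter particles

for r, y in zip(rains, obs):
    # Kernel-smooth the parameters: shrink toward the mean, then jitter.
    k_bar, k_var = k_p.mean(), k_p.var()
    k_p = a * k_p + (1 - a) * k_bar + rng.normal(0, np.sqrt((1 - a**2) * k_var), n_p)
    k_p = np.clip(k_p, 1e-3, 0.99)         # keep the drainage rate physical
    # Propagate the states and weight by the observation likelihood.
    s_p = storage_model(s_p, k_p, r) + rng.normal(0, 0.1, n_p)
    w = np.exp(-0.5 * ((y - s_p) / 0.2) ** 2) + 1e-300
    w /= w.sum()
    # Resample state and parameter particles jointly (SIR).
    idx = rng.choice(n_p, n_p, p=w)
    s_p, k_p = s_p[idx], k_p[idx]

print(f"posterior mean of k ≈ {k_p.mean():.3f} (truth {k_true})")
```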
NASA Astrophysics Data System (ADS)
Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.
2015-03-01
This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be directly used for damage identification and further used for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the estimates of the modeling parameters is smoother and faster when the UKF is utilized.
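The parameter-updating idea can be illustrated with a parameter-only unscented Kalman filter in which the nonlinear FE model is replaced by a toy cubic force-displacement law; the sigma-point scaling, noise levels, and random-walk parameter model are assumptions of this sketch rather than the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(7)

def response(theta, x):
    """Toy nonlinear 'material law': force for displacement x with
    parameters theta = [k, alpha] (linear and cubic stiffness)."""
    k, alpha = theta
    return k * x + alpha * x**3

# Synthetic measured response under a known displacement history.
theta_true = np.array([2.0, 0.5])
xs = np.sin(np.linspace(0, 20, 400)) * 1.5
R = 0.05**2
ys = response(theta_true, xs) + rng.normal(0, np.sqrt(R), xs.size)

# UKF over the (time-invariant) parameters, modelled as a slow random walk.
n = 2
theta = np.array([1.0, 0.0])           # initial guess
P = np.diag([1.0, 1.0])                # initial covariance
Q = np.eye(n) * 1e-6                   # small drift to keep the filter alive
kappa = 3.0 - n                        # classic sigma-point scaling

for x, y in zip(xs, ys):
    P = P + Q
    # Sigma points around the current parameter estimate.
    S = np.linalg.cholesky((n + kappa) * P)
    sigmas = np.vstack([theta, theta + S.T, theta - S.T])      # 2n+1 points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    # Propagate sigma points through the measurement model.
    z = np.array([response(s, x) for s in sigmas])
    z_mean = np.dot(w, z)
    Pzz = np.dot(w, (z - z_mean) ** 2) + R
    Pxz = (w[:, None] * (sigmas - theta) * (z - z_mean)[:, None]).sum(axis=0)
    K = Pxz / Pzz
    theta = theta + K * (y - z_mean)
    P = P - np.outer(K, K) * Pzz

print("estimated [k, alpha] ≈", np.round(theta, 3), "truth", theta_true)
```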
Drift Reduction in Pedestrian Navigation System by Exploiting Motion Constraints and Magnetic Field.
Ilyas, Muhammad; Cho, Kuk; Baeg, Seung-Ho; Park, Sangdeok
2016-09-09
Pedestrian navigation systems (PNS) using foot-mounted MEMS inertial sensors use zero-velocity updates (ZUPTs) to reduce drift in navigation solutions and to estimate inertial sensor errors. However, it is well known that ZUPTs cannot reduce all errors, especially as heading error is not observable. Hence, the position estimates tend to drift even when cyclic ZUPTs are applied in the update steps of the Extended Kalman Filter (EKF). This motivates the use of other motion constraints of pedestrian gait and of any other valuable heading-error reduction information that is available. In this paper, we exploit two further motion-constraint scenarios of pedestrian gait: (1) walking along straight paths; (2) standing still for a long time. It is observed that these motion constraints (acting as a "virtual sensor"), though considerably reducing drift in PNS, still need an absolute heading reference. One common absolute heading estimation sensor is the magnetometer, which senses the Earth's magnetic field so that the true heading angle can be calculated. However, magnetometers are susceptible to magnetic distortions, especially in indoor environments. In this work, an algorithm called magnetic anomaly detection (MAD) and compensation is designed to incorporate only healthy magnetometer data in the EKF update step, to reduce drift in the zero-velocity updated INS. Experiments are conducted in GPS-denied and magnetically distorted environments to validate the proposed algorithms.
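A minimal sketch of a magnetic-anomaly gate of the kind described, assuming the local field is characterized by a reference magnitude and dip angle (placeholder values below) and that a reading is used in the EKF update only when both agree with the reference; the paper's MAD and compensation algorithm may use different test statistics.

```python
import numpy as np

def magnetic_update_ok(mag_xyz, ref_magnitude_uT=50.0, ref_dip_deg=53.0,
                       tol_magnitude_uT=3.0, tol_dip_deg=5.0,
                       roll=0.0, pitch=0.0):
    """Return True only if the magnetometer reading looks undistorted and is
    therefore safe to use as a heading update in the EKF.

    The reference magnitude and dip angle are site-dependent placeholders;
    checking both is a common anomaly test, not necessarily the paper's."""
    m = np.asarray(mag_xyz, dtype=float)
    magnitude = np.linalg.norm(m)

    # Tilt-compensate with roll/pitch from the INS before computing the dip.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    R = np.array([[cp, sp * sr, sp * cr],
                  [0.0, cr, -sr],
                  [-sp, cp * sr, cp * cr]])
    m_level = R @ m
    dip = np.degrees(np.arctan2(m_level[2], np.hypot(m_level[0], m_level[1])))

    return (abs(magnitude - ref_magnitude_uT) < tol_magnitude_uT and
            abs(dip - ref_dip_deg) < tol_dip_deg)

# Example: a reading near the assumed local field passes; a distorted one fails.
print(magnetic_update_ok([28.0, 11.0, 40.0]))   # plausible indoor reading
print(magnetic_update_ok([60.0, 25.0, 80.0]))   # e.g. near a steel structure
```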
Examining the role of land motion in estimating altimeter system drifts
NASA Astrophysics Data System (ADS)
Leuliette, E. W.; Plagge, A. M.
2016-12-01
With the operational onset of Jason-3 and Sentinel-3 missions, the determination of mission-specific altimeter bias drift via the global tide gauge network is more crucial than ever. Here we extend previously presented work comparing the effect of vertical land motion (VLM) at tide gauges on derived drift for the combined TOPEX/Jason-1/Jason-2 dataset with the addition of Jason-3, and the combined Envisat/AltiKa record, as well as Sentinel-3 as data become available. Estimated drifts for each mission are considered using seven VLM estimations: (1) GPS-based methodology by King et al., 2012 [updated] at University of Tasmania; (2) GPS time series produced by JPL (http://sideshow.jpl.nasa.gov/post/series.html); the Université de La Rochelle's (3) ULR5 (Santamaria-Gomez 2012) and (4) ULR6; (5) GPS time series produced at the Nevada Geodetic Laboratory, and two versions using glacial isostatic adjustment: (6) those by Peltier et al. (2015) and (7) those by A, Wahr, and Zhong (2013). The drift estimates from the combined TOPEX/Jason dataset vary by 0.7 mm/year depending on the VLM estimate. The combined Envisat/AltiKa estimated drifts vary slightly less, more on the order of 0.5 mm/yr. In addition, we demonstrate the sensitivity of the drift estimates to tide gauge selection.
Estimating the Benefit per Ton of Reducing PM2.5 Precursors from 17 Sectors
Technical Support Document (TSD) reporting the human health impact and monetized benefits of reducing emissions of PM2.5 and PM2.5 precursors from 17 sectors. This TSD was updated in 2018 to reflect updated demographic and economic data.
ERIC Educational Resources Information Center
Mock, Karen R.
1998-01-01
Updates cases and issues previously discussed in this regular column on human rights in Canada, including racism and anti-Semitism, laws on hate crimes, hate sites on the World Wide Web, the use of the "free speech" defense by hate groups, and legal challenges to antiracist groups by individuals criticized by them. (DSK)
77 FR 15004 - Updating of Employer Identification Numbers
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... the IRS, including whether the information will have practical utility; The accuracy of the estimated... techniques or other forms of information technology; and Estimates of capital or start-up costs and costs of... respondents are persons that have an EIN. Estimated total annual reporting burden: 403,177 hours. Estimated...
Revised techniques for estimating peak discharges from channel width in Montana
Parrett, Charles; Hull, J.A.; Omang, R.J.
1987-01-01
This study was conducted to develop new estimating equations based on channel width and the updated flood frequency curves of previous investigations. Simple regression equations for estimating peak discharges with recurrence intervals of 2, 5, 10, 25, 50, and 100 years were developed for seven regions in Montana. The standard errors of estimate for the equations that use active channel width as the independent variable ranged from 30% to 87%. The standard errors of estimate for the equations that use bankfull width as the independent variable ranged from 34% to 92%. The smallest standard errors generally occurred in the prediction equations for the 2-yr flood, 5-yr flood, and 10-yr flood, and the largest standard errors occurred in the prediction equations for the 100-yr flood. The equations that use active channel width and the equations that use bankfull width were determined to be about equally reliable in five regions. In the West Region, the equations that use bankfull width were slightly more reliable than those based on active channel width, whereas in the East-Central Region the equations that use active channel width were slightly more reliable than those based on bankfull width. Compared with similar equations previously developed, the standard errors of estimate for the new equations are substantially smaller in three regions and substantially larger in two regions. Limitations on the use of the estimating equations include: (1) The equations are based on stable conditions of channel geometry and prevailing water and sediment discharge; (2) The measurement of channel width requires a site visit, preferably by a person with experience in the method, and involves appreciable measurement errors; (3) Reliability of results from the equations for channel widths beyond the range of definition is unknown. In spite of the limitations, the estimating equations derived in this study are considered to be as reliable as estimating equations based on basin and climatic variables. Because the two types of estimating equations are independent, results from each can be weighted inversely proportional to their variances, and averaged. The weighted average estimate has a variance less than either individual estimate. (Author's abstract)
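The inverse-variance weighting mentioned at the end of the abstract can be sketched as follows; because the regression standard errors are quoted in percent, the example works with log-transformed discharges, which is one common convention rather than necessarily the report's exact procedure, and the discharge values are hypothetical.

```python
import math

def combine_estimates(q1, se1_pct, q2, se2_pct):
    """Combine two independent peak-discharge estimates by weighting each
    inversely proportional to its variance, as suggested in the abstract.

    Standard errors of regression-based flood estimates are usually quoted
    in percent, so the weighting here is done on log-transformed discharges
    (one common convention; the report's exact procedure may differ)."""
    s1 = math.log(1.0 + se1_pct / 100.0)   # approx. standard error in ln units
    s2 = math.log(1.0 + se2_pct / 100.0)
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    ln_q = (w1 * math.log(q1) + w2 * math.log(q2)) / (w1 + w2)
    var_combined = 1.0 / (w1 + w2)          # smaller than either 1/w1 or 1/w2
    se_combined_pct = (math.exp(math.sqrt(var_combined)) - 1.0) * 100.0
    return math.exp(ln_q), se_combined_pct

# Hypothetical 100-yr flood estimates for the same ungauged site: one from a
# channel-width equation, one from a basin-characteristics equation.
q, se = combine_estimates(850.0, 60.0, 1100.0, 45.0)
print(f"weighted estimate ≈ {q:.0f} ft^3/s, standard error ≈ {se:.0f}%")
```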
Update to An Inventory of Sources and Environmental ...
In 2006, EPA published an inventory of sources and environmental releases of dioxin-like compounds in the United States. This draft report presents an update and revision to that dioxin source inventory. It also presents updated estimates of environmental releases of dioxin-like compounds to the air, water, land and products. The sources are grouped into five broad categories: combustion sources, metals smelting/refining, chemical manufacturing, natural sources, and environmental reservoirs. Estimates of annual releases to land, air, and water are presented for reference years 1987, 1995, and 2000. While the overall decreasing trend in emissions seen in the original report continues, the individual dioxin releases in this draft updated report are generally higher than the values reported in 2006. This is largely due to the inclusion (in all three years) of additional sources in the quantitative inventory that were not included in the 2006 report. The largest new source included in this draft updated inventory was forest fires. In the 2006 report, this was classified as preliminary and not included in the quantitative inventory. The top three air sources of dioxin emissions in 2000 were forest fires, backyard burning of trash, and medical waste incinerators. The report presents an update to the dioxin source inventory published in 2006 (U.S. EPA, 2006). The peer-review panel for the 2006 document provided additional comments after the final report had ...
Efficient visual grasping alignment for cylinders
NASA Technical Reports Server (NTRS)
Nicewarner, Keith E.; Kelley, Robert B.
1992-01-01
Monocular information from a gripper-mounted camera is used to servo the robot gripper to grasp a cylinder. The fundamental concept for rapid pose estimation is to reduce the amount of information that needs to be processed during each vision update interval. The grasping procedure is divided into four phases: learn, recognition, alignment, and approach. In the learn phase, a cylinder is placed in the gripper and the pose estimate is stored and later used as the servo target. This is performed once as a calibration step. The recognition phase verifies the presence of a cylinder in the camera field of view. An initial pose estimate is computed and uncluttered scan regions are selected. The radius of the cylinder is estimated by moving the robot a fixed distance toward the cylinder and observing the change in the image. The alignment phase processes only the scan regions obtained previously. Rapid pose estimates are used to align the robot with the cylinder at a fixed distance from it. The relative motion of the cylinder is used to generate an extrapolated pose-based trajectory for the robot controller. The approach phase guides the robot gripper to a grasping position. The cylinder can be grasped with a minimal reaction force and torque when only rough global pose information is initially available.
Efficient visual grasping alignment for cylinders
NASA Technical Reports Server (NTRS)
Nicewarner, Keith E.; Kelley, Robert B.
1991-01-01
Monocular information from a gripper-mounted camera is used to servo the robot gripper to grasp a cylinder. The fundamental concept for rapid pose estimation is to reduce the amount of information that needs to be processed during each vision update interval. The grasping procedure is divided into four phases: learn, recognition, alignment, and approach. In the learn phase, a cylinder is placed in the gripper and the pose estimate is stored and later used as the servo target. This is performed once as a calibration step. The recognition phase verifies the presence of a cylinder in the camera field of view. An initial pose estimate is computed and uncluttered scan regions are selected. The radius of the cylinder is estimated by moving the robot a fixed distance toward the cylinder and observing the change in the image. The alignment phase processes only the scan regions obtained previously. Rapid pose estimates are used to align the robot with the cylinder at a fixed distance from it. The relative motion of the cylinder is used to generate an extrapolated pose-based trajectory for the robot controller. The approach phase guides the robot gripper to a grasping position. The cylinder can be grasped with a minimal reaction force and torque when only rough global pose information is initially available.
2011, 2010 petroleum resource assessment of the National Petroleum Reserve in Alaska: GIS play maps
Garrity, Christopher P.; Houseknecht, David W.; Bird, Kenneth J.
2011-01-01
This report provides digital geographic information systems (GIS) files of maps for each of the 24 plays considered in the U.S. Geological Survey (USGS) 2010 updated petroleum resource assessment of the National Petroleum Reserve in Alaska (NPRA) (Houseknecht and others, 2010). These are the same plays evaluated in a previous USGS assessment of the NPRA (Bird and Houseknecht, 2002a), maps of which were released in pdf format (Bird and Houseknecht, 2002b). The 2010 updated assessment of the NPRA evaluated each of the previously used 24 plays based on new geologic data available from exploration activities and scientific research. Quantitative assessments were revised for 11 plays, and no revisions were made for 9 plays. Estimates of the volume of technically recoverable, undiscovered oil and nonassociated gas resources in these 20 plays are reported elsewhere (Houseknecht and others, 2010). Four plays quantitatively assessed in 2002 were eliminated from quantitative assessment for reasons explained by Houseknecht and others (2010). The NPRA assessment study area includes Federal and native onshore land and adjacent State offshore areas. A map showing the areal extent of each play was prepared by USGS geologists as a preliminary step in the assessment process. Boundaries were drawn on the basis of a variety of information, including seismic reflection data, results of exploration drilling, and regional patterns of rock properties. Play boundary polygons were captured by digitizing the play maps prepared by USGS geologists.
NASA Astrophysics Data System (ADS)
Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.; de Callafon, Raymond A.
2017-02-01
This paper presents a framework for structural health monitoring (SHM) and damage identification of civil structures. This framework integrates advanced mechanics-based nonlinear finite element (FE) modeling and analysis techniques with a batch Bayesian estimation approach to estimate time-invariant model parameters used in the FE model of the structure of interest. The framework uses input excitation and dynamic response of the structure and updates a nonlinear FE model of the structure to minimize the discrepancies between predicted and measured response time histories. The updated FE model can then be interrogated to detect, localize, classify, and quantify the state of damage and predict the remaining useful life of the structure. As opposed to recursive estimation methods, in the batch Bayesian estimation approach, the entire time history of the input excitation and output response of the structure are used as a batch of data to estimate the FE model parameters through a number of iterations. In the case of non-informative prior, the batch Bayesian method leads to an extended maximum likelihood (ML) estimation method to estimate jointly time-invariant model parameters and the measurement noise amplitude. The extended ML estimation problem is solved efficiently using a gradient-based interior-point optimization algorithm. Gradient-based optimization algorithms require the FE response sensitivities with respect to the model parameters to be identified. The FE response sensitivities are computed accurately and efficiently using the direct differentiation method (DDM). The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem by computing the exact Fisher Information matrix using the FE response sensitivities with respect to the model parameters. The accuracy of the proposed uncertainty quantification approach is verified using a sampling approach based on the unscented transformation. Two validation studies, based on realistic structural FE models of a bridge pier and a moment resisting steel frame, are performed to validate the performance and accuracy of the presented nonlinear FE model updating approach and demonstrate its application to SHM. These validation studies show the excellent performance of the proposed framework for SHM and damage identification even in the presence of high measurement noise and/or way-out initial estimates of the model parameters. Furthermore, the detrimental effects of the input measurement noise on the performance of the proposed framework are illustrated and quantified through one of the validation studies.
Vegetation, plant biomass, and net primary productivity patterns in the Canadian Arctic
NASA Astrophysics Data System (ADS)
Gould, W. A.; Raynolds, M.; Walker, D. A.
2003-01-01
We have developed maps of dominant vegetation types, plant functional types, percent vegetation cover, aboveground plant biomass, and above- and belowground annual net primary productivity for Canada north of the northern limit of trees. The area mapped covers 2.5 million km², including glaciers. Ice-free land covers 2.3 million km² and represents 42% of all ice-free land in the Circumpolar Arctic. The maps combine information on climate, soils, geology, hydrology, remotely sensed vegetation classifications, previous vegetation studies, and regional expertise to define polygons drawn using photo-interpretation of a 1:4,000,000 scale advanced very high resolution radiometer (AVHRR) color infrared image basemap. Polygons are linked to vegetation descriptions, associated properties, and descriptive literature through a series of lookup tables in a geographic information systems (GIS) database developed as a component of the Circumpolar Arctic Vegetation Map (CAVM) project. Polygons are classified into 20 landcover types, including 17 vegetation types. Half of the region is sparsely vegetated (<50% vegetation cover), primarily in the High Arctic (bioclimatic subzones A-C), whereas most (86%) of the estimated aboveground plant biomass (1.5 × 10^15 g) and 87% of the estimated above- and belowground annual net primary productivity (2.28 × 10^14 g yr^-1) are concentrated in the Low Arctic (subzones D and E). The maps present more explicit spatial patterns of vegetation and ecosystem attributes than have previously been available, the GIS database is useful in summarizing ecosystem properties and can be easily updated and integrated into circumpolar mapping efforts, and the derived estimates fall within the range of current published estimates.
Anita K. Rose
2015-01-01
This resource update provides an overview of forest resources in Virginia. Information for this factsheet was updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year, 20 percent of the sample plots (one panel) in Virginia are measured by field crews, the data compiled, and new estimates produced.
Net merit as a measure of lifetime profit: 2010 revision
USDA-ARS?s Scientific Manuscript database
The 2010 revision of net merit (NM$) updates a number of key economic values as well as milk utilization statistics. Members of Project S-1040, Genetic Selection and Crossbreeding To Enhance Reproduction and Survival of Dairy Cattle, provided updated incomes and expenses used to estimate lifetime pr...
Bounthavong, Mark; Li, Meng; Watanabe, Jonathan H
Previous estimates of the economic burden of Crohn's disease (CD) varied widely from $2.0 to $18.2 billion per year (adjusted to 2015 $US). However, these estimates do not reflect recent changes in pharmaceutical treatment options and guidelines. The goal of this study was to update cost estimates of Crohn's disease based on a representative sample of the US population from the most recent 11 years (2003-2013) of the Medical Expenditure Panel Survey (MEPS). A secondary aim described expenditure trends in respondents with and without Crohn's disease pre-post FDA approvals of new biologics and the American College of Gastroenterology Crohn's disease treatment guidelines. Average annual expenditures (total, prescription, inpatient, and outpatient) were evaluated using a pooled cross-sectional design. Respondent data from the most recent 11 years (2003-2013) of MEPS were analyzed. Two-part generalized linear models with power-link were used to estimate the average annual expenditures per patient adjusted to multiple covariates. Confidence intervals (CI) were estimated using bootstrap methods. Difference-in-differences estimations were performed to compare the changes in health care expenditures pre-post FDA approvals of new biologics and the American College of Gastroenterology Crohn's disease treatment guidelines. The annual aggregate economic burden of CD was $6.3 billion in the US. Respondents with CD had higher total (+$6442; 95% CI: $4864 to $8297), prescription (+$3283; 95% CI: $2289 to $4445), inpatient (+$1764; 95% CI: $748 to $3551), and outpatient (+$1191; 95% CI: $592 to $2160) expenditures compared to respondents without CD. In the difference-in-differences estimation, respondents with CD had significantly higher total (P = 0.001) and prescription (P < 0.001) expenditures compared with respondents without CD. Although inpatient and outpatient expenditures were higher in respondents with CD, they were not statistically significant. Respondents with CD diagnosis had higher expenditures compared to respondents without CD diagnosis from 2003 to 2013. This study captured the most recent availability of new treatment options and changes to treatment guidelines, while providing updated estimates of the economic burden of CD in the US. However, this research was unable to study the causes of these increased health care expenditures in respondents with CD. Future investigations will need to determine the causal factors for increased expenditures in CD. Copyright © 2016 Elsevier Inc. All rights reserved.
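The difference-in-differences logic referred to above can be illustrated on simulated data as below; the study itself used two-part generalized linear models with a power link and MEPS survey weights, none of which is reproduced in this sketch, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2016)

# Simulated pooled cross-sections: annual expenditures for respondents with
# and without a CD diagnosis, before and after a policy/guideline change.
n = 4000
cd = rng.binomial(1, 0.05, n)              # CD diagnosis indicator
post = rng.binomial(1, 0.5, n)             # post-period indicator
base = rng.gamma(2.0, 1500.0, n)           # skewed baseline spending
true_did = 4000.0                          # extra post-period CD spending
y = base + 5000 * cd + 800 * post + true_did * cd * post

# Difference-in-differences via OLS: y ~ 1 + CD + post + CD*post.
X = np.column_stack([np.ones(n), cd, post, cd * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"DiD estimate of the CD x post interaction ≈ ${beta[3]:,.0f} "
      f"(simulated truth ${true_did:,.0f})")

# Equivalently, from the four group means:
means = {(c, p): y[(cd == c) & (post == p)].mean() for c in (0, 1) for p in (0, 1)}
did = (means[1, 1] - means[1, 0]) - (means[0, 1] - means[0, 0])
print(f"group-means DiD ≈ ${did:,.0f}")
```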
Climate, orography and scale controls on flood frequency in Triveneto (Italy)
NASA Astrophysics Data System (ADS)
Persiano, Simone; Castellarin, Attilio; Salinas, Jose Luis; Domeneghetti, Alessio; Brath, Armando
2016-05-01
The growing concern about the possible effects of climate change on flood frequency regime is leading authorities to review previously proposed reference procedures for design-flood estimation, such as national flood frequency models. Our study focuses on Triveneto, a broad geographical region in North-eastern Italy. A reference procedure for design flood estimation in Triveneto is available from the Italian CNR research project "VA.PI.", which considered Triveneto as a single homogeneous region and developed a regional model using annual maximum series (AMS) of peak discharges that were collected up to the 1980s by the former Italian Hydrometeorological Service. We consider a very detailed AMS database that we recently compiled for 76 catchments located in Triveneto. All 76 study catchments are characterized in terms of several geomorphologic and climatic descriptors. The objective of our study is threefold: (1) to inspect climatic and scale controls on flood frequency regime; (2) to verify the possible presence of changes in flood frequency regime by looking at changes over time in the regional L-moments of annual maximum floods; (3) to develop an updated reference procedure for design flood estimation in Triveneto by using a focused-pooling approach (i.e. Region of Influence, RoI). Our study leads to the following conclusions: (1) climatic and scale controls on flood frequency regime in Triveneto are similar to the controls that were recently found in Europe; (2) a single year characterized by extreme floods can have a remarkable influence on regional flood frequency models and analyses for detecting possible changes in flood frequency regime; (3) no significant change was detected in the flood frequency regime, yet an update of the existing reference procedure for design flood estimation is highly recommended and we propose the RoI approach for properly representing climate and scale controls on flood frequency in Triveneto, which cannot be regarded as a single homogeneous region.
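The regional L-moment statistics used to look for changes in the flood frequency regime are typically computed from probability-weighted moments; a short sketch with a synthetic annual-maximum series (not the Triveneto data) is shown below.

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments and the L-CV / L-skewness ratios,
    computed from the standard unbiased probability-weighted moments.
    These are the site statistics typically pooled in regional analyses."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3, l2 / l1, l3 / l2   # mean, L-scale, L-3, L-CV, L-skewness

# Example with a synthetic annual-maximum series (hypothetical values).
rng = np.random.default_rng(0)
ams = rng.gumbel(loc=100, scale=30, size=40)
l1, l2, l3, lcv, lsk = sample_l_moments(ams)
print(f"L-CV = {lcv:.3f}, L-skewness = {lsk:.3f}")
```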
Shen, Yi; Dai, Wei; Richards, Virginia M
2015-03-01
A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by examples of toolbox use. Finally, guidelines and recommendations for parameter configurations are given.
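The toolbox itself is MATLAB, but the core updated maximum-likelihood idea can be sketched in a few lines of Python: accumulate the log-likelihood of a logistic psychometric function over a grid of candidate thresholds, slopes, and lapse rates, and place each new trial using the current estimate. The grid ranges, guess rate, and simulated observer below are assumptions, and the real UML procedure places stimuli at likelihood-based sweet points rather than simply at the threshold estimate.

```python
# A simplified sketch of maximum-likelihood updating for a logistic
# psychometric function with threshold, slope, and lapse rate.
import numpy as np

alphas = np.linspace(-10, 10, 61)          # candidate thresholds
betas = np.geomspace(0.1, 10, 41)          # candidate slopes
lapses = np.linspace(0.0, 0.1, 11)         # candidate lapse rates
A, B, L = np.meshgrid(alphas, betas, lapses, indexing="ij")
loglik = np.zeros_like(A)                  # accumulated log-likelihood surface

def p_correct(x, a, b, lam, gamma=0.5):
    """Logistic psychometric function with guess rate gamma and lapse lam."""
    return gamma + (1 - gamma - lam) / (1 + np.exp(-b * (x - a)))

def update(x, response):
    """Add one trial (stimulus x, response 0/1) to the likelihood surface."""
    global loglik
    p = np.clip(p_correct(x, A, B, L), 1e-9, 1 - 1e-9)
    loglik += np.log(p if response else 1 - p)

def ml_estimate():
    i = np.unravel_index(np.argmax(loglik), loglik.shape)
    return alphas[i[0]], betas[i[1]], lapses[i[2]]

# Simulated observer with true threshold 2.0, slope 1.5, lapse 0.02.
rng = np.random.default_rng(1)
x = 0.0
for _ in range(100):
    resp = rng.random() < p_correct(x, 2.0, 1.5, 0.02)
    update(x, resp)
    x = ml_estimate()[0]                   # next trial at current threshold
print("ML estimate (threshold, slope, lapse):", ml_estimate())
```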
EAU Guidelines on Non-Muscle-invasive Urothelial Carcinoma of the Bladder: Update 2016.
Babjuk, Marko; Böhle, Andreas; Burger, Maximilian; Capoun, Otakar; Cohen, Daniel; Compérat, Eva M; Hernández, Virginia; Kaasinen, Eero; Palou, Joan; Rouprêt, Morgan; van Rhijn, Bas W G; Shariat, Shahrokh F; Soukup, Viktor; Sylvester, Richard J; Zigeuner, Richard
2017-03-01
The European Association of Urology (EAU) panel on Non-muscle-invasive Bladder Cancer (NMIBC) released an updated version of the guidelines on Non-muscle-invasive Bladder Cancer. To present the 2016 EAU guidelines on NMIBC. A broad and comprehensive scoping exercise covering all areas of the NMIBC guidelines published between April 1, 2014, and May 31, 2015, was performed. Databases covered by the search included Medline, Embase, and the Cochrane Libraries. Previous guidelines were updated, and levels of evidence and grades of recommendation were assigned. Tumours staged as TaT1 or carcinoma in situ (CIS) are grouped as NMIBC. Diagnosis depends on cystoscopy and histologic evaluation of the tissue obtained by transurethral resection of the bladder (TURB) in papillary tumours or by multiple bladder biopsies in CIS. In papillary lesions, a complete TURB is essential for the patient's prognosis. If the initial resection is incomplete, there is no muscle in the specimen, or a high-grade or T1 tumour is detected, a second TURB should be performed within 2-6 wk. The risks of both recurrence and progression may be estimated for individual patients using the European Organisation for Research and Treatment of Cancer (EORTC) scoring system and risk tables. The stratification of patients into low-, intermediate-, and high-risk groups is pivotal to recommending adjuvant treatment. For patients with a low-risk tumour and intermediate-risk patients at a lower risk of recurrence, one immediate instillation of chemotherapy is recommended. Patients with an intermediate-risk tumour should receive 1 yr of full-dose bacillus Calmette-Guérin (BCG) intravesical immunotherapy or instillations of chemotherapy for a maximum of 1 yr. In patients with high-risk tumours, full-dose intravesical BCG for 1-3 yr is indicated. In patients at highest risk of tumour progression, immediate radical cystectomy (RC) should be considered. RC is recommended in BCG-refractory tumours. The long version of the guidelines is available at the EAU Web site (www.uroweb.org/guidelines). These abridged EAU guidelines present updated information on the diagnosis and treatment of NMIBC for incorporation into clinical practice. The European Association of Urology has released updated guidelines on Non-muscle-invasive Bladder Cancer (NMIBC). Stratification of patients into low-, intermediate-, and high-risk groups is essential for decisions about adjuvant intravesical instillations. Risk tables can be used to estimate risks of recurrence and progression. Radical cystectomy should be considered only in case of failure of instillations or in NMIBC with the highest risk of progression. Copyright © 2016. Published by Elsevier B.V.
Costs And Savings Associated With Community Water Fluoridation In The United States.
O'Connell, Joan; Rockell, Jennifer; Ouellet, Judith; Tomar, Scott L; Maas, William
2016-12-01
The most comprehensive study of US community water fluoridation program benefits and costs was published in 2001. This study provides updated estimates using an economic model that includes recent data on program costs, dental caries increments, and dental treatments. In 2013 more than 211 million people had access to fluoridated water through community water systems serving 1,000 or more people. Savings associated with dental caries averted in 2013 as a result of fluoridation were estimated to be $32.19 per capita for this population. Based on 2013 estimated costs ($324 million), net savings (savings minus costs) from fluoridation systems were estimated to be $6,469 million and the estimated return on investment, 20.0. While communities should assess their specific costs for continuing or implementing a fluoridation program, these updated findings indicate that program savings are likely to exceed costs. Project HOPE—The People-to-People Health Foundation, Inc.
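The quoted figures can be cross-checked with back-of-the-envelope arithmetic; small discrepancies from the published $6,469 million and 20.0 arise from rounding in the per-capita savings and population values given in the abstract.

```python
# Back-of-the-envelope check of how the reported fluoridation figures relate.
# Numbers are the rounded values quoted in the abstract, so the results differ
# slightly from the published $6,469 million and 20.0.
population = 211e6            # people served by fluoridated systems, 2013
savings_per_capita = 32.19    # dollars of averted caries treatment per person
program_cost = 324e6          # estimated 2013 program cost, dollars

gross_savings = population * savings_per_capita
net_savings = gross_savings - program_cost
roi = net_savings / program_cost          # net return per dollar spent

print(f"gross savings: ${gross_savings/1e6:,.0f} M")   # ~ $6,792 M
print(f"net savings:   ${net_savings/1e6:,.0f} M")     # ~ $6,468 M
print(f"ROI:           {roi:.1f}")                      # ~ 20.0
```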
Global and Hemispheric Annual Temperature Variations Between 1854 and 1991 (revised 1994) (NDP-022)
Jones, P. D. [University of East Anglia, Norwich, United Kingdom; Wigley, T. M. L. [University of East Anglia, Norwich, United Kingdom; Wright, P. B. [University of East Anglia, Norwich, United Kingdom
1994-01-01
This data set contains estimates of global and hemispheric annual temperature variations, relative to a 1950 through 1979 reference period, for 1861 through 1991. The estimates are based on corrected land and ocean data. Land data were derived from meteorological data and fixed-position weather-ship data that were corrected for nonclimatic errors, such as station shifts and/or instrument changes. The marine data used were those in the Comprehensive Ocean-Atmosphere Data Set (COADS) compilation, which, with updates, covers the period through 1986. Updates to 1991 were made with hemispheric sea-surface temperature estimates produced by the U.K. Meteorological Office. Each record includes the year and six annual temperature variations: one estimate each for the globe, the Northern Hemisphere, and the Southern Hemisphere, plus a corresponding set of three estimates adjusted to account for the influence of El Niño/Southern Oscillation events. The data are in one file of 13 kB.
77 FR 31378 - Secretarial Commission on Indian Trust Administration and Reform
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-25
... robin session to share perspectives regarding any of the previous panel discussion questions [cir] Do... round robin Review agreements and action items for the day Tuesday, June 12, 2012 Status update on BIA..., including updates to subcommittee charge Public round robin session to share perspectives regarding any of...
Discourse Updating after Reading a Counterfactual Event
ERIC Educational Resources Information Center
de Vega, Manuel; Urrutia, Mabel
2012-01-01
This paper explores the temporal course of discourse updating after reading counterfactual events. To test the accessibility to discourse information, readers were asked to identify probes related to initial events in the text, previous to the counterfactual, or probes related to the critical counterfactual events. Experiment 1 showed that 500 ms…
DOT National Transportation Integrated Search
1984-01-01
This report updated previous reports on the impact of raising and lowering the legal drinking age. The legal drinking age for beer in Virginia was lowered from 21 to 18 years in 1974. The percentage of all crashes that were alcohol-related increased ...
Updated Meta-Analysis of Learner Control within Educational Technology
ERIC Educational Resources Information Center
Karich, Abbey C.; Burns, Matthew K.; Maki, Kathrin E.
2014-01-01
Giving a student control over their learning has theoretical and intuitive appeal, but its effects are neither powerful nor consistent in the empirical literature base. This meta-analysis updated previous meta-analytic research by Niemiec, Sikorski, and Walberg by studying the overall effectiveness of providing learner control within educational…
Updating estimates of low streamflow statistics to account for possible trends
NASA Astrophysics Data System (ADS)
Blum, A. G.; Archfield, S. A.; Hirsch, R. M.; Vogel, R. M.; Kiang, J. E.; Dudley, R. W.
2017-12-01
Given evidence of both increasing and decreasing trends in low flows in many streams, methods are needed to update estimators of low-flow statistics used in water resources management. One such metric is the 10-year annual low-flow statistic (7Q10), calculated as the annual minimum seven-day streamflow that is exceeded in nine out of ten years on average. Historical streamflow records may not be representative of current conditions at a site if environmental conditions are changing. We present a new approach to frequency estimation under nonstationary conditions that applies a stationary nonparametric quantile estimator to a subset of the annual minimum flow record. Monte Carlo simulation experiments were used to evaluate this approach across a range of trend and no-trend scenarios. Relative to the standard practice of using the entire available streamflow record, use of a nonparametric quantile estimator combined with selection of the most recent 30 or 50 years for 7Q10 estimation was found to improve accuracy and reduce bias. Benefits of the data subset selection approaches were greater for higher-magnitude trends and for annual minimum flow records with lower coefficients of variation. A nonparametric trend test approach for subset selection did not significantly improve upon always selecting the last 30 years of record. At 174 stream gages in the Chesapeake Bay region, 7Q10 estimators based on the most recent 30 years of flow record were compared to estimators based on the entire period of record. Given the availability of long records of low streamflow, using only a subset of the flow record (approximately 30 years) can update 7Q10 estimators to better reflect current streamflow conditions.
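A minimal sketch of the subset-based idea is shown below, assuming a daily streamflow series is available. The synthetic record and the interpolation-based quantile call are illustrative choices and not necessarily the specific nonparametric estimator evaluated in the study.

```python
# A sketch of the subset-based approach: compute annual minimum 7-day mean
# flows from a daily record, keep the most recent 30 years, and take a
# nonparametric 10th-percentile estimate as the 7Q10.
import numpy as np
import pandas as pd

def annual_min_7day(daily: pd.Series) -> pd.Series:
    """Annual minima of the 7-day moving-average flow (daily is date-indexed)."""
    q7 = daily.rolling(window=7).mean()
    return q7.groupby(daily.index.year).min().dropna()

def seven_q_ten(ann_min: pd.Series, years: int = 30) -> float:
    """Nonparametric 7Q10 from the most recent `years` annual minima."""
    recent = ann_min.iloc[-years:]
    return float(np.quantile(recent, 0.10))

# Synthetic 60-year daily record with a gentle downward trend in low flows.
idx = pd.date_range("1958-01-01", "2017-12-31", freq="D")
rng = np.random.default_rng(42)
trend = np.linspace(0, -2.0, len(idx))
daily = pd.Series(10 + trend + rng.gamma(2.0, 2.0, len(idx)), index=idx)

ann = annual_min_7day(daily)
print("7Q10, full record:     ", round(seven_q_ten(ann, years=len(ann)), 2))
print("7Q10, most recent 30 y:", round(seven_q_ten(ann), 2))
```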
Age-specific survival of male golden-cheeked warblers on the Fort Hood Military Reservation, Texas
Duarte, Adam; Hines, James E.; Nichols, James D.; Hatfield, Jeffrey S.; Weckerly, Floyd W.
2014-01-01
Population models are essential components of large-scale conservation and management plans for the federally endangered Golden-cheeked Warbler (Setophaga chrysoparia; hereafter GCWA). However, existing models are based on vital rate estimates calculated using relatively small data sets that are now more than a decade old. We estimated more current, precise adult and juvenile apparent survival (Φ) probabilities and their associated variances for male GCWAs. In addition to providing estimates for use in population modeling, we tested hypotheses about spatial and temporal variation in Φ. We assessed whether a linear trend in Φ or a change in the overall mean Φ corresponded to an observed increase in GCWA abundance during 1992-2000 and if Φ varied among study plots. To accomplish these objectives, we analyzed long-term GCWA capture-resight data from 1992 through 2011, collected across seven study plots on the Fort Hood Military Reservation using a Cormack-Jolly-Seber model structure within program MARK. We also estimated Φ process and sampling variances using a variance-components approach. Our results did not provide evidence of site-specific variation in adult Φ on the installation. Because of a lack of data, we could not assess whether juvenile Φ varied spatially. We did not detect a strong temporal association between GCWA abundance and Φ. Mean estimates of Φ for adult and juvenile male GCWAs for all years analyzed were 0.47 with a process variance of 0.0120 and a sampling variance of 0.0113 and 0.28 with a process variance of 0.0076 and a sampling variance of 0.0149, respectively. Although juvenile Φ did not differ greatly from previous estimates, our adult Φ estimate suggests previous GCWA population models were overly optimistic with respect to adult survival. These updated Φ probabilities and their associated variances will be incorporated into new population models to assist with GCWA conservation decision making.
Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frothingham, David; Barker, Michelle; Buechi, Steve
2013-07-01
Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)
The Propulsive Small Expendable Deployer System (ProSEDS)
NASA Technical Reports Server (NTRS)
Lorenzini, Enrico C.; Estes, Robert D.; Cosmo, Mario L.
2001-01-01
This is the Annual Report #2 entitled "The Propulsive Small Expendable Deployer System (ProSEDS)" prepared by the Smithsonian Astrophysical Observatory for NASA Marshall Space Flight Center. This report covers the period of activity from 1 August 2000 through 30 July 2001. The topics include: 1) Updated System Performance; 2) Mission Analysis; 3) Updated Dynamics Reference Mission; 4) Updated Deployment Control Profiles and Simulations; 5) Comparison of ED tethers and electrical thrusters; 6) Kalman filters for mission estimation; and 7) Delivery of interactive software for ED tethers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalantari, F; Wang, J; Li, T
2015-06-15
Purpose: In conventional 4D-PET, images from different frames are reconstructed individually and aligned by registration methods. Two issues with these approaches are: 1) reconstruction algorithms do not make full use of all projection statistics; and 2) image registration between noisy images can result in poor alignment. In this study we investigated the use of the simultaneous motion estimation and image reconstruction (SMEIR) method, originally developed for cone beam CT, for motion estimation/correction in 4D-PET. Methods: A modified ordered-subset expectation maximization algorithm coupled with total variation minimization (OSEM-TV) is used to obtain a primary motion-compensated PET (pmc-PET) from all projection data, using Demons-derived deformation vector fields (DVFs) as the initial estimate. A motion model update is performed to obtain an optimal set of DVFs between the pmc-PET and the other phases by matching the forward projection of the deformed pmc-PET with the measured projections of the other phases. Using the updated DVFs, OSEM-TV image reconstruction is repeated and new DVFs are estimated based on the updated images. A 4D XCAT phantom with typical FDG biodistribution and a 10-mm-diameter tumor was used to evaluate the performance of the SMEIR algorithm. Results: Image quality of 4D-PET is greatly improved by the SMEIR algorithm. When all projections are used to reconstruct a 3D-PET, motion blurring artifacts are present, leading to more than a 5-fold overestimation of the tumor size and a 54% underestimation of the tumor-to-lung contrast ratio. This error was reduced to 37% for post-reconstruction registration methods and 20% for SMEIR. Conclusion: The SMEIR method can be used for motion estimation/correction in 4D-PET. The statistics are greatly improved since all projection data are combined to update the image. The performance of the SMEIR algorithm for 4D-PET is sensitive to the smoothness control parameters in the DVF estimation step.
The advantage of flexible neuronal tunings in neural network models for motor learning
Marongelli, Ellisha N.; Thoroughman, Kurt A.
2013-01-01
Human motor adaptation to novel environments is often modeled by a basis function network that transforms desired movement properties into estimated forces. This network employs a layer of nodes that have fixed broad tunings that generalize across the input domain. Learning is achieved by updating the weights of these nodes in response to training experience. This conventional model is unable to account for rapid flexibility observed in human spatial generalization during motor adaptation. However, added plasticity in the widths of the basis function tunings can achieve this flexibility, and several neurophysiological experiments have revealed flexibility in tunings of sensorimotor neurons. We found a model, Locally Weighted Projection Regression (LWPR), which uniquely possesses the structure of a basis function network in which both the weights and tuning widths of the nodes are updated incrementally during adaptation. We presented this LWPR model with training functions of different spatial complexities and monitored incremental updates to receptive field widths. An inverse pattern of dependence of receptive field adaptation on experienced error became evident, underlying both a relationship between generalization and complexity, and a unique behavior in which generalization always narrows after a sudden switch in environmental complexity. These results implicate a model that is flexible in both basis function widths and weights, like LWPR, as a viable alternative model for human motor adaptation that can account for previously observed plasticity in spatial generalization. This theory can be tested by using the behaviors observed in our experiments as novel hypotheses in human studies. PMID:23888141
NASA Astrophysics Data System (ADS)
Duluc, Matthieu; Bardelay, Aurélie; Celik, Cihangir; Heinrichs, Dave; Hopper, Calvin; Jones, Richard; Kim, Soon; Miller, Thomas; Troisne, Marc; Wilson, Chris
2017-09-01
AWE (UK), IRSN (France), LLNL (USA) and ORNL (USA) began a long term collaboration effort in 2015 to update the nuclear criticality Slide Rule for the emergency response to a nuclear criticality accident. This document, published almost 20 years ago, gives order of magnitude estimates of key parameters, such as number of fissions and doses (neutron and gamma), useful for emergency response teams and public authorities. This paper will present, firstly the motivation and the long term objectives for this update, then the overview of the initial configurations for updated calculations and preliminary results obtained with modern 3D codes.
Estimating Uncertainty in Annual Forest Inventory Estimates
Ronald E. McRoberts; Veronica C. Lessard
1999-01-01
The precision of annual forest inventory estimates may be negatively affected by uncertainty from a variety of sources including: (1) sampling error; (2) procedures for updating plots not measured in the current year; and (3) measurement errors. The impact of these sources of uncertainty on final inventory estimates is investigated using Monte Carlo simulation...
Autonomous reinforcement learning with experience replay.
Wawrzyński, Paweł; Tanwani, Ajay Kumar
2013-05-01
This paper considers the issues of efficiency and autonomy that are required to make reinforcement learning suitable for real-life control tasks. A real-time reinforcement learning algorithm is presented that repeatedly adjusts the control policy with the use of previously collected samples, and autonomously estimates the appropriate step-sizes for the learning updates. The algorithm is based on the actor-critic with experience replay whose step-sizes are determined on-line by an enhanced fixed point algorithm for on-line neural network training. An experimental study with simulated octopus arm and half-cheetah demonstrates the feasibility of the proposed algorithm to solve difficult learning control problems in an autonomous way within reasonably short time. Copyright © 2012 Elsevier Ltd. All rights reserved.
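The replay mechanism at the heart of the algorithm can be sketched compactly. The buffer, toy chain environment, tabular critic, and fixed step-size below are illustrative assumptions; the paper's method is an actor-critic whose step-sizes are estimated online.

```python
# A minimal sketch of reusing previously collected samples via experience
# replay for a TD(0) critic update. Fixed step-size and a tabular critic are
# used purely to illustrate the replay mechanism.
import random
from collections import deque

buffer = deque(maxlen=10_000)           # stored (state, reward, next_state) tuples
V = [0.0] * 10                          # tabular value estimates
GAMMA, ALPHA = 0.95, 0.05

def store(transition):
    buffer.append(transition)

def replay_update(batch_size=32):
    """Adjust the critic using a random batch of stored transitions."""
    batch = random.sample(list(buffer), min(batch_size, len(buffer)))
    for s, r, s_next in batch:
        td_error = r + GAMMA * V[s_next] - V[s]
        V[s] += ALPHA * td_error

# Toy chain environment: reward 1 only when reaching the terminal state 9.
rng = random.Random(0)
for _ in range(500):
    s = 0
    while s < 9:
        s_next = min(s + rng.choice([1, 2]), 9)
        store((s, 1.0 if s_next == 9 else 0.0, s_next))
        s = s_next
    replay_update()
print("value estimates:", [round(v, 2) for v in V])
```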
A new Bayesian recursive technique for parameter estimation
NASA Astrophysics Data System (ADS)
Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis
2006-08-01
The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper on two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are actually the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimum training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
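The iterative bound-narrowing idea can be illustrated with a toy example: sample parameter sets within the current bounds, score them against observations, and shrink the bounds to enclose the best-performing subset. The quadratic test model, sample sizes, and top-10% retention rule below are arbitrary choices for illustration, not the LOBARE implementation.

```python
# A toy illustration of the bound-narrowing idea behind LOBARE: repeatedly
# sample parameter sets within the current bounds, score them, and shrink the
# bounds to enclose the best-performing subset.
import numpy as np

rng = np.random.default_rng(7)
true_params = np.array([2.5, -1.0, 0.3])
x = np.linspace(0, 1, 50)
obs = true_params[0] * x**2 + true_params[1] * x + true_params[2]
obs += rng.normal(0, 0.05, x.size)

def rmse(p):
    sim = p[0] * x**2 + p[1] * x + p[2]
    return np.sqrt(np.mean((sim - obs) ** 2))

lower = np.array([-5.0, -5.0, -5.0])
upper = np.array([5.0, 5.0, 5.0])

for it in range(8):
    samples = rng.uniform(lower, upper, size=(500, 3))
    scores = np.array([rmse(p) for p in samples])
    best = samples[np.argsort(scores)[:50]]       # keep the top 10% by fitness
    lower, upper = best.min(axis=0), best.max(axis=0)
    print(f"iteration {it}: best RMSE = {scores.min():.4f}")

print("final bounds midpoint:", (lower + upper) / 2)
print("true parameters:      ", true_params)
```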
Performance of Trajectory Models with Wind Uncertainty
NASA Technical Reports Server (NTRS)
Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.
2009-01-01
Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst the various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
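The time-lagged ensemble approach reduces to a simple spread calculation at each grid point and valid time, sketched below with made-up wind values standing in for actual RUC forecast grids.

```python
# A sketch of the time-lagged ensemble idea: several forecasts issued at
# different hours but valid at the same time are treated as an ensemble, and
# their spread (standard deviation) serves as the wind uncertainty estimate.
import numpy as np

# u- and v-wind forecasts (m/s) at one grid point and valid time, from
# forecasts issued 1 through 6 hours earlier (values are illustrative).
lagged_u = np.array([12.4, 11.8, 13.1, 12.0, 10.9, 13.6])
lagged_v = np.array([-3.2, -2.8, -4.0, -3.5, -2.5, -4.3])

u_best, v_best = lagged_u[0], lagged_v[0]        # most recent forecast as the estimate
u_sigma, v_sigma = lagged_u.std(ddof=1), lagged_v.std(ddof=1)

print(f"wind estimate: u = {u_best:.1f} +/- {u_sigma:.1f} m/s, "
      f"v = {v_best:.1f} +/- {v_sigma:.1f} m/s")
```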
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris
2008-01-01
The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there are also considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.
Contreras-Gutiérrez, María Angélica; Vélez, Iván Darío; Porter, Charles; Uribe, Sandra Inés
2014-01-01
An updated list of phlebotomine sand fly species in coffee-growing areas of the Colombian Andean region is presented. Fifty-three species were reported from 12 departments. In addition, species distribution in the region was derived from specimens obtained during intensive field work in five departments, from previously published studies, and from the taxonomic revision of specimens in the entomological collection of the Programa de Estudio y Control de Enfermedades Tropicales (PECET). The list includes the genera Brumptomyia (2 species), Lutzomyia (50 species) and Warileya (1 species). The updated list contains eleven new records in the region under study, including Lutzomyia panamensis, a species of medical importance not recorded previously in this zone. Eighteen of the species are considered to be anthropophilic, and many of them have been implicated in the transmission of leishmaniasis.
Update of membership and mean proper motion of open clusters from UCAC5 catalog
NASA Astrophysics Data System (ADS)
Dias, W. S.; Monteiro, H.; Assafin, M.
2018-06-01
We present mean proper motions and membership probabilities of individual stars for optically visible open clusters, which have been determined using data from the UCAC5 catalog. This follows our previous studies with the UCAC2 and UCAC4 catalogs, but now uses improved proper motions in the GAIA reference frame. In the present study results were obtained for a sample of 1108 open clusters. For five clusters, this is the first determination of the mean proper motion, and for the whole sample we present results with a much larger number of identified astrometric member stars than in previous studies. This is the last update of our Open Cluster Catalog based on proper motion data only. Future updates will rely on astrometric, photometric and spectroscopic GAIA data as input for the analyses.
DOT National Transportation Integrated Search
2002-01-01
The Illinois Department of Transportation's Highway Construction Manual has been updated from its previous version. All previous editions of this manual are now obsolete and changes in content have been marked. The Construction Manual provides inform...
David E. Haugen
2014-01-01
This resource update provides an overview of forest resources in North Dakota based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the North Dakota Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Mark D. Nelson; Matt Brewer
2014-01-01
This resource update provides an overview of forest resources in Iowa based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Iowa Department of Natural Resources. Estimates are based on field data collected using the FIA annualized sample design and are updated...
D.M. Meneguzzo
2014-01-01
This resource update provides an overview of forest resource attributes for Nebraska based on annual inventories conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS) of the U.S. Forest Service. The estimates presented in this update are based on field data collected in 2009-2013 with comparisons made to data collected from...
D.M. Meneguzzo; B.J. Butler
2014-01-01
This resource update provides an overview of forest resource attributes for Kansas based on annual inventories conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station (NRS) of the U.S. Forest Service. The estimates presented in this update are based on field data collected in 2009-2013 with comparisons made to data collected from...
Scott A. Pugh
2015-01-01
This resource update provides an overview of forest resources in Michigan based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The annual inventory started in 1999. For the 2014...
S. Lambert; J.T. Vogt.; J. Cooper
2015-01-01
This resource update provides an overview of forest resources in Oklahoma based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station, in cooperation with Oklahoma Forestry Services (OFS). Estimates are based on field data collected using the FIA annualized sample design and are updated yearly...
A review of methods for updating forest monitoring system estimates
Hector Franco-Lopez; Alan R. Ek; Andrew P. Robinson
2000-01-01
Intensifying interest in forests and the development of new monitoring technologies have induced major changes in forest monitoring systems in the last few years, including major revisions in the methods used for updating. This paper describes the methods available for projecting stand- and plot-level information, emphasizing advantages and disadvantages, and the...
Mark D. Nelson; Tivon E. Feeley
2018-01-01
This resource update provides an overview of forest resources in Iowa based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Iowa Department of Natural Resources. Estimates are based on field data collected using the FIA annualized sample design and are updated...
T.J. Brandeis
2015-01-01
This resource update provides an overview of forest resources in Georgia based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Georgia Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly....
Ronald J. Piva; Thomas B. Treiman
2016-01-01
This resource update provides an overview of forest resources in Missouri based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Missouri Department of Conservation. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Richard H. Widmann
2016-01-01
This resource update provides an overview of the forest resources in Pennsylvania based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station (NRS). Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information...
Scott A. Pugh; Charles Paulson; Brett J. Butler
2016-01-01
This resource update provides an overview of forest resources in Michigan based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The annual inventory started in 1999. For the 2015...
Richard H. Widmann
2016-01-01
This resource update provides an overview of the forest resources in Ohio based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about the...
Charles Paulson; Scott A. Pugh
2017-01-01
This resource update provides an overview of forest resources in Michigan based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The annual inventory started in 1999. For the 2016...
David E. Haugen
2017-01-01
This resource update provides an overview of forest resources in North Dakota based on an inventory conducted by the USDA Forest Service, Forest Inventory and Analysis (FIA) program within the Northern Research Station in cooperation with the North Dakota Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Charles S. Paulson
2018-01-01
This resource update provides an overview of forest resources in North Dakota based on an inventory conducted by the USDA Forest Service, Forest Inventory and Analysis (FIA) program within the Northern Research Station in cooperation with the North Dakota Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Thomas A. Albright
2017-01-01
This resource update provides an overview of the forest resources in Pennsylvania based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about the national and regional...
Thomas C. Goff
2018-01-01
This resource update provides an overview of forest resources in Missouri based on an inventory conducted by the USDA Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Missouri Department of Conservation. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Scott A. Pugh
2018-01-01
This resource update provides an overview of forest resources in Michigan based on inventories conducted by the USDA Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. The annual inventory started in 1999. For the 2017...
Ronald J. Piva; Thomas B. Treiman
2017-01-01
This resource update provides an overview of forest resources in Missouri based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Missouri Department of Conservation. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Mark D. Nelson; Matt Brewer; Dacia M. Meneguzzo; Kathryne. Clark
2016-01-01
This resource update provides an overview of forest resources in Iowa based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Iowa Department of Natural Resources. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Andy Hartsell
2016-01-01
This resource update provides an overview of forest resources in Alabama based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Alabama Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly....
Thomas A. Albright
2017-01-01
This resource update provides an overview of the forest resources in Ohio based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about the national and regional FIA...
S.N. Oswalt
2014-01-01
This resource update provides an overview of forest resources in Louisiana based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Mississippi Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Mark D. Nelson; Matt Brewer; Brett J. Butler; Scott A. Pugh
2015-01-01
This resource update provides an overview of forest resources in Iowa based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Iowa Department of Natural Resources. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Richard H. Widmann
2015-01-01
This resource update provides an overview of the forest resources in Pennsylvania based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about...
S. Lambert; K. Randolph; J. Cooper
2015-01-01
This resource update provides an overview of forest resources in Oklahoma based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station, in cooperation with Oklahoma Forestry Services. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly,...
Dacia M. Meneguzzo; Susan J. Crocker
2015-01-01
This resource update provides an overview of forest resource attributes for Nebraska based on annual inventories conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS), U.S. Forest Service. The estimates presented in this update are based on field data collected in 2010-2014 with comparisons made to data collected from 2005-...
Andy Hartsell
2016-01-01
This resource update provides an overview of forest resources in Alabama based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Alabama Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly....
Richard H. Widmann
2015-01-01
This resource update provides an overview of the forest resources in Ohio based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about the...
D.E. Haugen; S.A. Pugh
2014-01-01
This resource update provides an overview of forest resources in North Dakota based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the North Dakota Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Thomas Brandeis; Andy Hartsell
2015-01-01
This resource update provides an overview of forest resources in Georgia based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Georgia Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly....
D.M. Meneguzzo; S.J. Crocker
2015-01-01
This resource update provides an overview of forest resource attributes for Kansas based on annual inventories conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station (NRS) of the U.S. Forest Service. The estimates presented in this update are based on field data collected in 2010-2014 with comparisons made to data collected from...
Model-Based Reinforcement Learning under Concurrent Schedules of Reinforcement in Rodents
ERIC Educational Resources Information Center
Huh, Namjung; Jo, Suhyun; Kim, Hoseok; Sul, Jung Hoon; Jung, Min Whan
2009-01-01
Reinforcement learning theories postulate that actions are chosen to maximize a long-term sum of positive outcomes based on value functions, which are subjective estimates of future rewards. In simple reinforcement learning algorithms, value functions are updated only by trial-and-error, whereas they are updated according to the decision-maker's…
M.J. Brown; Jarek. Nowak
2014-01-01
This periodic resource update provides an overview of forest resources in Florida based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Florida Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
Mark D. Nelson; Tivon E. Feeley; Cassandra M. Kurtz
2017-01-01
This resource update provides an overview of forest resources in Iowa based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Northern Research Station in cooperation with the Iowa Department of Natural Resources. Estimates are based on field data collected using the FIA annualized sample design and are updated...
S. Lambert; J.A. Cooper
2014-01-01
This resource update provides an overview of forest resources in Oklahoma based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station, in cooperation with Oklahoma Forestry Services (OFS). Estimates are based on field data collected using the FIA annualized sample design and are updated yearly...
Mark Brown; J. Nowak
2016-01-01
This periodic resource update provides an overview of forest resources in Florida based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Florida Forest Service. Estimates are based on field data collected using the FIA annualized sample design and are updated...
A. Hartsell
2017-01-01
This resource update provides an overview of forest resources in Alabama based on an inventory conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program at the Southern Research Station in cooperation with the Alabama Forestry Commission. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly....
Anita Rose
2016-01-01
This resource update provides an overview of forest resources in Virginia. Information for this factsheet was updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year, 20 percent of the sample plots (one panel) in Virginia are measured by field crews, the data compiled, and new estimates produced. After 5 years of measurements,...
Tracking with time-delayed data in multisensor systems
NASA Astrophysics Data System (ADS)
Hilton, Richard D.; Martin, David A.; Blair, William D.
1993-08-01
When techniques for target tracking are expanded to make use of multiple sensors in a multiplatform system, the possibility of time-delayed data becomes a reality. When a discrete-time Kalman filter is applied and some of the data entering the filter are delayed, proper processing of these late data is a necessity for obtaining an optimal estimate of a target's state. If this problem is not given special care, the quality of the state estimates can be degraded relative to the quality provided by a single sensor. A negative-time update technique is developed using the criterion of minimum mean-square error (MMSE) under the constraint that only the results of the most recent update are saved. The performance of the MMSE technique is compared to that of the ad hoc approach employed in the Cooperative Engagement Capability (CEC) system for processing data from multiple platforms. It was found that the MMSE technique is a stable solution to the negative-time update problem, while the CEC technique was found to be less than desirable when used with filters designed for tracking highly maneuvering targets at relatively low data rates. The MMSE negative-time update technique was found to be a superior alternative to the existing CEC negative-time update technique.
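For context, a simpler but more memory-hungry way to handle late data is to buffer recent filter checkpoints and measurements and re-run the filter forward once the delayed report is inserted in time order. The sketch below shows that alternative for a 1-D constant-velocity model; it is not the MMSE negative-time update developed in the paper, which is constrained to store only the most recent update.

```python
# Re-filtering from a saved checkpoint as a simple way to absorb a delayed
# measurement. 1-D constant-velocity Kalman filter for illustration only.
import numpy as np

dt = 1.0
F = np.array([[1, dt], [0, 1]])                 # state transition
H = np.array([[1.0, 0.0]])                      # position measurement
Q = np.diag([0.05, 0.05])                       # process noise
R = np.array([[1.0]])                           # measurement noise

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

def refilter(checkpoint, measurements):
    """Re-run the filter from a saved (x, P) over time-ordered measurements."""
    x, P = checkpoint
    for z in measurements:
        x, P = predict(x, P)
        x, P = update(x, P, z)
    return x, P

# Usage: when a late measurement arrives, insert it into the buffered
# measurement list in time order and re-filter from the checkpoint saved
# just before its timestamp.
checkpoint = (np.array([0.0, 1.0]), np.eye(2))
ordered_measurements = [np.array([1.1]), np.array([1.9]), np.array([3.2])]
x, P = refilter(checkpoint, ordered_measurements)
print("state after reprocessing:", x)
```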
An Ensemble-Based Smoother with Retrospectively Updated Weights for Highly Nonlinear Systems
NASA Technical Reports Server (NTRS)
Chin, T. M.; Turmon, M. J.; Jewell, J. B.; Ghil, M.
2006-01-01
Monte Carlo computational methods have been introduced into data assimilation for nonlinear systems in order to alleviate the computational burden of updating and propagating the full probability distribution. By propagating an ensemble of representative states, algorithms like the ensemble Kalman filter (EnKF) and the resampled particle filter (RPF) rely on the existing modeling infrastructure to approximate the distribution based on the evolution of this ensemble. This work presents an ensemble-based smoother that is applicable to the Monte Carlo filtering schemes like EnKF and RPF. At the minor cost of retrospectively updating a set of weights for ensemble members, this smoother has demonstrated superior capabilities in state tracking for two highly nonlinear problems: the double-well potential and trivariate Lorenz systems. The algorithm does not require retrospective adaptation of the ensemble members themselves, and it is thus suited to a streaming operational mode. The accuracy of the proposed backward-update scheme in estimating non-Gaussian distributions is evaluated by comparison to the more accurate estimates provided by a Markov chain Monte Carlo algorithm.
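The retrospective weighting can be sketched for a toy random-walk ensemble: each member keeps its whole trajectory and a scalar weight, the weight is multiplied by the member's likelihood whenever a new observation arrives, and smoothed estimates at earlier times are weighted averages of the stored member states. The model, noise levels, and Gaussian likelihood below are assumptions, and only the weight-update idea is shown, not the full smoother.

```python
# A sketch of retrospectively updated ensemble weights for smoothing: weights
# are multiplied by each member's observation likelihood and renormalized,
# and past states are re-estimated as weighted averages of stored members.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_steps, obs_sigma = 200, 30, 0.5

# Ensemble trajectories from a toy random-walk model.
traj = np.zeros((n_members, n_steps))
traj[:, 0] = rng.normal(0, 1, n_members)
for t in range(1, n_steps):
    traj[:, t] = traj[:, t - 1] + rng.normal(0, 0.3, n_members)

weights = np.full(n_members, 1.0 / n_members)
truth = np.cumsum(rng.normal(0, 0.3, n_steps))
for t in range(n_steps):
    y = truth[t] + rng.normal(0, obs_sigma)                 # new observation
    lik = np.exp(-0.5 * ((y - traj[:, t]) / obs_sigma) ** 2)
    weights *= lik
    weights /= weights.sum()                                # renormalize

smoothed = weights @ traj              # weighted mean at every past time
print("smoothed estimate at t=0:", round(float(smoothed[0]), 3))
print("truth at t=0:            ", round(float(truth[0]), 3))
```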
Nonstationary multivariate modeling of cerebral autoregulation during hypercapnia.
Kostoglou, Kyriaki; Debert, Chantel T; Poulin, Marc J; Mitsis, Georgios D
2014-05-01
We examined the time-varying characteristics of cerebral autoregulation and hemodynamics during a step hypercapnic stimulus by using recursively estimated multivariate (two-input) models which quantify the dynamic effects of mean arterial blood pressure (ABP) and end-tidal CO2 tension (PETCO2) on middle cerebral artery blood flow velocity (CBFV). Beat-to-beat values of ABP and CBFV, as well as breath-to-breath values of PETCO2 during baseline and sustained euoxic hypercapnia were obtained in 8 female subjects. The multiple-input, single-output models used were based on the Laguerre expansion technique, and their parameters were updated using recursive least squares with multiple forgetting factors. The results reveal the presence of nonstationarities that confirm previously reported effects of hypercapnia on autoregulation, i.e. a decrease in the MABP phase lead, and suggest that the incorporation of PETCO2 as an additional model input yields less time-varying estimates of dynamic pressure autoregulation obtained from single-input (ABP-CBFV) models. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
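The recursive machinery is standard recursive least squares with forgetting. The sketch below uses a single forgetting factor and raw regressors; the study expanded the inputs on a Laguerre basis and used multiple forgetting factors, which is not reproduced here.

```python
# A sketch of recursive least squares (RLS) with a single forgetting factor
# for tracking time-varying coefficients of a two-input linear model.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step: regressor phi, observation y, forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)          # gain vector
    theta = theta + k * (y - phi @ theta)        # coefficient update
    P = (P - np.outer(k, phi) @ P) / lam         # covariance update
    return theta, P

rng = np.random.default_rng(5)
n = 400
theta = np.zeros(2)
P = np.eye(2) * 100.0
estimates = []

for t in range(n):
    true_coeffs = np.array([1.0 + 0.5 * (t > n // 2), -0.5])  # abrupt change
    phi = rng.normal(size=2)                                   # two inputs
    y = phi @ true_coeffs + rng.normal(0, 0.1)
    theta, P = rls_update(theta, P, phi, y)
    estimates.append(theta.copy())

print("estimate before change:", np.round(estimates[n // 2 - 1], 2))
print("estimate at end:       ", np.round(estimates[-1], 2))
```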
Agricultural residue availability in the United States.
Haq, Zia; Easterly, James L
2006-01-01
The National Energy Modeling System (NEMS) is used by the Energy Information Administration (EIA) to forecast US energy production, consumption, and price trends for a 25-yr-time horizon. Biomass is one of the technologies within NEMS, which plays a key role in several scenarios. An endogenously determined biomass supply schedule is used to derive the price-quantity relationship of biomass. There are four components to the NEMS biomass supply schedule including: agricultural residues, energy crops, forestry residues, and urban wood waste/mill residues. The EIA's Annual Energy Outlook 2005 includes updated estimates of the agricultural residue portion of the biomass supply schedule. The changes from previous agricultural residue supply estimates include: revised assumptions concerning corn stover and wheat straw residue availabilities, inclusion of non-corn and non-wheat agricultural residues (such as barley, rice straw, and sugarcane bagasse), and the implementation of assumptions concerning increases in no-till farming. This article will discuss the impact of these changes on the supply schedule.
Mortality of workers at the Hanford site: 1945-1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, E.S.; Petersen, G.R.; Buchanan, J.A.
1989-01-01
Analyses of mortality of workers at the Hanford Site were updated to include an additional three years of data (1979-81). Deaths occurring in the state of Washington in the years 1982-85 were also evaluated. Hanford workers continued to exhibit a strong healthy worker effect, with death rates substantially below those of the general U.S. population. Comparisons by level of radiation exposure within the Hanford worker population provided no evidence of a positive correlation of radiation exposure and mortality from all cancers combined or of mortality from leukemia. Estimates of cancer risk due to radiation were negative, but confidence intervals were wide, indicating that the data were consistent with no risk and with risks several times larger than estimates provided by major groups concerned with risk assessment. Of 18 categories of cancer analyzed, a correlation of borderline statistical significance was identified for female genital cancers (p = 0.05), but was interpreted as probably spurious. The previously identified correlation for multiple myeloma persisted (p = 0.002).
NASA Astrophysics Data System (ADS)
Four primary tasks were carried out in this program. Upon request of LANL, the Eloranta paper was reviewed. It was determined that the correlation solution presented was too computationally complex to execute in the allocated 1-second update time. An alternative algorithmic approach was undertaken using a simulation baseline. A simulation was developed and applied to generate synthetic LIDAR data from randomized aerosol patches drifting with the wind. Algorithms were designed and implemented in the simulation to reduce the data and apply it to obtain wind estimates. A substantial effort was completed in reverse engineering the EVIEW data format structure of the supplied data. Finally, the collected LIDAR data were examined to assess the prospects for successful wind estimation. Unfortunately, the data examination did not show good prospects for a successful outcome. It is recommended that future data be taken with the procedure previously outlined. Hercules believes that if LIDAR data are collected using this procedure, wind estimation from the collected data will be as successful as it was in simulation.
Yin, Shasha; Zheng, Junyu; Lu, Qing; Yuan, Zibing; Huang, Zhijiong; Zhong, Liuju; Lin, Hui
2015-05-01
Accurate and gridded VOC emission inventories are important for improving regional air quality model performance. In this study, a four-level VOC emission source categorization system was proposed. A 2010-based gridded Pearl River Delta (PRD) regional VOC emission inventory was developed with more comprehensive source coverage, the latest emission factors, and updated activity data. The total anthropogenic VOC emission was estimated to be about 117.4 × 10⁴ t, of which on-road mobile sources made the largest contribution, followed by industrial solvent use and industrial process sources. Among the industrial solvent use sources, furniture manufacturing and shoemaking were major VOC emission contributors. The spatial surrogates of VOC emission were updated for major VOC sources such as industrial sectors and gas stations. Subsector-based temporal characteristics were investigated and their temporal variations were characterized. The impacts of the updated VOC emission estimates and spatial surrogates were evaluated by modeling O₃ concentrations in the PRD region in July and October of 2010, respectively. The results indicated that both the updated emission estimates and spatial allocations can effectively reduce model bias in O₃ simulation. Further efforts should be made on the refinement of source classification, comprehensive collection of activity data, and spatial-temporal surrogates in order to reduce uncertainty in the emission inventory and improve model performance. Copyright © 2015 Elsevier B.V. All rights reserved.
Drift Reduction in Pedestrian Navigation System by Exploiting Motion Constraints and Magnetic Field
Ilyas, Muhammad; Cho, Kuk; Baeg, Seung-Ho; Park, Sangdeok
2016-01-01
Pedestrian navigation systems (PNS) using foot-mounted MEMS inertial sensors use zero-velocity updates (ZUPTs) to reduce drift in navigation solutions and estimate inertial sensor errors. However, it is well known that ZUPTs cannot reduce all errors, especially because heading error is not observable. Hence, the position estimates tend to drift even when cyclic ZUPTs are applied in the update step of the Extended Kalman Filter (EKF). This motivates the use of other motion constraints of pedestrian gait and of any other valuable heading information that is available. In this paper, we exploit two more motion-constraint scenarios of pedestrian gait: (1) walking along straight paths; (2) standing still for a long time. It is observed that these motion constraints (called a “virtual sensor”), though considerably reducing drift in PNS, still need an absolute heading reference. One common absolute heading estimation sensor is the magnetometer, which senses the Earth’s magnetic field so that the true heading angle can be calculated. However, magnetometers are susceptible to magnetic distortions, especially in indoor environments. In this work, an algorithm called magnetic anomaly detection (MAD) and compensation is designed by incorporating only healthy magnetometer data in the EKF update step, to reduce drift in the zero-velocity-updated INS. Experiments are conducted in GPS-denied and magnetically distorted environments to validate the proposed algorithms. PMID:27618056
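A minimal version of the magnetic gating idea is sketched below: a magnetometer sample is used for heading correction only if its field magnitude stays near a local reference value. The reference magnitude, tolerance, and heading convention are illustrative assumptions; the paper's MAD algorithm may apply additional checks.

```python
# A sketch of a simple magnetic anomaly gate: accept a magnetometer sample
# for heading correction only if its field magnitude is close to the local
# reference field strength.
import numpy as np

REF_MAGNITUDE_UT = 50.0     # nominal local geomagnetic field strength, microtesla
TOLERANCE_UT = 5.0          # accept samples within +/- 5 uT of the reference

def is_magnetically_clean(mag_xyz: np.ndarray) -> bool:
    """Return True if the 3-axis magnetometer sample looks undistorted."""
    return abs(np.linalg.norm(mag_xyz) - REF_MAGNITUDE_UT) < TOLERANCE_UT

def heading_from_mag(mag_xyz: np.ndarray) -> float:
    """Heading (rad) from a levelled magnetometer sample (x north, y east)."""
    return float(np.arctan2(-mag_xyz[1], mag_xyz[0]))

samples = [np.array([30.0, -38.0, 10.0]),    # |B| ~ 49.4 uT -> clean
           np.array([80.0, -10.0, 25.0])]    # |B| ~ 84.4 uT -> distorted
for m in samples:
    if is_magnetically_clean(m):
        print("use heading update:", round(np.degrees(heading_from_mag(m)), 1), "deg")
    else:
        print("reject sample: magnetic anomaly detected")
```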
Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation
NASA Technical Reports Server (NTRS)
2012-01-01
At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model1 reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties--2010 . The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables. The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations. The approaches described in the reports from all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission and astronaut-specific cancer risk. The model also calculates the associated uncertainties in REID. The general approach for estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements. However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors, transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.
Shahar, Nitzan; Meiran, Nachshon
2015-01-01
Few studies have addressed action control training. In the current study, participants were trained over 19 days in an adaptive training task that demanded constant switching, maintenance and updating of novel action rules. Participants completed an executive functions battery before and after training that estimated processing speed, working memory updating, set-shifting, response inhibition and fluid intelligence. Participants in the training group showed greater improvement than a no-contact control group in processing speed, indicated by reduced reaction times in speeded classification tasks. No other systematic group differences were found across the different pre-post measurements. Ex-Gaussian fitting of the reaction-time distribution revealed that the reaction time reduction observed among trained participants was restricted to the right tail of the distribution, previously shown to be related to working memory. Furthermore, training effects were only found in classification tasks that required participants to maintain novel stimulus-response rules in mind, supporting the notion that the training improved working memory abilities. Training benefits were maintained in a 10-month follow-up, indicating relatively long-lasting effects. The authors conclude that training improved action-related working memory abilities. PMID:25799443
Predictors of Responsiveness to Early Literacy Intervention: A 10-Year Update
ERIC Educational Resources Information Center
Lam, Elizabeth A.; McMaster, Kristen L.
2014-01-01
The purpose of this review was to update previous reviews on factors related to students' responsiveness to early literacy intervention. The 14 studies in this synthesis used experimental designs, provided small-group or one-on-one reading interventions, and analyzed factors related to responsiveness to those interventions. Participants were…
ERIC Educational Resources Information Center
Davis-Berman, Jennifer; Berman, Dene
1996-01-01
Updated description of 38 wilderness orientation programs currently affiliated with U.S. colleges and universities includes program enrollment, length, cost, types of leaders, training, and sponsorship. Discusses program philosophies, goals, reasons for using the wilderness, and critical and emerging issues. Compares data to previous research.…
42 CFR 413.40 - Ceiling on the rate of increase in hospital inpatient costs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... October 1, 2002, is the percentage increase projected by the hospital market basket index. (4) Target... target amount for the previous cost reporting period, updated by the market basket percentage increase... each cost reporting period, the ceiling is determined by multiplying the updated target amount, as...
Estimating Gender Wage Gaps: A Data Update
ERIC Educational Resources Information Center
McDonald, Judith A.; Thornton, Robert J.
2016-01-01
In the authors' 2011 "JEE" article, "Estimating Gender Wage Gaps," they described an interesting class project that allowed students to estimate the current gender earnings gap for recent college graduates using data from the National Association of Colleges and Employers (NACE). Unfortunately, since 2012, NACE no longer…
Barth, Nancy A.; Veilleux, Andrea G.
2012-01-01
The U.S. Geological Survey (USGS) is currently updating at-site flood frequency estimates for USGS streamflow-gaging stations in the desert region of California. The at-site flood-frequency analysis is complicated by short record lengths (less than 20 years is common) and numerous zero flows/low outliers at many sites. Estimates of the three parameters (mean, standard deviation, and skew) required for fitting the log Pearson Type 3 (LP3) distribution are likely to be highly unreliable based on the limited and heavily censored at-site data. In a generalization of the recommendations in Bulletin 17B, a regional analysis was used to develop regional estimates of all three parameters (mean, standard deviation, and skew) of the LP3 distribution. A regional skew value of zero from a previously published report was used with a new estimated mean squared error (MSE) of 0.20. A weighted least squares (WLS) regression method was used to develop both a regional standard deviation and a mean model based on annual peak-discharge data for 33 USGS stations throughout California’s desert region. At-site standard deviation and mean values were determined by using an expected moments algorithm (EMA) method for fitting the LP3 distribution to the logarithms of annual peak-discharge data. Additionally, a multiple Grubbs-Beck (MGB) test, a generalization of the test recommended in Bulletin 17B, was used for detecting multiple potentially influential low outliers in a flood series. The WLS regression found that no basin characteristics could explain the variability of standard deviation. Consequently, a constant regional standard deviation model was selected, resulting in a log-space value of 0.91 with a MSE of 0.03 log units. Yet drainage area was found to be statistically significant at explaining the site-to-site variability in mean. The linear WLS regional mean model based on drainage area had a pseudo-R2 of 51 percent and a MSE of 0.32 log units. The regional parameter estimates were then used to develop a set of equations for estimating flows with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for ungaged basins. The final equations are functions of drainage area. Average standard errors of prediction for these regression equations range from 214.2 to 856.2 percent.
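As a rough illustration of how such equations are applied (the regression coefficients below are placeholders, not the published values), the flow quantiles follow from the regional LP3 parameters and Pearson Type III frequency factors; with zero skew the factors reduce to standard normal quantiles.

    import numpy as np
    from scipy.stats import pearson3

    REGIONAL_SKEW = 0.0   # log10 space (regional value of zero, as in the report)
    REGIONAL_STD = 0.91   # log10 space (constant model from the WLS regression)

    def log_mean_from_area(drainage_area_mi2, a=0.5, b=0.45):
        """Hypothetical WLS mean model: log10 mean as a linear function of log10 drainage area."""
        return a + b * np.log10(drainage_area_mi2)

    def lp3_flows(drainage_area_mi2, aeps=(0.5, 0.2, 0.1, 0.04, 0.02, 0.01, 0.005, 0.002)):
        mu = log_mean_from_area(drainage_area_mi2)
        flows = {}
        for aep in aeps:
            k = pearson3.ppf(1.0 - aep, REGIONAL_SKEW)      # frequency factor
            flows[aep] = 10.0 ** (mu + k * REGIONAL_STD)    # back-transform from log10 space
        return flows

    print(lp3_flows(25.0))   # flows in the units implied by the regression (e.g., cubic feet per second)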
Hess, Glen W.
2002-01-01
Techniques for estimating monthly streamflow-duration characteristics at ungaged and partial-record sites in central Nevada have been updated. These techniques were developed using streamflow records at six continuous-record sites, basin physical and climatic characteristics, and concurrent streamflow measurements at four partial-record sites. Two methods, the basin-characteristic method and the concurrent-measurement method, were developed to provide estimating techniques for selected streamflow characteristics at ungaged and partial-record sites in central Nevada. In the first method, logarithmic-regression analyses were used to relate monthly mean streamflows (from all months and by month) from continuous-record gaging sites of various percent exceedence levels or monthly mean streamflows (by month) to selected basin physical and climatic variables at ungaged sites. Analyses indicate that the total drainage area and percent of drainage area at altitudes greater than 10,000 feet are the most significant variables. For the equations developed from all months of monthly mean streamflow, the coefficient of determination averaged 0.84 and the standard error of estimate of the relations for the ungaged sites averaged 72 percent. For the equations derived from monthly means by month, the coefficient of determination averaged 0.72 and the standard error of estimate of the relations averaged 78 percent. If standard errors are compared, the relations developed in this study appear generally to be less accurate than those developed in a previous study. However, the new relations are based on additional data and the slight increase in error may be due to the wider range of streamflow for a longer period of record, 1995-2000. In the second method, streamflow measurements at partial-record sites were correlated with concurrent streamflows at nearby gaged sites by the use of linear-regression techniques. Statistical measures of results using the second method typically indicated greater accuracy than for the first method. However, to make estimates for individual months, the concurrent-measurement method requires several years additional streamflow data at more partial-record sites. Thus, exceedence values for individual months are not yet available due to the low number of concurrent-streamflow-measurement data available. Reliability, limitations, and applications of both estimating methods are described herein.
Selective updating of working memory content modulates meso-cortico-striatal activity.
Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S
2011-08-01
Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.
Peterson, M.D.; Mueller, C.S.
2011-01-01
The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.
Richards, V. M.; Dai, W.
2014-01-01
A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given. PMID:24671826
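The UML idea can be illustrated with a small grid-based Bayesian update, written here as a generic Python sketch rather than the toolbox's actual MATLAB API: after each trial the posterior over threshold, slope, and lapse rate is reweighted by the likelihood of the observed response, and the next level is placed at the posterior-mean threshold (the toolbox itself uses more refined sweet-point placement rules). The grids, guess rate, and run_trial helper are assumptions for the example.

    import numpy as np

    alphas = np.linspace(-10, 10, 61)     # threshold grid
    betas = np.logspace(-1, 1, 21)        # slope grid
    lapses = np.linspace(0.0, 0.1, 11)    # lapse-rate grid
    A, B, L = np.meshgrid(alphas, betas, lapses, indexing="ij")
    log_post = np.zeros_like(A)           # flat prior
    GAMMA = 0.5                           # guess rate for a two-alternative task

    def p_correct(x, a, b, lam):
        return GAMMA + (1 - GAMMA - lam) / (1 + np.exp(-b * (x - a)))

    def update(x, response, log_post):
        """Reweight the posterior after a response (1 = correct, 0 = incorrect) at level x."""
        p = p_correct(x, A, B, L)
        log_post = log_post + np.log(p if response else 1 - p)
        return log_post - log_post.max()  # rescale for numerical stability

    def next_level(log_post):
        w = np.exp(log_post); w /= w.sum()
        return float((w * A).sum())       # posterior-mean threshold as the next test level

    # Trial loop (responses would come from a listener or a simulation):
    # for trial in range(50):
    #     x = next_level(log_post)
    #     log_post = update(x, run_trial(x), log_post)   # run_trial is hypothetical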
Multiple Chronic Conditions Among US Adults: A 2012 Update
Schiller, Jeannine S.; Goodman, Richard A.
2014-01-01
The objective of this research was to update earlier estimates of prevalence rates of single chronic conditions and multiple (≥2) chronic conditions (MCC) among the noninstitutionalized, civilian US adult population. Data from the 2012 National Health Interview Survey (NHIS) were used to generate estimates of MCC for US adults and by select demographic characteristics. Approximately half (117 million) of US adults have at least one of the 10 chronic conditions examined (ie, hypertension, coronary heart disease, stroke, diabetes, cancer, arthritis, hepatitis, weak or failing kidneys, current asthma, or chronic obstructive pulmonary disease [COPD]). Furthermore, 1 in 4 adults has MCC. PMID:24742395
NASA Astrophysics Data System (ADS)
El Gharamti, M.; Bethke, I.; Tjiputra, J.; Bertino, L.
2016-02-01
Given the recent strong international focus on developing new data assimilation systems for biological models, we present in this comparative study the application of newly developed state-parameter estimation tools to an ocean ecosystem model. It is well known that the available physical models are still too simple compared to the complexity of ocean biology. Furthermore, various biological parameters remain poorly known, and wrong specifications of such parameters can lead to large model errors. The standard joint state-parameter augmentation technique using the ensemble Kalman filter (Stochastic EnKF) has been extensively tested in many geophysical applications. Some of these assimilation studies reported that jointly updating the state and the parameters might introduce significant inconsistency, especially for strongly nonlinear models. This is usually the case for ecosystem models, particularly during the period of the spring bloom. A better handling of the estimation problem is often achieved by separating the update of the state and the parameters using the so-called Dual EnKF. The dual filter is computationally more expensive than the Joint EnKF but is expected to perform more accurately. Using a similar separation strategy, we propose a new EnKF estimation algorithm in which we apply a one-step-ahead smoothing to the state. The new state-parameter estimation scheme is derived in a consistent Bayesian filtering framework and results in separate update steps for the state and the parameters. Unlike the classical filtering path, the new scheme starts with an update step, and a model propagation step is performed afterwards. We test the performance of the new smoothing-based schemes against the standard EnKF in a one-dimensional configuration of the Norwegian Earth System Model (NorESM) in the North Atlantic. We use nutrient profile data (up to 2000 m deep) and surface partial CO2 measurements from the Mike weather station (66°N, 2°E) to estimate different biological parameters of phytoplankton and zooplankton. We analyze the performance of the filters in terms of complexity and accuracy of the state and parameter estimates.
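A toy dual-update step in the same spirit (a generic stochastic-EnKF sketch, not the scheme or code used in the study): parameters are corrected first from their ensemble cross-covariance with the predicted observations, then the state ensemble is corrected with the same kind of update.

    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(ens, obs_ens, y, obs_err_var):
        """Stochastic EnKF update of an ensemble (columns are members) given predicted observations."""
        X = ens - ens.mean(axis=1, keepdims=True)            # state or parameter anomalies
        Y = obs_ens - obs_ens.mean(axis=1, keepdims=True)    # predicted-observation anomalies
        ne = ens.shape[1]
        Pxy = X @ Y.T / (ne - 1)
        Pyy = Y @ Y.T / (ne - 1) + obs_err_var * np.eye(Y.shape[0])
        K = Pxy @ np.linalg.inv(Pyy)
        perturbed = y[:, None] + np.sqrt(obs_err_var) * rng.standard_normal(obs_ens.shape)
        return ens + K @ (perturbed - obs_ens)

    # Dual step: update the parameter ensemble first, then the state ensemble,
    # both against the same predicted observations (e.g., simulated nutrient profiles hx_ens):
    # theta_ens = enkf_update(theta_ens, hx_ens, y_obs, obs_err_var)
    # state_ens = enkf_update(state_ens, hx_ens, y_obs, obs_err_var)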
NASA Technical Reports Server (NTRS)
Soman, Vishwas V.; Crosson, William L.; Laymon, Charles; Tsegaye, Teferi
1998-01-01
Soil moisture is an important component of analysis in many Earth science disciplines. Soil moisture information can be obtained either by using microwave remote sensing or by using a hydrologic model. In this study, we combined these two approaches to increase the accuracy of profile soil moisture estimation. A hydrologic model was used to analyze the errors in the estimation of soil moisture using the data collected during the Huntsville '96 microwave remote sensing experiment in Huntsville, Alabama. Root mean square errors (RMSE) in soil moisture estimation increase by 22% when the model input interval is increased from 6 hr to 12 hr for the grass-covered plot. For a given model time step, RMSEs were reduced by 20-50% when model soil moisture estimates were updated using remotely-sensed data. This methodology has the potential to be employed in soil moisture estimation using rainfall data collected by a space-borne sensor, such as the Tropical Rainfall Measuring Mission (TRMM) satellite, if remotely-sensed data are available to update the model estimates.
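A minimal sketch of the kind of update described, a variance-weighted blend of the model estimate and the remotely sensed estimate; the error variances are hypothetical.

    def blend_soil_moisture(sm_model, sm_remote, var_model=0.004, var_remote=0.002):
        """Pull the model estimate toward the remote-sensing estimate by relative error variance."""
        gain = var_model / (var_model + var_remote)
        return sm_model + gain * (sm_remote - sm_model)

    # Example: the model says 0.22 m3/m3 and the microwave retrieval says 0.18 m3/m3.
    print(blend_soil_moisture(0.22, 0.18))   # about 0.193 with the hypothetical variances above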
Users' Guide to USDA Estimates of the Cost of Raising a Child.
ERIC Educational Resources Information Center
Edwards, Carolyn S.
In this article, estimates of the cost of raising a child, that are available from the U.S. Department of Agriculture, are described; the most widely requested estimates updated to current price levels are provided; and the most frequently asked questions about the use and interpretation of these estimates are answered. Information on additional…
Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.
2017-10-24
The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
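Not the WRTDS implementation itself, but a compact illustration of regression-based load estimation in the same spirit: a concentration model with time, discharge, and seasonal terms is fit to sample data and then applied to the daily discharge record.

    import numpy as np

    def fit_concentration_model(t_dec, q_cms, conc_mgL):
        """Fit ln(C) on an intercept, decimal time, ln(Q), and annual sine/cosine terms by least squares."""
        X = np.column_stack([np.ones_like(t_dec), t_dec, np.log(q_cms),
                             np.sin(2 * np.pi * t_dec), np.cos(2 * np.pi * t_dec)])
        beta, *_ = np.linalg.lstsq(X, np.log(conc_mgL), rcond=None)
        return beta

    def daily_load_kg(beta, t_dec, q_cms):
        """Predict concentration and convert to a daily load; ignores retransformation bias."""
        X = np.column_stack([np.ones_like(t_dec), t_dec, np.log(q_cms),
                             np.sin(2 * np.pi * t_dec), np.cos(2 * np.pi * t_dec)])
        conc = np.exp(X @ beta)          # mg/L
        return conc * q_cms * 86.4       # mg/L x m3/s = g/s; times 86.4 gives kg/day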
An Updated Algorithm for Estimation of Pesticide Exposure Intensity in the Agricultural Health Study
An algorithm developed to estimate pesticide exposure intensity for use in epidemiologic analyses was revised based on data from two exposure monitoring studies. In the first study, we estimated relative exposure intensity based on the results of measurements taken during the app...
Richard H. Widmann
2016-01-01
This resource update provides an overview of the forest resources in New York based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly (see footnote on page 4). Information about the...
Thomas A. Albright; Anthony C. Olsen
2017-01-01
This resource update provides an overview of the forest resources in New York based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. Information about the national and regional FIA...
Richard H. Widmann
2015-01-01
This resource update provides an overview of the forest resources in New York based on inventories conducted by the U.S. Forest Service, Forest Inventory and Analysis (FIA) program of the Northern Research Station. Estimates are based on field data collected using the FIA annualized sample design and are updated yearly. (See footnote on page 4). Information about the...
Forests of South Carolina, 2014
Anita K. Rose
2015-01-01
This resource update provides an overview of forest resources in South Carolina. Information for this factsheet was updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year 20 percent of the sample plots (one panel) in South Carolina are measured by field crews, the data compiled, and new estimates produced. After 5 years of...
Forests of South Carolina, 2013
Anita K. Rose
2015-01-01
This resource update provides an overview of forest resources in South Carolina. Information for this factsheet was updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year 20 percent of the sample plots (one panel) in South Carolina are measured by field crews, the data compiled, and new estimates produced. After 5 years of...
NASA Astrophysics Data System (ADS)
Skeie, R. B.; Berntsen, T.; Aldrin, M.; Holden, M.; Myhre, G.
2012-04-01
A key question in climate science is to quantify the sensitivity of the climate system to perturbations in the radiative forcing (RF). This sensitivity is often represented by the equilibrium climate sensitivity, but this quantity is poorly constrained, with significant probabilities for high values. In this work the equilibrium climate sensitivity (ECS) is estimated based on observed near-surface temperature change from the instrumental record, changes in ocean heat content, and detailed RF time series. RF time series from pre-industrial times to 2010 for all main anthropogenic and natural forcing mechanisms are estimated, and the cloud lifetime effect and the semi-direct effect, which are not RF mechanisms in a strict sense, are included in the analysis. The RF time series are linked to the observations of ocean heat content and temperature change through an energy balance model and a stochastic model, using a Bayesian approach to estimate the ECS from the data. The posterior mean of the ECS is 1.9°C with a 90% credible interval (C.I.) ranging from 1.2 to 2.9°C, which is tighter than previously published estimates. Observational data up to and including year 2010 are used in this study. This is at least ten additional years compared to the majority of previously published studies that have used the instrumental record in attempts to constrain the ECS. We show that the additional 10 years of data, and especially 10 years of additional ocean heat content data, have significantly narrowed the probability density function of the ECS. If only data up to and including year 2000 are used in the analysis, the 90% C.I. is 1.4 to 10.6°C with a pronounced heavy tail, in line with previous estimates of ECS constrained by observations in the 20th century. The transient climate response (TCR) is also estimated in this study. Using observational data up to and including year 2010 gives a 90% C.I. of 1.0 to 2.1°C, while the 90% C.I. is significantly broader, ranging from 1.1 to 3.4°C, if only data up to and including year 2000 are used.
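A back-of-the-envelope version of the same constraint, using a simple energy-balance diagnostic with illustrative numbers rather than the Bayesian time-series model of the study: the feedback parameter is estimated from forcing, warming, and ocean heat uptake, and ECS follows from the forcing for doubled CO2.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Illustrative distributions (W/m2 and K); the study derives these from detailed RF time series.
    forcing = rng.normal(1.9, 0.4, n)        # total forcing change
    uptake = rng.normal(0.6, 0.2, n)         # ocean heat uptake (planetary imbalance)
    warming = rng.normal(0.8, 0.1, n)        # observed near-surface temperature change
    F2X = 3.7                                # forcing for doubled CO2 (W/m2)

    lam = (forcing - uptake) / warming       # climate feedback parameter (W/m2 per K)
    ecs = F2X / lam[lam > 0]                 # equilibrium climate sensitivity (K)
    print(np.percentile(ecs, [5, 50, 95]))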
75 FR 70213 - Gulf of Mexico Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
... was used. In addition, the original update assessment estimated dead discards from the commercial gag.... However, more recent estimates of dead discards based on observer data have been on the order of 200,000...
A Bayesian approach to tracking patients having changing pharmacokinetic parameters
NASA Technical Reports Server (NTRS)
Bayard, David S.; Jelliffe, Roger W.
2004-01-01
This paper considers the updating of Bayesian posterior densities for pharmacokinetic models associated with patients having changing parameter values. For estimation purposes it is proposed to use the Interacting Multiple Model (IMM) estimation algorithm, which is currently a popular algorithm in the aerospace community for tracking maneuvering targets. The IMM algorithm is described, and compared to the multiple model (MM) and Maximum A-Posteriori (MAP) Bayesian estimation methods, which are presently used for posterior updating when pharmacokinetic parameters do not change. Both the MM and MAP Bayesian estimation methods are used in their sequential forms, to facilitate tracking of changing parameters. Results indicate that the IMM algorithm is well suited for tracking time-varying pharmacokinetic parameters in acutely ill and unstable patients, incurring only about half of the integrated error compared to the sequential MM and MAP methods on the same example.
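A compact illustration of one IMM cycle for a scalar parameter observed directly with noise, using two models that differ only in how fast the parameter is allowed to drift ("stable patient" vs. "rapidly changing patient"). The noise values and transition matrix are made up; this is a generic sketch, not the pharmacokinetic implementation evaluated in the paper.

    import numpy as np

    def gauss_pdf(resid, var):
        return np.exp(-0.5 * resid ** 2 / var) / np.sqrt(2 * np.pi * var)

    def imm_step(x, P, mu, y, Q=(1e-4, 1e-1), R=0.05,
                 PI=np.array([[0.95, 0.05], [0.05, 0.95]])):
        """One IMM cycle; x, P, mu are length-2 arrays (per-model mean, variance, probability)."""
        # 1. Mixing
        c = PI.T @ mu                                   # predicted mode probabilities
        w = (PI * mu[:, None]) / c[None, :]             # w[i, j] = P(model i previously | model j now)
        x0 = w.T @ x
        P0 = np.array([np.sum(w[:, j] * (P + (x - x0[j]) ** 2)) for j in range(2)])
        # 2. Mode-matched Kalman filters (random-walk dynamics)
        x_new, P_new, like = np.zeros(2), np.zeros(2), np.zeros(2)
        for j, q in enumerate(Q):
            xp, Pp = x0[j], P0[j] + q                   # predict
            S = Pp + R
            K = Pp / S
            x_new[j] = xp + K * (y - xp)                # update with the new measurement y
            P_new[j] = (1 - K) * Pp
            like[j] = gauss_pdf(y - xp, S)
        # 3. Mode probabilities and combined output
        mu_new = c * like
        mu_new /= mu_new.sum()
        x_comb = mu_new @ x_new
        P_comb = np.sum(mu_new * (P_new + (x_new - x_comb) ** 2))
        return x_new, P_new, mu_new, x_comb, P_comb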
Steingart, Karen R.; Flores, Laura L.; Dendukuri, Nandini; Schiller, Ian; Laal, Suman; Ramsay, Andrew; Hopewell, Philip C.; Pai, Madhukar
2011-01-01
Background Serological (antibody detection) tests for tuberculosis (TB) are widely used in developing countries. As part of a World Health Organization policy process, we performed an updated systematic review to assess the diagnostic accuracy of commercial serological tests for pulmonary and extrapulmonary TB with a focus on the relevance of these tests in low- and middle-income countries. Methods and Findings We used methods recommended by the Cochrane Collaboration and GRADE approach for rating quality of evidence. In a previous review, we searched multiple databases for papers published from 1 January 1990 to 30 May 2006, and in this update, we add additional papers published from that period until 29 June 2010. We prespecified subgroups to address heterogeneity and summarized test performance using bivariate random effects meta-analysis. For pulmonary TB, we included 67 studies (48% from low- and middle-income countries) with 5,147 participants. For all tests, estimates were variable for sensitivity (0% to 100%) and specificity (31% to 100%). For anda-TB IgG, the only test with enough studies for meta-analysis, pooled sensitivity was 76% (95% CI 63%–87%) in smear-positive (seven studies) and 59% (95% CI 10%–96%) in smear-negative (four studies) patients; pooled specificities were 92% (95% CI 74%–98%) and 91% (95% CI 79%–96%), respectively. Compared with ELISA (pooled sensitivity 60% [95% CI 6%–65%]; pooled specificity 98% [95% CI 96%–99%]), immunochromatographic tests yielded lower pooled sensitivity (53%, 95% CI 42%–64%) and comparable pooled specificity (98%, 95% CI 94%–99%). For extrapulmonary TB, we included 25 studies (40% from low- and middle-income countries) with 1,809 participants. For all tests, estimates were variable for sensitivity (0% to 100%) and specificity (59% to 100%). Overall, quality of evidence was graded very low for studies of pulmonary and extrapulmonary TB. Conclusions Despite expansion of the literature since 2006, commercial serological tests continue to produce inconsistent and imprecise estimates of sensitivity and specificity. Quality of evidence remains very low. These data informed a recently published World Health Organization policy statement against serological tests. Please see later in the article for the Editors' Summary PMID:21857806
Estimates of internal-dose equivalent from inhalation and ingestion of selected radionuclides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunning, D.E.
1982-01-01
This report presents internal radiation dose conversion factors for radionuclides of interest in environmental assessments of nuclear fuel cycles. This volume provides an updated summary of estimates of committed dose equivalent for radionuclides considered in three previous Oak Ridge National Laboratory (ORNL) reports. Intakes by inhalation and ingestion are considered. The International Commission on Radiological Protection (ICRP) Task Group Lung Model has been used to simulate the deposition and retention of particulate matter in the respiratory tract. Results corresponding to activity median aerodynamic diameters (AMAD) of 0.3, 1.0, and 5.0 µm are given. The gastrointestinal (GI) tract has been represented by a four-segment catenary model with exponential transfer of radioactivity from one segment to the next. Retention of radionuclides in systemic organs is characterized by linear combinations of decaying exponential functions, as recommended in ICRP Publication 30. The first-year annual dose rate, maximum annual dose rate, and fifty-year dose commitment per microcurie intake of each radionuclide are given for selected target organs and the effective dose equivalent. These estimates include contributions from specified source organs plus the systemic activity residing in the rest of the body; cross irradiation due to penetrating radiations has been incorporated into these estimates. 15 references.
A Bayesian Approach for Population Pharmacokinetic Modeling of Alcohol in Japanese Individuals.
Nemoto, Asuka; Masaaki, Matsuura; Yamaoka, Kazue
2017-01-01
Blood alcohol concentration data that were previously obtained from 34 healthy Japanese subjects with limited sampling times were reanalyzed. A characteristic of the data was that the concentrations were obtained from only the early part of the time-concentration curve. The aims were to explore significant covariates for the population pharmacokinetic analysis of alcohol by incorporating external data using a Bayesian method, and to estimate the effects of these covariates. The data were analyzed using a Markov chain Monte Carlo Bayesian estimation with NONMEM 7.3 (ICON Clinical Research LLC, North Wales, Pennsylvania). Informative priors were obtained from the external study. A 1-compartment model with Michaelis-Menten elimination was used. The typical value for the apparent volume of distribution was 49.3 L at the age of 29.4 years. Volume of distribution was estimated to be 20.4 L smaller in subjects with the ALDH2*1/*2 genotype than in subjects with the ALDH2*1/*1 genotype. A population pharmacokinetic model for alcohol was updated. A Bayesian approach allowed interpretation of significant covariate relationships, even if the current dataset is not informative about all parameters. This is the first study reporting an estimate of the effect of the ALDH2 genotype in a PPK model.
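For orientation, a one-compartment model with Michaelis-Menten elimination (the structural model named above) can be simulated as follows; the elimination parameters and dose are placeholders rather than the study's population estimates, and only the 49.3 L volume is taken from the abstract.

    import numpy as np
    from scipy.integrate import solve_ivp

    VMAX = 8.0     # mg/dL per hour, maximum elimination rate (placeholder)
    KM = 10.0      # mg/dL (placeholder)
    VD = 49.3      # L, apparent volume of distribution (value quoted above)
    DOSE = 20000   # mg of ethanol, absorbed instantaneously for simplicity

    def dc_dt(t, c):
        return -VMAX * c / (KM + c)                # Michaelis-Menten elimination

    c0 = [DOSE / VD / 10.0]                        # mg/L converted to mg/dL
    sol = solve_ivp(dc_dt, (0, 8), c0, dense_output=True)
    print(sol.sol(np.arange(0, 9)))                # concentration (mg/dL) at hourly time points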
Scarborough, Peter; Bhatnagar, Prachi; Wickramasinghe, Kremlin K; Allender, Steve; Foster, Charlie; Rayner, Mike
2011-12-01
Estimates of the economic cost of risk factors for chronic disease to the NHS provide evidence for prioritization of resources for prevention and public health. Previous comparable estimates of the economic costs of poor diet, physical inactivity, smoking, alcohol and overweight/obesity were based on economic data from 1992-93. Diseases associated with poor diet, physical inactivity, smoking, alcohol and overweight/obesity were identified. Risk factor-specific population attributable fractions for these diseases were applied to disease-specific estimates of the economic cost to the NHS in the UK in 2006-07. In 2006-07, poor diet-related ill health cost the NHS in the UK £5.8 billion. The cost of physical inactivity was £0.9 billion. Smoking cost was £3.3 billion, alcohol cost £3.3 billion, overweight and obesity cost £5.1 billion. The estimates of the economic cost of risk factors for chronic disease presented here are based on recent financial data and are directly comparable. They suggest that poor diet is a behavioural risk factor that has the highest impact on the budget of the NHS, followed by alcohol consumption, smoking and physical inactivity.
Harwell, Mark A.; Gentile, John H.
2014-01-01
The Exxon Valdez oil spill occurred more than two decades ago, and the Prince William Sound ecosystem has essentially recovered. Nevertheless, discussion continues on whether or not localized effects persist on sea otters (Enhydra lutris) at northern Knight Island (NKI) and, if so, what are the associated attributable risks. A recent study estimated new rates of sea otter encounters with subsurface oil residues (SSOR) from the oil spill. We previously demonstrated that a potential pathway existed for exposures to polycyclic aromatic hydrocarbons (PAHs) and conducted a quantitative ecological risk assessment using an individual-based model that simulated this and other plausible exposure pathways. Here we quantitatively update the potential for this exposure pathway to constitute an ongoing risk to sea otters using the new estimates of SSOR encounters. Our conservative model predicted that the assimilated doses of PAHs to the 1-in-1000th most-exposed sea otters would remain 1–2 orders of magnitude below the chronic effects thresholds. We re-examine the baseline estimates, post-spill surveys, recovery status, and attributable risks for this subpopulation. We conclude that the new estimated frequencies of encountering SSOR do not constitute a plausible risk for sea otters at NKI and these sea otters have fully recovered from the oil spill. PMID:24587690
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, K.T.
2002-10-18
There have been numerous health studies or related activities over time that have involved workers at the Savannah River Site (SRS) or the surrounding public. While most of these epidemiology studies or activities have been performed by external agencies, it has proved useful to provide interested parties an overall summary of such activities. The first such summary was provided in an October 1998 report. The 1998 summary was updated in a February 2000 report. This report provides an update on the status or findings of epidemiology studies or activities involving SRS workers or the surrounding public, as an update to the previous summaries.
Update: Transitional Bilingual Instruction Program (TBIP), 2012-2013
ERIC Educational Resources Information Center
Malagon, Helen; McCold, Paul; Nelson, Joan Johnston
2013-01-01
This report provides an update on the Transitional Bilingual Instruction Program (TBIP). In 2012-13, 104,025 English language learners (ELLs) received state services through the TBIP. This was an increase of just over 9% from the previous year. Most students live in urban areas along the Interstate 5 corridor and in rural areas like the Yakima Valley.…
A comparison of several techniques for imputing tree level data
David Gartner
2002-01-01
As Forest Inventory and Analysis (FIA) changes from periodic surveys to the multipanel annual survey, new analytical methods become available. The current official statistic is the moving average. One alternative is an updated moving average. Several methods of updating plot per acre volume have been discussed previously. However, these methods may not be appropriate...
47 CFR 54.1306 - Updating Information Submitted to the National Exchange Carrier Association.
Code of Federal Regulations, 2014 CFR
2014-10-01
... incumbent local exchange carrier subject to § 54.1301(a) may update the information submitted to the... on a rolling year basis according to the schedule. (1) Submit data covering the last nine months of... September 30th of the existing year; (2) Submit data covering the last six months of the previous calendar...
2010 Strategic Plan for Autism Spectrum Disorder Research. NIH Publication No. 10-7573
ERIC Educational Resources Information Center
Interagency Autism Coordinating Committee, 2010
2010-01-01
In developing the 2010 Strategic Plan for ASD (Autism Spectrum Disorder) Research, the Interagency Autism Coordinating Committee (IACC) updated the previous Plan to highlight the most pressing research needs and opportunities for the field today. The Plan, which must be annually updated in accordance with the Combating Autism Act (CAA) of 2006,…
Mortality and Causes of Death in Autism Spectrum Disorders: An Update
ERIC Educational Resources Information Center
Mouridsen, Svend Erik; Bronnum-Hansen, Henrik; Rich, Bente; Isager, Torben
2008-01-01
This study compared mortality among Danish citizens with autism spectrum disorders (ASDs) with that of the general population. A clinical cohort of 341 Danish individuals with variants of ASD, previously followed over the period 1960-93, now on average 43 years of age, were updated with respect to mortality and causes of death. Standardized…
Updated Multistate Review of Professional Teaching Standards. REL Technical Brief. REL 2010-No. 014
ERIC Educational Resources Information Center
White, Melissa Eiler; Makkonen, Reino; Stewart, Kari Becker
2010-01-01
This review of teaching standards in six states updates a 2009 review (White, Makkonen, and Stewart 2009) by incorporating California's recently adopted teaching standards alongside those from Florida, Illinois, North Carolina, Ohio, and Texas. The previous review was developed at the request of key education agencies in California to inform the…
ERIC Educational Resources Information Center
Garcia-DeLaTorre, Paola; Rodriguez-Ortiz, Carlos J.; Arreguin-Martinez, Jose L.; Cruz-Castaneda, Paulina; Bermudez-Rattoni, Federico
2009-01-01
Reconsolidation has been described as a process where a consolidated memory returns to a labile state when retrieved. Growing evidence suggests that reconsolidation is, in fact, a destabilization/stabilization process that incorporates updated information to a previously consolidated memory. We used the conditioned taste aversion (CTA) task in…
Overview and Evaluation of the Community Multiscale Air Quality (CMAQ) Modeling System Version 5.2
A new version of the Community Multiscale Air Quality (CMAQ) model, version 5.2 (CMAQv5.2), is currently being developed, with a planned release date in 2017. The new model includes numerous updates from the previous version of the model (CMAQv5.1). Specific updates include a new...
Hurricane Harvey Rainfall, Did It Exceed PMP and What are the Implications?
NASA Astrophysics Data System (ADS)
Kappel, B.; Hultstrand, D.; Muhlestein, G.
2017-12-01
Rainfall resulting from Hurricane Harvey reached historic levels over the coastal regions of Texas and Louisiana during the last week of August 2017. Although extreme rainfall from this landfalling tropical system is not uncommon in the region, Harvey was unique in that it persisted over the same general location for several days, producing volumes of rainfall not previously observed in the United States. Devastating flooding and severe stress to infrastructure in the region were the result. Coincidentally, Applied Weather Associates had recently completed an updated statewide Probable Maximum Precipitation (PMP) study for Texas. This storm proved to be a real-time test of the adequacy of those values. AWA calculates PMP following a storm-based approach, the same approach used in the HMRs. Therefore, inclusion of all PMP-type storms is critically important to ensuring that appropriate PMP values are produced. This presentation will discuss the analysis of the Harvey rainfall using the Storm Precipitation Analysis System (SPAS) program used to analyze all storms used in PMP development, compare the results of the Harvey rainfall analysis against previous similar storms, and provide comparisons of the Harvey rainfall against previous and current PMP depths. Discussion will be included regarding the implications of the storm on previous and future PMP estimates, dam safety design, and infrastructure vulnerable to extreme flooding.
Input-output model for MACCS nuclear accident impacts estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N
Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
Radiation dose optimization in the decommissioning plan for Loviisa NPP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmberg, R.; Eurajoki, T.
1995-03-01
Finnish rules for nuclear power require a detailed decommissioning plan to be made and kept up to date already during plant operation. The main reasons for this "premature" plan are, firstly, the need to demonstrate the feasibility of decommissioning and, secondly, to make realistic cost estimates in order to fund money for this future operation. The decommissioning plan for Loviisa Nuclear Power Plant (NPP) (2×445 MW, PWR) was issued in 1987. It must be updated about every five years. One important aspect of the plan is an estimate of radiation doses to the decommissioning workers. The doses were recently re-estimated because of a need to decrease the total collective dose estimate in the original plan, 23 manSv. In the update, the dose was reduced by one-third. Part of the reduction was due to changes in the protection and procedures, in which ALARA considerations were taken into account, and partly because of re-estimation of the doses.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant, and economic models to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case studies to illustrate the tool's new capabilities are provided in this paper.
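As a rough illustration of the levelized-cost output such a tool produces, a generic fixed-charge-rate LCOE calculation with placeholder values (not GEOPHIRES' internal cost correlations) is:

    def lcoe_per_mwh(capex, fixed_om_per_year, annual_mwh, discount_rate=0.07, lifetime_years=30):
        """Levelized cost of energy using a capital recovery factor."""
        crf = (discount_rate * (1 + discount_rate) ** lifetime_years
               / ((1 + discount_rate) ** lifetime_years - 1))
        return (capex * crf + fixed_om_per_year) / annual_mwh

    # Placeholder plant: $30M capital, $0.8M/yr O&M, 20 MW net at a 90% capacity factor.
    annual_mwh = 20 * 8760 * 0.9
    print(lcoe_per_mwh(30e6, 0.8e6, annual_mwh))   # $/MWh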
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
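A generic precision-weighted, empirical-Bayes-style combination of a local linkage statistic with evidence from other scans is sketched below; it illustrates the borrowing-of-strength idea but, unlike the authors' method, makes no allowance for between-study heterogeneity. The numbers are illustrative.

    import numpy as np

    def shrink_toward_prior(local_est, local_var, other_ests, other_vars):
        """Normal-normal update: other scans supply an empirical prior (precision-weighted mean),
        and the posterior is the precision-weighted average of prior and local estimate."""
        other_ests = np.asarray(other_ests, float)
        other_vars = np.asarray(other_vars, float)
        prior_prec = np.sum(1.0 / other_vars)
        prior_mean = np.sum(other_ests / other_vars) / prior_prec
        post_prec = prior_prec + 1.0 / local_var
        post_mean = (prior_mean * prior_prec + local_est / local_var) / post_prec
        return post_mean, 1.0 / post_prec

    # A modest local signal reinforced by two other scans covering the same region.
    print(shrink_toward_prior(1.8, 0.9, other_ests=[2.4, 2.1], other_vars=[1.2, 1.5]))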
METHOD FOR THE ANALYSIS OF ASBESTOS IN WATER USING MCE FILTERS
The current Federal Drinking Water Standard makes possible the use of mixed cellulose ester (MCE) filters rather than the previously proposed Nuclepore™ filter. Updating of the previous counting rules brings them closer to AHERA specifications.
Energy expenditure estimates during school physical education: Potential vs. reality?
Kahan, David; McKenzie, Thomas L
2017-02-01
Schools are salient locations for addressing the high prevalence of overweight and obesity. Most US states require some physical education (PE), and the energy expended during PE has the potential to positively affect energy balance. We previously used 2012 data to examine state policies for PE to calculate estimated student energy expenditure (EEE) under potential (i.e., recommendations followed) and existing conditions. Since then, data have been updated on both state policies and the conduct of PE. Based on updated data, we used PE frequency, duration, and intensity, student mass, and class size to calculate EEE for the delivery of PE under (a) national professional recommendations, (b) 2016 state policies, and (c) school-reported conditions. Although increased from four years ago, only 22 states currently have policies mandating specific PE minutes. EEE over 10 years shows the enormous impact PE could have on energy balance. For the average recommended-size PE class, resultant annual EEE based on professional recommendations for min/week far exceeded those based on average state (n=22) policy for min/week by 44.5% for elementary, 62.7% for middle, and 59.5% for high schools. Since 2012 more states adopted policies for PE minutes than dropped them; however, EEE over 10 years showed a net loss of 1200 kcal/student. With no overall recent improvements in state PE policy and professional recommendations currently not being met, PE remains an underutilized public health resource for EEE. Strong policies, coupled with enhanced accountability of PE teachers and administrators, are needed to ensure PE exists in schools. Copyright © 2016 Elsevier Inc. All rights reserved.
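The underlying arithmetic is a standard MET-based estimate; a small sketch with illustrative inputs (the MET value, class time, student mass, and school-year length are placeholders, not the study's policy-specific figures):

    def annual_kcal_per_student(met=4.0, mass_kg=40.0, minutes_per_week=150, weeks_per_year=36):
        """MET-based energy expenditure: kcal per minute is roughly MET * 3.5 * mass_kg / 200."""
        kcal_per_min = met * 3.5 * mass_kg / 200.0
        return kcal_per_min * minutes_per_week * weeks_per_year

    print(annual_kcal_per_student())   # about 15,120 kcal per student per year with these inputs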
Standard big bang nucleosynthesis and primordial CNO abundances after Planck
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coc, Alain; Uzan, Jean-Philippe; Vangioni, Elisabeth, E-mail: coc@csnsm.in2p3.fr, E-mail: uzan@iap.fr, E-mail: vangioni@iap.fr
Primordial or big bang nucleosynthesis (BBN) is one of the three historical strong evidences for the big bang model. The recent results by the Planck satellite mission have slightly changed the estimate of the baryonic density compared to the previous WMAP analysis. This article updates the BBN predictions for the light elements using the cosmological parameters determined by Planck, as well as an improvement of the nuclear network and new spectroscopic observations. There is a slight lowering of the primordial Li/H abundance; however, this lithium value still remains typically 3 times larger than its observed spectroscopic abundance in halo stars of the Galaxy. Given the importance of this "lithium problem," we trace the small changes in its BBN calculated abundance following updates of the baryonic density, neutron lifetime, and networks. In addition, for the first time, we provide confidence limits for the production of 6Li, 9Be, 11B, and CNO, resulting from our extensive Monte Carlo calculation with our extended network. A specific focus is cast on CNO primordial production. Considering uncertainties on the nuclear rates around the CNO formation, we obtain CNO/H ≈ (5-30)×10^-15. We further improve this estimate by analyzing correlations between yields and reaction rates and identify new influential reaction rates. These uncertain rates, if simultaneously varied, could lead to a significant increase of CNO production: CNO/H ~ 10^-13. This result is important for the study of population III star formation during the dark ages.
Simulation of the shallow groundwater-flow system near Mole Lake, Forest County, Wisconsin
Fienen, Michael N.; Juckem, Paul F.; Hunt, Randall J.
2011-01-01
The shallow groundwater system near Mole Lake, Forest County, Wis. was simulated using a previously calibrated regional model. The previous model was updated using newly collected water-level measurements and refinements to surface-water features. The updated model was then used to calculate the area contributing recharge for one existing and two proposed pumping locations on lands of the Sokaogon Chippewa Community. Delineated 1-, 5-, and 10-year areas contributing recharge for existing and proposed wells extend from the areas of pumping to the northeast of the pumping locations. Steady-state pumping was simulated for two scenarios: a base pumping scenario using pumping rates that reflect what the Tribe expects to pump and a high pumping scenario, in which the rate was set to the maximum expected from wells installed in this area. In the base pumping scenario, pumping rates of 32 gallons per minute (gal/min; 46,000 gallons per day (gal/d)) from the existing well and 30 gal/min (43,000 gal/d) at each of the two proposed wells were simulated. The high pumping scenario simulated a rate of 70 gal/min (101,000 gal/d) from each of the three pumping wells to estimate the largest areas contributing recharge that might be expected given what is currently known about the shallow groundwater system. The areas contributing recharge for both the base and high pumping scenarios did not intersect any modeled surface-water bodies; however, the high pumping scenario had a larger areal extent than the base pumping scenario and intersected a septic separator.
Lowe, John; Watkins, W John; Edwards, Martin O; Spiller, O Brad; Jacqz-Aigrain, Evelyne; Kotecha, Sarah J; Kotecha, Sailesh
2014-07-01
Previous meta-analyses have reported a significant association between pulmonary colonization with Ureaplasma and development of bronchopulmonary dysplasia (BPD). However, because few studies reporting oxygen dependency at 36 weeks corrected gestation were previously available, we updated the systematic review and meta-analyses to evaluate the association between presence of pulmonary Ureaplasma and development of BPD. Five databases were searched for articles reporting the incidence of BPD at 36 weeks postmenstrual age (BPD36) and/or BPD at 28 days of life (BPD28) in Ureaplasma colonized and noncolonized groups. Pooled estimates were produced using random effects meta-analysis. Meta-regression was used to assess the influence of difference in gestational age between the Ureaplasma-positive and Ureaplasma-negative groups. The effects of potential sources of heterogeneity were also investigated. Of 39 studies included, 8 reported BPD36, 22 reported BPD28 and 9 reported both. The quality of studies was assessed as moderate to good. There was a significant association between Ureaplasma and development of BPD36 (odds ratio = 2.22; 95% confidence intervals: 1.42-3.47) and BPD28 (odds ratio = 3.04; 95% confidence intervals: 2.41-3.83). Sample size influenced the odds ratio, but no significant association was noted between BPD28 rates and difference in gestational age between Ureaplasma colonized and noncolonized infants (P = 0.96). Pulmonary colonization with Ureaplasma continues to be significantly associated with development of BPD in preterm infants at both 36 weeks postmenstrual age and at 28 days of life. This association at BPD28 persists regardless of difference in gestational age.
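The pooled odds ratios come from random-effects meta-analysis; a generic DerSimonian-Laird sketch over 2x2 study tables (with made-up counts) looks like this:

    import numpy as np

    def pooled_or_random_effects(tables):
        """DerSimonian-Laird pooled odds ratio from (a, b, c, d) tables:
        a = colonized with BPD, b = colonized without, c = not colonized with BPD, d = without."""
        tables = np.asarray(tables, float) + 0.5              # continuity correction
        a, b, c, d = tables.T
        log_or = np.log(a * d / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d
        w = 1 / var                                           # fixed-effect weights
        q = np.sum(w * (log_or - np.sum(w * log_or) / w.sum()) ** 2)
        tau2 = max(0.0, (q - (len(log_or) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
        w_re = 1 / (var + tau2)                               # random-effects weights
        est = np.sum(w_re * log_or) / w_re.sum()
        se = np.sqrt(1 / w_re.sum())
        return np.exp(est), np.exp(est - 1.96 * se), np.exp(est + 1.96 * se)

    # Three illustrative studies: pooled OR and 95% confidence interval.
    print(pooled_or_random_effects([(20, 30, 10, 60), (15, 25, 12, 70), (8, 40, 5, 55)]))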
Updated radiometric calibration for the Landsat-5 thematic mapper reflective bands
Helder, D.L.; Markham, B.L.; Thome, K.J.; Barsi, J.A.; Chander, G.; Malla, R.
2008-01-01
The Landsat-5 Thematic Mapper (TM) has been the workhorse of the Landsat system. Launched in 1984, it continues collecting data through the time frame of this paper. Thus, it provides an invaluable link to the past history of the land features of the Earth's surface, and it becomes imperative to provide an accurate radiometric calibration of the reflective bands to the user community. Previous calibration has been based on information obtained from prelaunch, the onboard calibrator, vicarious calibration attempts, and cross-calibration with Landsat-7. Currently, additional data sources are available to improve this calibration. Specifically, improvements in vicarious calibration methods and development of the use of pseudoinvariant sites for trending provide two additional independent calibration sources. The use of these additional estimates has resulted in a consistent calibration approach that ties together all of the available calibration data sources. Results from this analysis indicate that a simple exponential or a constant model may be used for all bands throughout the lifetime of Landsat-5 TM. Where previously time constants for the exponential models were approximately one year, the updated model has significantly longer time constants in bands 1-3. In contrast, bands 4, 5, and 7 are shown to be best modeled by a constant. The models proposed in this paper indicate calibration knowledge of 5% or better early in life, decreasing to nearly 2% later in life. These models have been implemented at the U.S. Geological Survey Earth Resources Observation and Science (EROS) and are the default calibration used for all Landsat TM data now distributed through EROS. © 2008 IEEE.
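The gain model described (an exponential decay toward a constant, or a pure constant for bands 4, 5, and 7) can be fit to a series of calibration estimates as in this generic sketch; the time series below is a placeholder, not actual Landsat-5 calibration data.

    import numpy as np
    from scipy.optimize import curve_fit

    def gain_model(t_years, g_inf, g_decay, tau):
        """Detector gain versus time since launch: constant plus decaying exponential."""
        return g_inf + g_decay * np.exp(-t_years / tau)

    # Placeholder calibration history (years since launch, relative gain estimates).
    t = np.array([0.5, 2, 5, 8, 12, 16, 20])
    g = np.array([1.00, 0.96, 0.92, 0.90, 0.89, 0.885, 0.88])

    params, cov = curve_fit(gain_model, t, g, p0=(0.88, 0.12, 3.0))
    print(dict(zip(["g_inf", "g_decay", "tau"], params)))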
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shehabi, Arman; Ganeshalingam, Mohan; DeMates, Lauren
Laboratories are estimated to be 3-5 times more energy intensive than typical office buildings and offer significant opportunities for energy use reductions. Although energy intensity varies widely, laboratories are generally energy intensive due to ventilation requirements, the research instruments used, and other health and safety concerns. Because the requirements of laboratory facilities differ so dramatically from those of other buildings, a clear need exists for an initiative exclusively targeting these facilities. The building stock of laboratories in the United States spans different economic sectors, includes governmental and academic institutions, and is often defined differently by different groups. Information on laboratory buildings is often limited to a small subsection of the total building stock, making aggregate estimates of the total U.S. laboratories and their energy use challenging. Previous estimates of U.S. laboratory space vary widely owing to differences in how laboratories are defined and categorized. A 2006 report on fume hoods provided an estimate of 150,000 laboratories populating the U.S. based in part on interviews of industry experts; however, a 2009 analysis of the 2003 Commercial Buildings Energy Consumption Survey (CBECS) generated an estimate of only 9,000 laboratory buildings. This report draws on multiple data sources that have been evaluated to construct an understanding of U.S. laboratories across different sizes and market segments. This 2016 analysis is an update to draft reports released in October and December 2016.
Progress in navigation filter estimate fusion and its application to spacecraft rendezvous
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
1994-01-01
A new derivation of an algorithm which fuses the outputs of two Kalman filters is presented within the context of previous research in this field. Unlike other works, this derivation clearly shows the combination of estimates to be optimal, minimizing the trace of the fused covariance matrix. The algorithm assumes that the filters use identical models, and are stable and operating optimally with respect to their own local measurements. Evidence is presented which indicates that the error ellipsoid derived from the covariance of the optimally fused estimate is contained within the intersections of the error ellipsoids of the two filters being fused. Modifications which reduce the algorithm's data transmission requirements are also presented, including a scalar gain approximation, a cross-covariance update formula which employs only the two contributing filters' autocovariances, and a form of the algorithm which can be used to reinitialize the two Kalman filters. A sufficient condition for using the optimally fused estimates to periodically reinitialize the Kalman filters in this fashion is presented and proved as a theorem. When these results are applied to an optimal spacecraft rendezvous problem, simulated performance results indicate that the use of optimally fused data leads to significantly improved robustness to initial target vehicle state errors. The following applications of estimate fusion methods to spacecraft rendezvous are also described: state vector differencing, and redundancy management.
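For comparison, the simplest covariance-weighted fusion of two estimates, assuming uncorrelated errors (the paper's algorithm additionally handles the cross-covariance and the reduced-transmission variants), is:

    import numpy as np

    def fuse_estimates(x1, P1, x2, P2):
        """Precision-weighted fusion of two estimates with covariances P1 and P2."""
        P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
        P_fused = np.linalg.inv(P1_inv + P2_inv)
        x_fused = P_fused @ (P1_inv @ x1 + P2_inv @ x2)
        return x_fused, P_fused

    # Two filters reporting two-state estimates with diagonal covariances.
    x1, P1 = np.array([1.0, 0.2]), np.diag([0.04, 0.09])
    x2, P2 = np.array([1.1, 0.1]), np.diag([0.09, 0.04])
    print(fuse_estimates(x1, P1, x2, P2))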
Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis
2014-01-01
Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood insurance studies, and flood-plain management. Flood-frequency estimates are particularly important in densely populated urban areas. The U.S. Geological Survey (USGS) used a multistate approach to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina (Feaster and others, 2014). The multistate approach has the advantage over a single-state approach of increasing the number of streamflow-gaging stations (streamgages) available for analysis, expanding the geographical coverage that would allow for application of regional regression equations across state boundaries, and building on a previous flood-frequency investigation of rural streamgages in the Southeastern United States. This investigation was funded as part of a cooperative program of water-resources investigations between the USGS, the South Carolina Department of Transportation, and the North Carolina Department of Transportation. In addition, much of the data and information for the Georgia streamgages was funded through a similar cooperative program with the Georgia Department of Transportation.
New Methodology for Natural Gas Production Estimates
2010-01-01
A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.
Kalman filter for statistical monitoring of forest cover across sub-continental regions
Raymond L. Czaplewski
1991-01-01
The Kalman filter is a multivariate generalization of the composite estimator which recursively combines a current direct estimate with a past estimate that is updated for expected change over time with a prediction model. The Kalman filter can estimate proportions of different cover types for sub-continental regions each year. A random sample of high-resolution...
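A scalar sketch of the predict/update cycle described above: a prediction model projects last year's cover-type proportion forward, and the projection is blended with the current year's direct survey estimate in proportion to their variances. All parameter names and values are hypothetical.

def composite_update(prev_est, prev_var, direct_est, direct_var,
                     expected_change=0.0, model_var=0.0):
    """One Kalman-style cycle for a cover-type proportion.
    Predict: project the previous estimate forward with an expected-change model.
    Update:  blend the prediction with the current direct estimate,
             weighting each by the inverse of its variance."""
    pred = prev_est + expected_change          # prediction step
    pred_var = prev_var + model_var            # prediction uncertainty grows
    gain = pred_var / (pred_var + direct_var)  # weight on the new direct estimate
    est = pred + gain * (direct_est - pred)
    var = (1.0 - gain) * pred_var
    return est, var

# Hypothetical example: forest-cover proportion with a slight expected annual loss.
est, var = composite_update(prev_est=0.42, prev_var=0.0004,
                            direct_est=0.40, direct_var=0.0009,
                            expected_change=-0.003, model_var=0.0001)
print(round(est, 4), round(var, 6))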
Meier, G; Gregg, M; Poulsen Nautrup, B
2015-01-01
To update an earlier evaluation estimating the cost-effectiveness of quadrivalent influenza vaccination (QIV) compared with trivalent influenza vaccination (TIV) in the adult population currently recommended for influenza vaccination in the UK (all people aged ≥65 years and people aged 18-64 years with clinical risk conditions). This analysis takes into account updated vaccine prices, reference costs, influenza strain circulation, and burden of illness data. A lifetime, multi-cohort, static Markov model was constructed with seven age groups. The model was run in 1-year cycles for a lifetime, i.e., until the youngest patients at entry reached the age of 100 years. The base-case analysis was from the perspective of the UK National Health Service, with a secondary analysis from the societal perspective. Costs and benefits were discounted at 3.5%. Herd effects were not included. Inputs were derived from systematic reviews, peer-reviewed articles, and government publications and databases. One-way and probabilistic sensitivity analyses were performed. In the base-case, QIV would be expected to avoid 1,413,392 influenza cases, 41,780 hospitalizations, and 19,906 deaths over the lifetime horizon, compared with TIV. The estimated incremental cost-effectiveness ratio (ICER) was £14,645 per quality-adjusted life-year (QALY) gained. From the societal perspective, the estimated ICER was £13,497/QALY. A strategy of vaccinating only people aged ≥65 years had an estimated ICER of £11,998/QALY. Sensitivity analysis indicated that only two parameters, seasonal variation in influenza B matching and influenza A circulation, had a substantial effect on the ICER. QIV would be likely to be cost-effective compared with TIV in 68% of simulations with a willingness-to-pay threshold of <£20,000/QALY and 87% with a willingness-to-pay threshold of <£30,000/QALY. In this updated analysis, QIV was estimated to be cost-effective compared with TIV in the U.K.
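The headline figure in such analyses is the incremental cost-effectiveness ratio (incremental discounted cost divided by incremental discounted QALYs). A minimal sketch of that arithmetic, using the 3.5% discount rate noted above and invented cost and QALY streams, is shown below; it is not the published Markov model.

def discounted_total(stream, rate=0.035):
    """Present value of a yearly stream, discounting from year 0."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(stream))

# Hypothetical per-person yearly costs (£) and QALYs for the two strategies.
cost_qiv, qaly_qiv = [9.5, 9.5, 9.5], [0.801, 0.792, 0.783]
cost_tiv, qaly_tiv = [7.0, 7.0, 7.0], [0.800, 0.790, 0.780]

delta_cost = discounted_total(cost_qiv) - discounted_total(cost_tiv)
delta_qaly = discounted_total(qaly_qiv) - discounted_total(qaly_tiv)
print(f"ICER = £{delta_cost / delta_qaly:,.0f} per QALY gained")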
Estimating tag loss of the Atlantic Horseshoe crab, Limulus polyphemus, using a multi-state model
Butler, Catherine Alyssa; McGowan, Conor P.; Grand, James B.; Smith, David
2012-01-01
The Atlantic Horseshoe crab, Limulus polyphemus, is a valuable resource along the Mid-Atlantic coast which has, in recent years, experienced new management paradigms due to increased concern about this species' role in the environment. While current management actions are underway, many acknowledge the need for improved and updated parameter estimates to reduce the uncertainty within the management models. Specifically, updated and improved estimates of demographic parameters, such as adult crab survival in the regional population of interest, Delaware Bay, could greatly enhance these models and improve management decisions. There is, however, some concern that difficulties in tag resighting or complete loss of tags could be occurring. As is apparent from the assumptions of the Jolly-Seber model, tag loss can bias estimates and lead to underestimated survival rates. Given that uncertainty, as a first step toward obtaining an unbiased estimate of adult survival, we estimated the rate of tag loss. Using data from a double-tag mark-resight study conducted in Delaware Bay and Program MARK, we designed a multi-state model to allow for the estimation of mortality of each tag separately and simultaneously.
Response Surface Model (RSM)-based Benefit Per Ton Estimates
The tables below are updated versions of the tables appearing in The influence of location, source, and emission type in estimates of the human health benefits of reducing a ton of air pollution (Fann, Fulcher and Hubbell 2009).
Ladtap XL Version 2017: A Spreadsheet For Estimating Dose Resulting From Aqueous Releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minter, K.; Jannik, T.
LADTAP XL© is an EXCEL© spreadsheet used to estimate dose to offsite individuals and populations resulting from routine and accidental releases of radioactive materials to the Savannah River. LADTAP XL© contains two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including external exposure resulting from recreational activities on the Savannah River and internal exposure resulting from ingestion of water, fish, and invertebrates originating from the Savannah River. IRRIDOSE estimates offsite dose to individuals and populations from irrigation of foodstuffs with contaminated water from the Savannah River. In 2004, a complete description of the LADTAP XL© code and an associated user’s manual was documented in LADTAP XL©: A Spreadsheet for Estimating Dose Resulting from Aqueous Release (WSRC-TR-2004-00059), and revised input parameters, dose coefficients, and radionuclide decay constants were incorporated into LADTAP XL© Version 2013 (SRNL-STI-2011-00238). LADTAP XL© Version 2017 is a slight modification to Version 2013, with minor changes made for more user-friendly parameter inputs and organization, updates to the time conversion factors used within the dose calculations, and a fix for an issue with the expected time build-up parameter referenced within the population shoreline dose calculations. This manual has been produced to update the code description, document verification of the models, and provide an updated user’s manual. LADTAP XL© Version 2017 has been verified by Minter (2017) and is ready for use at the Savannah River Site (SRS).
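LADTAP XL©'s pathway models are documented in the cited reports; purely as a generic illustration of how an ingestion-pathway dose is assembled (concentration times intake times dose coefficient), and not as the spreadsheet's actual worksheet logic, a sketch with hypothetical values follows.

# Generic ingestion-dose bookkeeping (illustrative only, not LADTAP XL's models).
concentration_bq_per_l = 0.5      # hypothetical radionuclide concentration in river water
intake_l_per_year      = 730.0    # hypothetical drinking-water intake for the exposed individual
dose_coeff_sv_per_bq   = 1.8e-11  # hypothetical ingestion dose coefficient

dose_sv = concentration_bq_per_l * intake_l_per_year * dose_coeff_sv_per_bq
print(f"annual ingestion dose ≈ {dose_sv * 1e6:.3f} µSv")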
This analysis updates EPA's standard VSL estimate by using a more comprehensive collection of VSL studies that include studies published between 1992 and 2000, as well as applying a more appropriate statistical method. We provide a pooled effect VSL estimate by applying the empi...
Eye movement sequence generation in humans: Motor or goal updating?
Quaia, Christian; Joiner, Wilsaan M.; FitzGibbon, Edmond J.; Optican, Lance M.; Smith, Maurice A.
2011-01-01
Saccadic eye movements are often grouped in pre-programmed sequences. The mechanism underlying the generation of each saccade in a sequence is currently poorly understood. Broadly speaking, two alternative schemes are possible: first, after each saccade the retinotopic location of the next target could be estimated, and an appropriate saccade could be generated. We call this the goal updating hypothesis. Alternatively, multiple motor plans could be pre-computed, and they could then be updated after each movement. We call this the motor updating hypothesis. We used McLaughlin’s intra-saccadic step paradigm to artificially create a condition under which these two hypotheses make discriminable predictions. We found that, when human subjects plan sequences of two saccades, the motor updating hypothesis predicts the landing position of the second saccade much better than the goal updating hypothesis. This finding suggests that the human saccadic system is capable of executing sequences of saccades to multiple targets by planning multiple motor commands, which are then updated by serial subtraction of ongoing motor output. PMID:21191134
Update to core reporting practices in structural equation modeling.
Schreiber, James B
This paper is a technical update to "Core Reporting Practices in Structural Equation Modeling." As such, the content covered in this paper includes sample size, missing data, specification and identification of models, estimation method choices, fit and residual concerns, nested, alternative, and equivalent models, and unique issues within the SEM family of techniques. Copyright © 2016 Elsevier Inc. All rights reserved.
Virginia, 2012 - forest inventory and analysis factsheet
Anita K. Rose
2014-01-01
This science update is a brief look at some of the basic metrics that describe the status of and changes in forest resources in Virginia. Estimates presented here are for the measurement year 2012. Information for the factsheets is updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year 20 percent of the sample plots (one panel)...
Virginia, 2011 forest inventory and analysis factsheet
Anita K. Rose
2013-01-01
This science update is a brief look at some of the basic metrics that describe the status and trends of forest resources in Virginia. Estimates presented here are for the measurement year 2011. Information for the factsheets is updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Each year 20 percent of the sample plots (one panel) in...
Virginia, 2010 forest inventory and analysis factsheet
Anita K. Rose
2012-01-01
This science update is a brief look at some of the basic metrics that describe the status of forest resources in Virginia. Estimates presented here are for the measurement year 2010. Information for this factsheet is updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Virginia has about 4,600 sample plots across the State and each year...
Virginia, 2009 forest inventory and analysis factsheet
Anita K. Rose
2011-01-01
This science update is a brief look at some of the basic metrics that describe forest resources in Virginia. Estimates presented here are for the measurement year 2009. Information for the factsheet is updated by means of the Forest Inventory and Analysis (FIA) annualized sample design. Virginia has about 4,600 sample plots across the State, and each year 20 percent of...
Update on 2005-06 State Financial Aid Program Activity and 2006-07 Estimates
ERIC Educational Resources Information Center
Washington Higher Education Coordinating Board, 2006
2006-01-01
The state of Washington is committed to higher education opportunity for all students, regardless of income, through its state financial aid programs. The purpose of this report is to provide the members of the Higher Education Coordinating Board (HECB) with an overview of state and federal financial aid in Washington, an update on state financial…
Preliminary results from a method to update timber resource statistics in North Carolina
Glenn P. Catts; Noel D. Cost; Raymond L. Czaplewski; Paul W. Snook
1987-01-01
Forest Inventory and Analysis units of the USDA Forest Service produce timber resource statistics every 8 to 10 years. Midcycle surveys are often performed to update inventory estimates. This requires timely identification of forest lands. There are several kinds of remotely sensed data that are suitable for this purpose. Medium scale color infrared aerial photography...
Turnaround operations analysis for OTV. Volume 2: Detailed technical report
NASA Technical Reports Server (NTRS)
1988-01-01
The objectives and accomplishments were to adapt and apply the newly created database of Shuttle/Centaur ground operations. Previously defined turnaround operations analyses were to be updated for ground-based OTVs (GBOTVs) and space-based OTVs (SBOTVs), design requirements identified for both OTV and Space Station accommodations hardware, turnaround operations costs estimated, and a technology development plan generated to develop the required capabilities. Technical and programmatic data were provided to NASA pertinent to OTV ground and space operations requirements, turnaround operations, task descriptions, timelines and manpower requirements, OTV modular design and booster and Space Station interface requirements, the SBOTV accommodations development schedule, cost and turnaround operations requirements, and a technology development plan for ground and space operations and space-based accommodations facilities and support equipment. Significant conclusions are discussed.
Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-04-01
The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
Direct coal liquefaction baseline design and system analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-04-01
The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
Observed cloud reflectivities and liquid water paths: An update
NASA Technical Reports Server (NTRS)
Coakley, James A., Jr.; Snider, Jack B.
1990-01-01
The FIRE microwave radiometer observations of liquid water path from San Nicolas Island and simultaneous NOAA AVHRR observations of cloud reflectivity were used to test a relationship between cloud liquid water path and cloud reflectivity that is often used in general circulation climate models (Stephens, 1978). The results of attempts to improve the data analysis which was described at the previous FIRE Science Team Workshop and elsewhere (Coakley and Snider, 1989) are reported. The improvements included the analysis of additional satellite passes over San Nicolas and sensitivity studies to estimate the effects on the observed reflectivities due to: (1) nonzero surface reflectivities beneath the clouds; (2) the anisotropy of the reflected radiances observed by the AVHRR; (3) small scale spatial structure in the liquid water path; and (4) adjustments to the calibration of AVHRR.
Divisions of geologic time-major chronostratigraphic and geochronologic units
2010-01-01
Effective communication in the geosciences requires consistent uses of stratigraphic nomenclature, especially divisions of geologic time. A geologic time scale is composed of standard stratigraphic divisions based on rock sequences and is calibrated in years. Over the years, the development of new dating methods and the refinement of previous methods have stimulated revisions to geologic time scales. Advances in stratigraphy and geochronology require that any time scale be periodically updated. Therefore, Divisions of Geologic Time, which shows the major chronostratigraphic (position) and geochronologic (time) units, is intended to be a dynamic resource that will be modified to include accepted changes of unit names and boundary age estimates. This fact sheet is a modification of USGS Fact Sheet 2007-3015 by the U.S. Geological Survey Geologic Names Committee.
Novel approach to improve the attitude update rate of a star tracker.
Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong
2018-03-05
The star tracker is widely used in attitude control systems of spacecraft for attitude measurement. The attitude update rate of a star tracker is important to guarantee attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic Rolling Shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of the star tracker at different times. An attitude estimation method using a single star spot is developed to realize multiple attitude updates from one star image, so as to reach a high update rate. Simulations and experiments were performed to verify the proposed approaches. The test results demonstrate that the proposed approach is effective and that the attitude update rate of a star tracker is increased significantly.
Emergent constraints for aerosol indirect effects
NASA Astrophysics Data System (ADS)
Wang, M.; Zhang, S.; Gong, C.; Ghan, S. J.
2016-12-01
Methane Emissions in the U.S. GHG Inventory
NASA Astrophysics Data System (ADS)
Weitz, M.
2017-12-01
Methane in the U.S. GHG Inventory The EPA's annual Inventory of U.S. Greenhouse Gas Emissions and Sinks (GHG Inventory) includes detailed national estimates of anthropogenic methane emissions. In recent years, new data have become available on methane emissions across a number of anthropogenic sources in the U.S. The GHG Inventory has incorporated newly available data and includes updated emissions estimates from a number of categories. This presentation will discuss the latest GHG Inventory results, including results for the oil and gas, waste, and agriculture sectors. The presentation will also discuss key areas for research, and processes for updating data in the GHG Inventory.
Adaptive Metropolis Sampling with Product Distributions
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Lee, Chiu Fan
2005-01-01
The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution π(x). It works by repeatedly sampling a separate proposal distribution T(x, x') to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the samples {x(t') : t' < t} to estimate the product distribution that has the least Kullback-Leibler distance to π. That estimate is the information-theoretically optimal mean-field approximation to π. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
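A minimal sketch of the idea, in which the proposal is an independent (product-of-marginals) Gaussian refit to the walk history; the paper's KL-optimal product update is more elaborate, continuous adaptation is kept here only for brevity, and the target below is hypothetical.

import numpy as np

def adaptive_mh(log_pi, x0, n_steps=5000, adapt_every=100, rng=None):
    """Metropolis-Hastings with an independence proposal that is a product of
    per-dimension Gaussians refit to the walk history (a mean-field surrogate).
    A production sampler would freeze or diminish adaptation to preserve ergodicity."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d = x.size
    mu, sigma = np.zeros(d), np.ones(d)      # initial proposal parameters
    samples = [x.copy()]
    for t in range(1, n_steps):
        x_prop = mu + sigma * rng.standard_normal(d)

        def log_q(z):
            # Log-density of the product-of-Gaussians proposal.
            return -0.5 * np.sum(((z - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))

        # Independence-sampler acceptance ratio uses the proposal density at both points.
        log_alpha = (log_pi(x_prop) - log_pi(x)) + (log_q(x) - log_q(x_prop))
        if np.log(rng.random()) < log_alpha:
            x = x_prop
        samples.append(x.copy())
        if t % adapt_every == 0:             # refit the product proposal to the history
            hist = np.array(samples)
            mu, sigma = hist.mean(axis=0), hist.std(axis=0) + 1e-6
    return np.array(samples)

# Usage: sample a correlated 2-D Gaussian target (hypothetical).
log_pi = lambda z: -0.5 * (z[0] ** 2 + (z[1] - 0.5 * z[0]) ** 2)
chain = adaptive_mh(log_pi, x0=[0.0, 0.0])
print(chain.mean(axis=0))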
Global cost of child survival: estimates from country-level validation
van Ekdom, Liselore; Scherpbier, Robert W; Niessen, Louis W
2011-01-01
Objective: To cross-validate the global cost of scaling up child survival interventions to achieve the fourth Millennium Development Goal (MDG4) as estimated by the World Health Organization (WHO) in 2007, by using the latest country-provided data and new assumptions. Methods: After the main cost categories for each country were identified, validation questionnaires were sent to 32 countries with high child mortality. Publicly available estimates for disease incidence, intervention coverage, prices and resources for individual-level and programme-level activities were validated against local data. Nine updates to the 2007 WHO model were generated using revised assumptions. Finally, estimates were extrapolated to 75 countries and combined with cost estimates for immunization and malaria programmes and for programmes for the prevention of mother-to-child transmission of the human immunodeficiency virus (HIV). Findings: Twenty-six countries responded. Adjustments were largest for system- and programme-level data and smallest for patient data. Country-level validation caused a 53% increase in the original cost estimates (i.e., 9 billion 2004 United States dollars [US$]) for 26 countries owing to revised system and programme assumptions, especially surrounding community health worker costs. The additional effect of updated population figures was small; updated epidemiologic figures increased costs by US$ 4 billion (+15%). New unit prices in the 26 countries that provided data increased estimates by US$ 4.3 billion (+16%). Extrapolation to 75 countries increased the original price estimate by US$ 33 billion (+80%) for 2010–2015. Conclusion: Country-level validation had a significant effect on the cost estimate. Price adaptations and programme-related assumptions contributed substantially. An additional US$ 74 billion (in 2005 dollars, representing a 12% increase in total health expenditure) would be needed between 2010 and 2015. Given resource constraints, countries will need to prioritize health activities within their national resource envelope. PMID:21479091
LSAT® Scores of Economics Majors: The 2015-16 Class Update and 15-Year History
ERIC Educational Resources Information Center
Nieswiadomy, Michael
2017-01-01
In this article, the author updates his prior studies of LSAT® scores (Nieswiadomy 1998, 2006, 2010, 2014) using current data for 2015-16 law school applicants, finding that economics majors remain at or near the top of all applicants. Results of the previous studies showing economics majors scored well on the LSAT® have been posted often on…
Participation in Kentucky's College Preparatory Transition Courses: An Update. REL 2017-211
ERIC Educational Resources Information Center
Flory, Michael; Cramer, Eric
2017-01-01
Kentucky offers college preparatory transition courses in math, reading, and English to grade 12 students. The courses are designed as one possible intervention for students who do not meet state college readiness benchmarks in one or more of those subjects on the ACT in grade 11. This study updates a previous Regional Educational Laboratory (REL)…
True fir spacing and yield trials—20-year update
Robert O. Curtis
2013-01-01
This report updates data and comparisons from previous reports (Curtis and others 2000, Curtis 2008) on a series of precommercial thinning and yield trials in high-elevation true fir–hemlock stands, using data from the 12 replicates for which 20-year data are now available. The stands were varying mixtures of Pacific silver fir (Abies amabilis (Douglas ex Loudon)...
ERIC Educational Resources Information Center
Harris, Robert; Phillips, Alan
A project sought to develop a means of updating and retraining those required to comply with Britain's 1985 Building Regulations, which are substantially different from the previous ones in regard to procedures and technical content. The training needs analysis conducted indicated that the new training should be flexible and use practical and…
School Officials and the Courts: Update 2001. ERS Monograph.
ERIC Educational Resources Information Center
Thompson, David P.; Hartmeister, Fredric J.
This is the 22nd in a series of yearly updates of judicial decision summaries for case law related to elementary and secondary education issues. One can use previous and future editions to track decisions on appeal or to see trends in case law. With few exceptions, the cases were selected from court decisions found in federal and regional…
School Officials and the Courts: Update 2002. ERS Monograph.
ERIC Educational Resources Information Center
Thompson, David P.; Hartmeister, Fredric J.
This is the 23rd in a series of yearly updates of judicial decision summaries for case law related to elementary- and secondary-education issues. One can use previous and future editions to track decisions on appeal or to spot trends in case law. With few exceptions, the cases were selected from court decisions found in federal and regional…
ERIC Educational Resources Information Center
Tallot, Lucille; Diaz-Mataix, Lorenzo; Perry, Rosemarie E.; Wood, Kira; LeDoux, Joseph E.; Mouly, Anne-Marie; Sullivan, Regina M.; Doyère, Valérie
2017-01-01
The updating of a memory is triggered whenever it is reactivated and a mismatch from what is expected (i.e., prediction error) is detected, a process that can be unraveled through the memory's sensitivity to protein synthesis inhibitors (i.e., reconsolidation). As noted in previous studies, in Pavlovian threat/aversive conditioning in adult rats,…
Realistic decision-making processes in a vaccination game
NASA Astrophysics Data System (ADS)
Iwamura, Yoshiro; Tanimoto, Jun
2018-03-01
Previous studies of vaccination games have nearly always assumed a pairwise comparison between a focal and a neighboring player for the strategy updating rule, which comes from numerous compiled studies on spatial versions of 2-player and 2-strategy (2 × 2) games such as the spatial prisoner's dilemma (SPD). In this study, we propose new update rules because the human decision-making process of whether to commit to a vaccination is obviously influenced by a "sense of crisis" or "fear" urging individuals toward vaccination, lest they be infected. The rule assumes that an agent evaluates whether to get a vaccination or to attempt a free ride based on observations of whether neighboring non-vaccinators were able to successfully free ride during the previous time-step. Compared to the conventional updating rule (standard pairwise comparison assuming a Fermi function), the new rules generally realize higher vaccination coverage and smaller final epidemic sizes. One rule in particular shows very good performance, with significantly smaller epidemic sizes despite comparable levels of vaccination coverage. This is because this specific update rule helps vaccinators spread widely in the domain, which effectively hampers the spread of epidemics.
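For reference, the conventional pairwise-comparison rule that the new rules are compared against is typically a Fermi function of the payoff difference; a minimal sketch (payoff values hypothetical) follows.

import numpy as np

def fermi_adopt_probability(payoff_focal, payoff_neighbor, kappa=0.1):
    """Conventional pairwise-comparison (Fermi) update: the probability that the
    focal agent copies a neighbor's strategy rises with the neighbor's payoff
    advantage; kappa sets the selection noise."""
    return 1.0 / (1.0 + np.exp((payoff_focal - payoff_neighbor) / kappa))

# e.g., a non-vaccinator who got infected (payoff -1.0) observing a vaccinated
# neighbor who only paid the vaccine cost (payoff -0.4):
print(fermi_adopt_probability(-1.0, -0.4))   # high probability of switching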
Societal costs of traffic crashes and crime in Michigan : 2011 update.
DOT National Transportation Integrated Search
2011-06-01
"Cost estimates, including both monetary and nonmonetary quality-of-life costs specific to Michigan, were : estimated for overall traffic crashes and index crimes by experts in the field of economics of traffic crashes : and crimes. These cost estima...
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and Electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
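As an illustration of the kind of output metric GEOPHIRES reports, the sketch below computes a generic levelized cost of energy as discounted lifetime costs over discounted lifetime energy; it is not GEOPHIRES' internal cost correlations, and all plant parameters are hypothetical. (Python is used, matching the updated tool's source language.)

def lcoe(capital_cost, annual_om_cost, annual_energy_kwh, lifetime_years, discount_rate=0.07):
    """Generic levelized cost of energy: discounted lifetime costs divided by
    discounted lifetime energy production."""
    disc = [(1.0 + discount_rate) ** -t for t in range(1, lifetime_years + 1)]
    costs = capital_cost + annual_om_cost * sum(disc)
    energy = annual_energy_kwh * sum(disc)
    return costs / energy

# Hypothetical 20 MW plant: $80M capital, $3M/yr O&M, 90% capacity factor, 30-year life.
print(f"LCOE ≈ ${lcoe(80e6, 3e6, 20e3 * 8760 * 0.9, 30):.3f}/kWh")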
NASA Astrophysics Data System (ADS)
Wang, Xing; Hill, Thomas L.; Neild, Simon A.; Shaw, Alexander D.; Haddad Khodaparast, Hamed; Friswell, Michael I.
2018-02-01
This paper proposes a model updating strategy for localised nonlinear structures. It utilises an initial finite-element (FE) model of the structure and primary harmonic response data taken from low and high amplitude excitations. The underlying linear part of the FE model is first updated using low-amplitude test data with established techniques. Then, using this linear FE model, the nonlinear elements are localised, characterised, and quantified with primary harmonic response data measured under stepped-sine or swept-sine excitations. Finally, the resulting model is validated by comparing the analytical predictions with both the measured responses used in the updating and with additional test data. The proposed strategy is applied to a clamped beam with a nonlinear mechanism and good agreements between the analytical predictions and measured responses are achieved. Discussions on issues of damping estimation and dealing with data from amplitude-varying force input in the updating process are also provided.
Automated wind load characterization of wind turbine structures by embedded model updating
NASA Astrophysics Data System (ADS)
Swartz, R. Andrew; Zimmerman, Andrew T.; Lynch, Jerome P.
2010-04-01
The continued development of renewable energy resources is essential for the nation to limit its carbon footprint and to enjoy independence in energy production. Key to that effort are reliable generators of renewable energy that are economically competitive with legacy sources. In the area of wind energy, a major contributor to the cost of implementation is large uncertainty regarding the condition of wind turbines in the field, due to a lack of information about loading, dynamic response, and the fatigue life expended by the structure. Under favorable circumstances, this uncertainty leads to overly conservative designs and maintenance schedules. Under unfavorable circumstances, it leads to inadequate maintenance schedules, damage to electrical systems, or even structural failure. Low-cost wireless sensors can provide more certainty for stakeholders by measuring the dynamic response of the structure to loading, estimating the fatigue state of the structure, and extracting loading information from the structural response without the need for an upwind instrumentation tower. This study presents a method for using wireless sensor networks to estimate the spectral properties of wind turbine tower loading based on the measured response and some rudimentary knowledge of the structure. Structural parameters are estimated via model updating in the frequency domain to produce an identification of the system. The updated structural model and the measured output spectra are then used to estimate the input spectra. Laboratory results are presented indicating accurate load characterization.
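A single-degree-of-freedom sketch of the final step described above: once the structural model is updated, the load spectrum is recovered from the measured response spectrum through the model's frequency response, S_uu(f) ≈ S_yy(f) / |H(f)|^2. The surrogate system, sensor rate, and forcing below are all hypothetical.

import numpy as np
from scipy import signal

# Hypothetical single-DOF surrogate for a turbine tower's first bending mode.
m, c, k = 2.0e5, 3.0e4, 8.0e6          # mass (kg), damping (N·s/m), stiffness (N/m)
fs = 50.0                              # sampling rate of the wireless sensors (Hz)

def frf(f):
    """Displacement-per-force frequency response H(f) of the updated model."""
    w = 2.0 * np.pi * f
    return 1.0 / (k - m * w**2 + 1j * c * w)

# Simulate a measured response record (in practice this comes from the sensors).
rng = np.random.default_rng(0)
t = np.arange(0, 600, 1 / fs)
force = 1.0e4 * rng.standard_normal(t.size)            # broadband wind-like load
sys = signal.TransferFunction([1.0], [m, c, k])
_, y, _ = signal.lsim(sys, force, t)

# Estimate the load spectrum from the response spectrum and the model FRF.
f, S_yy = signal.welch(y, fs=fs, nperseg=4096)
S_uu = S_yy / np.abs(frf(f))**2
print(f"estimated load PSD near 1 Hz: {S_uu[np.argmin(np.abs(f - 1.0))]:.3e} N^2/Hz")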
Monitoring bald eagles using lists of nests: Response to Watts and Duerr
Sauer, John R.; Otto, Mark C.; Kendall, William L.; Zimmerman, Guthrie S.
2011-01-01
The post-delisting monitoring plan for bald eagles (Haliaeetus leucocephalus) proposed use of a dual-frame sample design, in which sampling of known nest sites in combination with additional area-based sampling is used to estimate the total number of nesting bald eagle pairs. Watts and Duerr (2010) used data from repeated observations of bald eagle nests in Virginia, USA, to estimate a nest turnover rate and used this rate to simulate the decline over time in the number of occupied nests included in the list frame. Results of Watts and Duerr suggest that, given the rates of loss of nests from the list of known nest sites in Virginia, the list information will be of little value to sampling unless lists are constantly updated. Those authors criticize the plan for not placing sufficient emphasis on updating and maintaining lists of bald eagle nests. Watts and Duerr's metric of turnover rate does not distinguish detectability or temporary nonuse of nests from permanent loss of nests and likely overestimates the turnover rate. We describe a multi-state capture–recapture model that allows appropriate estimation of rates of loss of nests, and we use the model to estimate rates of loss from a sample of nests from Maine, USA. The post-delisting monitoring plan addresses the need to maintain and update the lists of nests, and we show that dual-frame sampling is an effective approach for sampling nesting bald eagle populations.
Improved estimates of ocean heat content from 1960 to 2015.
Cheng, Lijing; Trenberth, Kevin E; Fasullo, John; Boyer, Tim; Abraham, John; Zhu, Jiang
2017-03-01
Earth's energy imbalance (EEI) drives the ongoing global warming and can best be assessed across the historical record (that is, since 1960) from ocean heat content (OHC) changes. An accurate assessment of OHC is a challenge, mainly because of insufficient and irregular data coverage. We provide updated OHC estimates with the goal of minimizing associated sampling error. We performed a subsample test, in which subsets of data during the data-rich Argo era are colocated with locations of earlier ocean observations, to quantify this error. Our results provide a new OHC estimate with an unbiased mean sampling error and with variability on decadal and multidecadal time scales (signal) that can be reliably distinguished from sampling error (noise) with signal-to-noise ratios higher than 3. The inferred integrated EEI is greater than that reported in previous assessments and is consistent with a reconstruction of the radiative imbalance at the top of atmosphere starting in 1985. We found that changes in OHC are relatively small before about 1980; since then, OHC has increased fairly steadily and, since 1990, has increasingly involved deeper layers of the ocean. In addition, OHC changes in six major oceans are reliable on decadal time scales. All ocean basins examined have experienced significant warming since 1998, with the greatest warming in the southern oceans, the tropical/subtropical Pacific Ocean, and the tropical/subtropical Atlantic Ocean. This new look at OHC and EEI changes over time provides greater confidence than previously possible, and the data sets produced are a valuable resource for further study.
Improved estimates of ocean heat content from 1960 to 2015
Cheng, Lijing; Trenberth, Kevin E.; Fasullo, John; Boyer, Tim; Abraham, John; Zhu, Jiang
2017-01-01
Earth’s energy imbalance (EEI) drives the ongoing global warming and can best be assessed across the historical record (that is, since 1960) from ocean heat content (OHC) changes. An accurate assessment of OHC is a challenge, mainly because of insufficient and irregular data coverage. We provide updated OHC estimates with the goal of minimizing associated sampling error. We performed a subsample test, in which subsets of data during the data-rich Argo era are colocated with locations of earlier ocean observations, to quantify this error. Our results provide a new OHC estimate with an unbiased mean sampling error and with variability on decadal and multidecadal time scales (signal) that can be reliably distinguished from sampling error (noise) with signal-to-noise ratios higher than 3. The inferred integrated EEI is greater than that reported in previous assessments and is consistent with a reconstruction of the radiative imbalance at the top of atmosphere starting in 1985. We found that changes in OHC are relatively small before about 1980; since then, OHC has increased fairly steadily and, since 1990, has increasingly involved deeper layers of the ocean. In addition, OHC changes in six major oceans are reliable on decadal time scales. All ocean basins examined have experienced significant warming since 1998, with the greatest warming in the southern oceans, the tropical/subtropical Pacific Ocean, and the tropical/subtropical Atlantic Ocean. This new look at OHC and EEI changes over time provides greater confidence than previously possible, and the data sets produced are a valuable resource for further study. PMID:28345033
Improved estimates of ocean heat content from 1960 to 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Lijing; Trenberth, Kevin E.; Fasullo, John
Earth’s energy imbalance (EEI) drives the ongoing global warming and can best be assessed across the historical record (that is, since 1960) from ocean heat content (OHC) changes. An accurate assessment of OHC is a challenge, mainly because of insufficient and irregular data coverage. We provide here updated OHC estimates with the goal of minimizing associated sampling error. We performed a subsample test, in which subsets of data during the data-rich Argo era are colocated with locations of earlier ocean observations, to quantify this error. Our results provide a new OHC estimate with an unbiased mean sampling error and with variability on decadal and multidecadal time scales (signal) that can be reliably distinguished from sampling error (noise) with signal-to-noise ratios higher than 3. The inferred integrated EEI is greater than that reported in previous assessments and is consistent with a reconstruction of the radiative imbalance at the top of atmosphere starting in 1985. We found that changes in OHC are relatively small before about 1980; since then, OHC has increased fairly steadily and, since 1990, has increasingly involved deeper layers of the ocean. In addition, OHC changes in six major oceans are reliable on decadal time scales. All ocean basins examined have experienced significant warming since 1998, with the greatest warming in the southern oceans, the tropical/subtropical Pacific Ocean, and the tropical/subtropical Atlantic Ocean. This new look at OHC and EEI changes over time provides greater confidence than previously possible, and the data sets produced are a valuable resource for further study.
Improved estimates of ocean heat content from 1960 to 2015
Cheng, Lijing; Trenberth, Kevin E.; Fasullo, John; ...
2017-03-10
Earth’s energy imbalance (EEI) drives the ongoing global warming and can best be assessed across the historical record (that is, since 1960) from ocean heat content (OHC) changes. An accurate assessment of OHC is a challenge, mainly because of insufficient and irregular data coverage. We provide here updated OHC estimates with the goal of minimizing associated sampling error. We performed a subsample test, in which subsets of data during the data-rich Argo era are colocated with locations of earlier ocean observations, to quantify this error. Our results provide a new OHC estimate with an unbiased mean sampling error and with variability on decadal and multidecadal time scales (signal) that can be reliably distinguished from sampling error (noise) with signal-to-noise ratios higher than 3. The inferred integrated EEI is greater than that reported in previous assessments and is consistent with a reconstruction of the radiative imbalance at the top of atmosphere starting in 1985. We found that changes in OHC are relatively small before about 1980; since then, OHC has increased fairly steadily and, since 1990, has increasingly involved deeper layers of the ocean. In addition, OHC changes in six major oceans are reliable on decadal time scales. All ocean basins examined have experienced significant warming since 1998, with the greatest warming in the southern oceans, the tropical/subtropical Pacific Ocean, and the tropical/subtropical Atlantic Ocean. This new look at OHC and EEI changes over time provides greater confidence than previously possible, and the data sets produced are a valuable resource for further study.
Aeroservoelastic Uncertainty Model Identification from Flight Data
NASA Technical Reports Server (NTRS)
Brenner, Martin J.
2001-01-01
Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. To date, aeroservoelastic data analysis has given insufficient attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge of this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to obtain input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.
The distance effect in numerical memory-updating tasks.
Lendínez, Cristina; Pelegrina, Santiago; Lechuga, Teresa
2011-05-01
Two experiments examined the role of numerical distance in updating numerical information in working memory. In the first experiment, participants had to memorize a new number only when it was smaller than a previously memorized number. In the second experiment, updating was based on an external signal, which removed the need to perform any numerical comparison. In both experiments, the distance between the memorized number and the new one was manipulated. The results showed that smaller distances between the new and the old information led to shorter updating times. This graded facilitation suggests that the process by which information is substituted in the focus of attention involves keeping the features shared by the new and the old number activated while selecting the new features to be activated. Thus, the updating cost may be related to the amount of new features to be activated in the focus of attention.
Fehrenbacher, L; von Pawel, J; Park, K; Rittmeyer, A; Gandara, D R; Ponce Aix, S; Han, J-Y; Gadgeel, S M; Hida, T; Cortinovis, D L; Cobo, M; Kowalski, D M; De Marinis, F; Gandhi, M; Danner, B; Matheny, C; Kowanetz, M; He, P; Felizzi, F; Patel, H; Sandler, A; Ballinger, M; Barlesi, F
2018-05-16
The efficacy and safety of atezolizumab vs docetaxel as second- or third-line treatment in patients with advanced non-small cell lung cancer in the primary (n=850; ITT850) and secondary (n=1225; ITT1225) efficacy populations of the randomized phase III OAK study at an updated data cutoff were assessed. Patients received atezolizumab 1200 mg or docetaxel 75 mg/m² intravenously every 3 weeks until loss of clinical benefit or disease progression, respectively. The primary endpoint was overall survival (OS) in the intention-to-treat (ITT) population and programmed death-ligand 1 (PD-L1)-expressing subgroup. A sensitivity analysis was conducted to evaluate the impact of subsequent immunotherapy use in the docetaxel arm on observed survival benefit with atezolizumab. Atezolizumab demonstrated OS benefit vs docetaxel in the updated ITT850 (hazard ratio [HR] 0.75; 95% CI 0.64-0.89; P = .0006) and the ITT1225 (HR 0.80; 0.70-0.92; P = .0012) after minimum follow-up of 26 and 21 months, respectively. Improved survival with atezolizumab was observed across PD-L1 and histology subgroups. The relative OS benefit with atezolizumab was slightly greater in the immunotherapy sensitivity analysis in the ITT850 (HR 0.69) and ITT1225 (HR 0.74) compared to conventional OS estimation. Fewer patients receiving atezolizumab experienced grade 3/4 treatment-related adverse events (14.9%) than those receiving docetaxel (42.4%); no grade 5 events related to atezolizumab were observed. The updated ITT850 and initial ITT1225 analyses were consistent with the primary efficacy analysis demonstrating survival benefit with atezolizumab vs docetaxel. Atezolizumab continued to demonstrate a favorable safety profile after longer treatment exposure and follow-up. Copyright © 2018. Published by Elsevier Inc.
15 CFR 90.1 - Scope and applicability.
Code of Federal Regulations, 2014 CFR
2014-01-01
... CENSUS, DEPARTMENT OF COMMERCE PROCEDURE FOR CHALLENGING POPULATION ESTIMATES § 90.1 Scope and... number of people residing in states and their governmental units. In general, these estimates are developed by updating the population counts produced in the most recent decennial census with demographic...
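The truncated sentence above appears to describe the components-of-change bookkeeping behind postcensal estimates; a minimal sketch under that assumption, with invented numbers, follows.

# Generic components-of-change bookkeeping for a postcensal population estimate
# (illustrative only; the published estimation methodology has more detail).
census_count  = 105_000   # most recent decennial census count (hypothetical)
births        = 4_200     # births since the census (hypothetical)
deaths        = 3_100     # deaths since the census (hypothetical)
net_migration = 850       # net domestic plus international migration (hypothetical)

estimate = census_count + births - deaths + net_migration
print(f"updated population estimate: {estimate:,}")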
Motor vehicle traffic crash fatality counts and estimates of people injured for 2005
DOT National Transportation Integrated Search
2006-08-22
This report updates the 2005 Projections released in April 2006, which : were based on a statistical procedure using incomplete/partial data. : This report also compares fatality counts and estimates of people : injured resulting from motor vehicle t...
Cosmic microwave background reconstruction from WMAP and Planck PR2 data
NASA Astrophysics Data System (ADS)
Bobin, J.; Sureau, F.; Starck, J.-L.
2016-06-01
We describe a new estimate of the cosmic microwave background (CMB) intensity map reconstructed by a joint analysis of the full Planck 2015 data (PR2) and nine years of WMAP data. The proposed map provides more than a mere update of the CMB map introduced in a previous paper since it benefits from an improvement of the component separation method L-GMCA (Local-Generalized Morphological Component Analysis), which facilitates efficient separation of correlated components. Based on the most recent CMB data, we further confirm previous results showing that the proposed CMB map estimate exhibits appealing characteristics for astrophysical and cosmological applications: I) it is a full-sky map as it did not require any inpainting or interpolation postprocessing; II) foreground contamination is very low even on the galactic center; and III) the map does not exhibit any detectable trace of thermal Sunyaev-Zel'dovich contamination. We show that its power spectrum is in good agreement with the Planck PR2 official theoretical best-fit power spectrum. Finally, following the principle of reproducible research, we provide the codes to reproduce the L-GMCA, which makes it the only reproducible CMB map. The reconstructed CMB map and the code are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/591/A50
Atlantic Bluefin Tuna (Thunnus thynnus) Biometrics and Condition.
Rodriguez-Marin, Enrique; Ortiz, Mauricio; Ortiz de Urbina, José María; Quelle, Pablo; Walter, John; Abid, Noureddine; Addis, Piero; Alot, Enrique; Andrushchenko, Irene; Deguara, Simeon; Di Natale, Antonio; Gatt, Mark; Golet, Walter; Karakulak, Saadet; Kimoto, Ai; Macias, David; Saber, Samar; Santos, Miguel Neves; Zarrad, Rafik
2015-01-01
The compiled data for this study represents the first Atlantic and Mediterranean-wide effort to pool all available biometric data for Atlantic bluefin tuna (Thunnus thynnus) with the collaboration of many countries and scientific groups. Biometric relationships were based on an extensive sampling (over 140,000 fish sampled), covering most of the fishing areas for this species in the North Atlantic Ocean and Mediterranean Sea. Sensitivity analyses were carried out to evaluate the representativeness of sampling and explore the most adequate procedure to fit the weight-length relationship (WLR). The selected model for the WLRs by stock included standardized data series (common measurement types) weighted by the inverse variability. There was little difference between annual stock-specific round weight-straight fork length relationships, with an overall difference of 6% in weight. The predicted weight by month was estimated as an additional component in the exponent of the weight-length function. The analyses of monthly variations of fish condition by stock, maturity state and geographic area reflect annual cycles of spawning and feeding behavior. We update and improve upon the biometric relationships for bluefin currently used by the International Commission for the Conservation of Atlantic Tunas, by incorporating substantially larger datasets than ever previously compiled, providing complete documentation of sources and employing robust statistical fitting. WLRs and other conversion factors estimated in this study differ from the ones used in previous bluefin stock assessments.
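Weight-length relationships of the form W = a·L^b are conventionally fit on a log-log scale; the sketch below shows such a fit with inverse-variance weighting, using invented data, and is far simpler than the paper's standardized, stock-specific procedure.

import numpy as np

# Hypothetical samples: straight fork length (cm) and round weight (kg),
# with a per-record weight proportional to the inverse of its variability.
length  = np.array([55, 80, 110, 150, 190, 230, 260])
weight  = np.array([3.4, 10.5, 27.0, 68.0, 140.0, 250.0, 360.0])
inv_var = np.array([1.0, 1.0, 0.8, 0.8, 0.6, 0.5, 0.4])

# Fit log(W) = log(a) + b*log(L) by weighted least squares.
# np.polyfit expects weights of the form 1/sigma, hence the square root.
b, log_a = np.polyfit(np.log(length), np.log(weight), 1, w=np.sqrt(inv_var))
a = np.exp(log_a)
print(f"W ≈ {a:.3e} * L^{b:.2f}")   # b near 3 indicates roughly isometric growth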
Real-time stereo vision-based lane detection system
NASA Astrophysics Data System (ADS)
Fan, Rui; Dahnoun, Naim
2018-07-01
The detection of multiple curved lane markings on a non-flat road surface is still a challenging task for vehicular systems. To make an improvement, depth information can be used to enhance the robustness of the lane detection systems. In this paper, a proposed lane detection system is developed from our previous work where the estimation of the dense vanishing point is further improved using the disparity information. However, the outliers in the least squares fitting severely affect the accuracy when estimating the vanishing point. Therefore, in this paper we use random sample consensus to update the parameters of the road model iteratively until the percentage of the inliers exceeds our pre-set threshold. This significantly helps the system to overcome some suddenly changing conditions. Furthermore, we propose a novel lane position validation approach which computes the energy of each possible solution and selects all satisfying lane positions for visualisation. The proposed system is implemented on a heterogeneous system which consists of an Intel Core i7-4720HQ CPU and an NVIDIA GTX 970M GPU. A processing speed of 143 fps has been achieved, which is over 38 times faster than our previous work. Moreover, in order to evaluate the detection precision, we tested 2495 frames including 5361 lanes. It is shown that the overall successful detection rate is increased from 98.7% to 99.5%.
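The iterative inlier-selection step described above follows the usual RANSAC pattern; a minimal sketch for robustly fitting a quadratic road/lane model to noisy candidate points (all thresholds and data invented) is shown below.

import numpy as np

def ransac_quadratic(x, y, n_iter=200, inlier_tol=0.5, stop_inlier_frac=0.8, rng=None):
    """Fit y = a*x^2 + b*x + c robustly: repeatedly fit minimal 3-point samples,
    count inliers, and refit on the best consensus set. Iteration stops early
    once the inlier percentage exceeds the preset threshold."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = np.zeros(x.size, dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(x.size, size=3, replace=False)
        coeffs = np.polyfit(x[idx], y[idx], 2)
        inliers = np.abs(np.polyval(coeffs, x) - y) < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            if inliers.mean() >= stop_inlier_frac:
                break
    return np.polyfit(x[best_inliers], y[best_inliers], 2), best_inliers

# Synthetic lane points with gross outliers.
rng = np.random.default_rng(1)
x = np.linspace(0, 30, 120)
y = 0.01 * x**2 - 0.2 * x + 1.5 + 0.1 * rng.standard_normal(x.size)
y[::10] += rng.uniform(3, 6, size=y[::10].size)
coeffs, inliers = ransac_quadratic(x, y)
print(np.round(coeffs, 3), f"{inliers.mean():.0%} inliers")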
Cost of rotavirus diarrhea for programmatic evaluation of vaccination in Vietnam.
Riewpaiboon, Arthorn; Shin, Sunheang; Le, Thi Phuong Mai; Vu, Dinh Thiem; Nguyen, Thi Hien Anh; Alexander, Neal; Dang, Duc Anh
2016-08-11
Rotavirus is the most common etiology of diarrhea-associated hospitalizations and clinic visits in Vietnamese children < 5 years old. To estimate the economic burden of rotavirus-associated formal healthcare encounters, an economic study was conducted. A cost-of-illness study was performed from a societal perspective. Data were collected from children below the age of five years who presented to a clinic or hospital with symptoms of acute gastroenteritis (AGE). Patient-specific information on resource use and cost was obtained through caregiver interviews and medical chart review. Costs are presented in 2014 US dollars ($). A total of 557 children with symptoms of AGE were enrolled from March through June 2009, with a mean age of 16.5 months. Of the 340 outpatients and 217 admitted patients enrolled, 41% tested rotavirus positive. From a societal perspective, the mean total cost of AGE was $175. Costs for patients with and without rotavirus were $217 and $158, respectively. Multiple regression analysis found that rotavirus infection, patient age, and receiving oral rehydration solution before visiting a health facility had significant effects on costs. This study clearly demonstrates a substantial economic burden of AGE, including rotavirus disease. These costs are significantly greater than previously reported cost estimates for Vietnam. The updated costs of illness result in more favorable vaccine cost-effectiveness than in previous economic evaluations.
Worldwide Ocean Optics Database (WOOD)
2001-09-30
The user can obtain values computed from empirical algorithms (e.g., beam attenuation estimated from diffuse attenuation and backscatter data). Error estimates will also be provided for ... properties, including diffuse attenuation, beam attenuation, and scattering. The database shall be easy to use, Internet accessible, and frequently updated.
Updating national forest inventory estimates of growing stock volume using hybrid inference
Sonia Condés; Ronald E. McRoberts
2017-01-01
International organizations increasingly require estimates of forest parameters to monitor the state of and changes in forest resources, the sustainability of forest practices and the role of forests in the carbon cycle. Most countries rely on data from their national forest inventories (NFI) to produce these estimates. However, because NFI survey years may not match...
The North American Carbon Budget Past, Present and Future
NASA Astrophysics Data System (ADS)
Hayes, D. J.; Vargas, R.; Alin, S. R.; Conant, R. T.; Hutyra, L.; Jacobson, A. R.; Kurz, W. A.; Liu, S.; McGuire, A. D.; Poulter, B.; Woodall, C. W.
2016-12-01
Scientific information quantifying and characterizing the continental-scale carbon budget is necessary for developing national and international policy on climate change. The North American continent (NA) has been considered to be a significant net source of carbon to the atmosphere, with fossil fuel emissions from the U.S., Canada and Mexico far outpacing uptake on land, inland waters and adjacent coastal oceans. As reported in the First State of the Carbon Cycle Report (SOCCR-1), the three countries combined to emit approximately 1.8 billion tons of carbon in 2003, or 27% of the global total fossil fuel inventory. Based on inventory data from various sectors, SOCCR-1 estimated a 500 MtC/yr natural sink that offset about 30% of emissions primarily through forest growth, storage in wood products and sequestration in agricultural soils. Here we present a synthesis of the NA carbon budget for the next report (SOCCR-2) based on updated inventory data and new research over the last decade. After increasing at a rate of 1% per year over the previous 30 years, the combined fossil fuel emissions from the three countries show a decreasing trend over the last decade. The decline is due to the economic recession along with increasing carbon efficiency, and the result is a lower share (20%) of the global total. Synthesizing inventory-based data from forest, agriculture and other sectors over the past decade results in a smaller estimate for terrestrial C uptake (350 MtC/yr, or about 20% of emissions) than SOCCR-1, but excludes potential sinks of highly uncertain magnitude. Estimates from atmospheric and biosphere models suggest stronger sinks on the order of 30 to 50% of emissions, but these vary widely within and across the ensembles. This updated report draws attention to key data gaps in carbon accounting frameworks and uncertainties in modeling approaches, but also highlights integrated approaches for improving our understanding of the NA carbon cycle.
Summary Analysis: Hanford Site Composite Analysis Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, W. E.; Lehman, L. L.
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
Sonja N. Oswalt; W. Brad Smith; Patrick D. Miles; Scott A. Pugh
2014-01-01
Forest resource statistics from the 2010 Resources Planning Act (RPA) Assessment were updated to provide current information on the Nation's forests as a baseline for the 2015 national assessment. Resource tables present estimates of forest area, volume, mortality, growth, removals, and timber products output in various ways, such as by ownership, region, or State...
The New Global Gapless GLASS Albedo Product from 1981 to 2014
NASA Astrophysics Data System (ADS)
Dou, B.; Liu, Q.; Qu, Y.; Wang, L.; Feng, Y.; Nie, A.; Li, X.; Zhang, J.; Niu, H.; Cai, E.; Zhao, L.
2016-12-01
Long-time-series albedo products at various spatial resolutions are needed for climate change and environmental studies at both global and regional scales. To meet these requirements, the GLASS (Global LAnd Surface Satellites) gapless albedo product covering 1981 to 2010 was first released in 2012 and has been widely used in long-term Earth change research. However, only the shortwave albedo product, at spatial resolutions of 0.05 degree and 1 km, was provided, which limits applications requiring visible and near-infrared bands. A new GLASS albedo product has therefore been produced, with comprehensive enhancements to its time coverage, algorithms, and product content. Five major updates were made: 1) the time span is extended from 1981-2010 to 1981-2014; 2) the physically based ART (radiative transfer theory) and TCOWA (Three-Component Ocean Water Albedo) models, rather than the previous RTLSR (Ross-Thick Li-Sparse Reciprocal kernel combination) model, are adopted for snow and inland water albedo estimation, respectively; 3) global shortwave, visible, and near-infrared albedos at spatial resolutions of 0.05 degree and 1 km are released; 4) clear-sky albedo is provided in addition to the traditional black-sky and white-sky albedos for non-expert users; 5) a 250 m albedo product is provided for parts of the globe for regional applications. In this study, we first detail the updates in this product. The product is then compared with the previous GLASS albedo product and preliminarily assessed against field measurements over various land covers. Significant improvements are reported for snow and water albedo. The results demonstrate that the new GLASS albedo product is a gapless, long-term continuous, and self-consistent dataset. Compared to the previous GLASS albedo product, lower black-sky albedo and higher white-sky albedo are found over permanent snow-cover regions. Moreover, higher albedos over inland water and seasonally snow-covered mountains are captured. This product provides new opportunities for understanding long-term Earth processes and change.
Haksari, Ekawaty L; Lafeber, Harrie N; Hakimi, Mohammad; Pawirohartono, Endy P; Nyström, Lennarth
2016-11-21
Birth weight reference curves have been established to identify newborns at risk who are in need of assessment and monitoring. The previous reference curves for Indonesia, from approximately 8 years ago, were based on data collected from teaching hospitals only, with a limited range of gestational ages. The aims of the study were to update the reference curves for birth weight, supine length and head circumference for Indonesia, and to compare birth weight curves of boys and girls, of first and later children, and with those in the previous studies. Data were extracted from the Maternal-Perinatal database between 1998 and 2007. Only live singletons with recorded gestational ages of 26 to 42 weeks and an exact time of admission to the neonatal facilities, delivered at or referred within 24 h of age to Sardjito Hospital, five district hospitals and five health centers in Yogyakarta Special Territory, were included. Newborns with severe illness, congenital anomalies or chromosomal abnormalities were excluded. Smoothing of the curves was accomplished using a third-order polynomial equation. Our study included 54,599 singleton live births. Growth curves were constructed for boys (53.3%) and girls (46.7%) for birth weight, supine length, and head circumference. At term, mean birth weight for each gestational age was significantly higher for boys than for girls. Mean birth weight for each gestational age of first-born children, on the other hand, was significantly lower than that of later-born children. The mean birth weights were lower than those of Lubchenco's study. Compared with the previous Indonesian study by Alisyahbana, no differences were observed for term infants, but lower mean birth weights were observed in preterm infants. Updated neonatal reference curves for birth weight, supine length and head circumference are important for classifying high-risk newborns in a specific area and for identifying newborns requiring attention.
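The curve-smoothing step mentioned above (a third-order polynomial fit of the reference curve against gestational age) can be sketched as follows; the gestational ages and weights are invented placeholders, not the study's data.

```python
# Minimal sketch of the curve-smoothing step described above: fitting a
# third-order polynomial to mean birth weight by gestational age.
# The gestational ages and weights below are made-up placeholders.
import numpy as np

ga_weeks = np.arange(26, 43)                                  # 26-42 weeks
mean_bw = 600 + 110 * (ga_weeks - 26) + np.random.default_rng(1).normal(0, 40, ga_weeks.size)

coeffs = np.polyfit(ga_weeks, mean_bw, deg=3)                 # third-order polynomial
smoothed = np.polyval(coeffs, ga_weeks)                       # smoothed reference curve
for ga, bw in zip(ga_weeks, smoothed):
    print(f"{ga} wk: {bw:.0f} g")
```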
NASA Astrophysics Data System (ADS)
Gurung, H.; Banerjee, A.
2016-02-01
This report presents the development of an extended Kalman filter (EKF) to harness the self-sensing capability of a shape memory alloy (SMA) wire, actuating a linear spring. The stress and temperature of the SMA wire, constituting the state of the system, are estimated using the EKF, from the measured change in electrical resistance (ER) of the SMA. The estimated stress is used to compute the change in length of the spring, eliminating the need for a displacement sensor. The system model used in the EKF comprises the heat balance equation and the constitutive relation of the SMA wire coupled with the force-displacement behavior of a spring. Both explicit and implicit approaches are adopted to evaluate the system model at each time-update step of the EKF. Next, in the measurement-update step, estimated states are updated based on the measured electrical resistance. It has been observed that for the same time step, the implicit approach consumes less computational time than the explicit method. To verify the implementation, EKF-estimated states of the system are compared with those of an established model for different inputs to the SMA wire. An experimental setup is developed to measure the actual spring displacement and ER of the SMA, for any time-varying voltage applied to it. The process noise covariance is decided using a heuristic approach, whereas the measurement noise covariance is obtained experimentally. Finally, the EKF is used to estimate the spring displacement for a given input and the corresponding experimentally obtained ER of the SMA. The qualitative agreement between the EKF-estimated displacement and that obtained experimentally reveals the true potential of this approach to harness the self-sensing capability of the SMA.
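A minimal, generic EKF time-update/measurement-update loop of the kind described above might look like the sketch below; the two-state vector stands in for SMA stress and temperature and the scalar measurement for electrical resistance, while the dynamics, measurement function, Jacobians, and noise levels are placeholder assumptions rather than the paper's model.

```python
# Conceptual EKF sketch, not the paper's model: a two-state filter (standing in
# for SMA stress and temperature) updated from a scalar measurement (standing in
# for electrical resistance). Dynamics f, measurement h, and Jacobians F, H are
# placeholders.
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    # Time update (explicit propagation of the system model)
    x_pred = f(x, u)
    P_pred = F(x, u) @ P @ F(x, u).T + Q
    # Measurement update from the observed quantity z
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R
    K = P_pred @ Hk.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new

# Placeholder linear dynamics and measurement, for demonstration only.
A = np.array([[0.99, 0.01], [0.0, 0.98]])
f = lambda x, u: A @ x + np.array([0.0, 0.1]) * u
F = lambda x, u: A
h = lambda x: np.array([1.2 * x[0] + 0.4 * x[1]])
H = lambda x: np.array([[1.2, 0.4]])

x, P = np.zeros(2), np.eye(2)
x, P = ekf_step(x, P, u=1.0, z=np.array([0.3]), f=f, F=F, h=h, H=H,
                Q=1e-4 * np.eye(2), R=np.array([[1e-3]]))
print(x, np.diag(P))
```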
Evangeliou, Nikolaos; Balkanski, Yves; Cozic, Anne; Møller, Anders Pape
2014-12-01
The present paper studies how a random event (earthquake) and the subsequent disaster in Japan affect transport and deposition of fallout and the resulting health consequences. Therefore, in addition to the original accident in March 2011, three additional scenarios are assessed assuming that the same releases took place in winter 2010, summer 2011 and autumn 2011 in order to cover a full range of annual seasonality. This is also the first study where a large number of fission products released from the accident are used to assess health risks with the maximum possible efficiency. Xenon-133 and (137)Cs are directly estimated within the model, whereas 15 other radionuclides are calculated indirectly using reported isotopic ratios. As much as 85% of the released (137)Cs would be deposited in continental regions worldwide if the accident occurred in winter 2010, 22% in spring 2011 (when it actually happened), 55% in summer 2011 and 48% if it occurred during autumn 2011. Solid cancer incidents and mortalities from Fukushima are estimated to be between 160 and 880 and from 110 to 640, respectively, close to previous estimates. By adding thyroid cancers, the total number rises from 230 to 850 for incidents and from 120 to 650 for mortalities. Fatalities due to worker exposure and mandatory evacuation have been reported to be around 610, increasing total estimated mortalities to 730-1260. These estimates are 2.8 times higher than previously reported ones for radiocaesium and (131)I and 16% higher than those reported based on radiocaesium only. Total expected fatalities from Fukushima are 32% lower than in the winter scenario, 5% lower than in the summer scenario and 30% lower than in the autumn scenario. Nevertheless, cancer fatalities are expected to be less than 5% of those from the tsunami (~20,000). Copyright © 2014 Elsevier B.V. All rights reserved.
A validation of ground ambulance pre-hospital times modeled using geographic information systems.
Patel, Alka B; Waters, Nigel M; Blanchard, Ian E; Doig, Christopher J; Ghali, William A
2012-10-03
Evaluating geographic access to health services often requires determining the patient travel time to a specified service. For urgent care, many research studies have modeled patient pre-hospital time by ground emergency medical services (EMS) using geographic information systems (GIS). The purpose of this study was to determine if the modeling assumptions proposed through prior United States (US) studies are valid in a non-US context, and to use the resulting information to provide revised recommendations for modeling travel time using GIS in the absence of actual EMS trip data. The study sample contained all emergency adult patient trips within the Calgary area for 2006. Each record included four components of pre-hospital time (activation, response, on-scene and transport interval). The actual activation and on-scene intervals were compared with those used in published models. The transport interval was calculated within GIS using the Network Analyst extension of Esri ArcGIS 10.0 and the response interval was derived using previously established methods. These GIS derived transport and response intervals were compared with the actual times using descriptive methods. We used the information acquired through the analysis of the EMS trip data to create an updated model that could be used to estimate travel time in the absence of actual EMS trip records. There were 29,765 complete EMS records for scene locations inside the city and 529 outside. The actual median on-scene intervals were longer than the average previously reported by 7-8 minutes. Actual EMS pre-hospital times across our study area were significantly higher than the estimated times modeled using GIS and the original travel time assumptions. Our updated model, although still underestimating the total pre-hospital time, more accurately represents the true pre-hospital time in our study area. The widespread use of generalized EMS pre-hospital time assumptions based on US data may not be appropriate in a non-US context. The preference for researchers should be to use actual EMS trip records from the proposed research study area. In the absence of EMS trip data researchers should determine which modeling assumptions more accurately reflect the EMS protocols across their study area.
A recursive Bayesian updating model of haptic stiffness perception.
Wu, Bing; Klatzky, Roberta L
2018-06-01
Stiffness of many materials follows Hooke's Law, but the mechanism underlying the haptic perception of stiffness is not as simple as it seems in the physical definition. The present experiments support a model by which stiffness perception is adaptively updated during dynamic interaction. Participants actively explored virtual springs and estimated their stiffness relative to a reference. The stimuli were simulations of linear springs or nonlinear springs created by modulating a linear counterpart with low-amplitude, half-cycle (Experiment 1) or full-cycle (Experiment 2) sinusoidal force. Experiment 1 showed that subjective stiffness increased (decreased) as a linear spring was positively (negatively) modulated by a half-sinewave force. In Experiment 2, an opposite pattern was observed for full-sinewave modulations. Modeling showed that the results were best described by an adaptive process that sequentially and recursively updated an estimate of stiffness using the force and displacement information sampled over trajectory and time. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
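A minimal sketch of recursive Bayesian updating of a stiffness estimate from sequential force/displacement samples, in the spirit of the model described above, is given below; the true stiffness, noise level, and prior are arbitrary assumptions.

```python
# Sketch of recursive Bayesian updating of a stiffness estimate from sequential
# force/displacement samples. True stiffness, noise, and prior are placeholders.
import numpy as np

rng = np.random.default_rng(2)
k_true, noise_sd = 150.0, 2.0           # N/m, force measurement noise (N)
k_hat, var = 100.0, 50.0**2             # prior mean and variance on stiffness

for _ in range(50):
    x = rng.uniform(0.005, 0.02)                    # sampled displacement (m)
    f = k_true * x + rng.normal(0, noise_sd)        # sampled force (N)
    # Scalar Bayesian (Kalman) update for the model f = k * x + noise
    gain = var * x / (x**2 * var + noise_sd**2)
    k_hat = k_hat + gain * (f - k_hat * x)
    var = (1 - gain * x) * var

print(f"posterior stiffness estimate: {k_hat:.1f} N/m")
```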
An updated comprehensive techno-economic analysis of algae biodiesel.
Nagarajan, Sanjay; Chou, Siaw Kiang; Cao, Shenyan; Wu, Chen; Zhou, Zhi
2013-10-01
Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53-0.85/L (2012 USD values). However, the cost of land and transesterification were just roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. Latest process improvement, quotes from vendors, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42-0.97/L. Additional improvements on cost-effective biodiesel production around the globe to cultivate algae was also recommended. Overall, the calculated costs seem promising, suggesting that a single step biodiesel production process is close to commercial reality. Copyright © 2012 Elsevier Ltd. All rights reserved.
iTree-Hydro: Snow hydrology update for the urban forest hydrology model
Yang Yang; Theodore A. Endreny; David J. Nowak
2011-01-01
This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest EffectsâHydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Xiubin; Gao, Yaozong; Shen, Dinggang, E-mail: dgshen@med.unc.edu
2015-05-15
Purpose: In image guided radiation therapy, it is crucial to quickly and accurately localize the prostate in the daily treatment images. To this end, the authors propose an online update scheme for landmark-guided prostate segmentation, which can fully exploit valuable patient-specific information contained in the previous treatment images and can achieve improved performance in landmark detection and prostate segmentation. Methods: To localize the prostate in the daily treatment images, the authors first automatically detect six anatomical landmarks on the prostate boundary by adopting a context-aware landmark detection method. Specifically, in this method, a two-layer regression forest is trained as a detector for each target landmark. Once all the newly detected landmarks from new treatment images are reviewed or adjusted (if necessary) by clinicians, they are further included into the training pool as new patient-specific information to update all the two-layer regression forests for the next treatment day. As more and more treatment images of the current patient are acquired, the two-layer regression forests can be continually updated by incorporating the patient-specific information into the training procedure. After all target landmarks are detected, a multiatlas random sample consensus (multiatlas RANSAC) method is used to segment the entire prostate by fusing multiple previously segmented prostates of the current patient after they are aligned to the current treatment image. Subsequently, the segmented prostate of the current treatment image is again reviewed (or even adjusted if needed) by clinicians before including it as a new shape example into the prostate shape dataset for helping localize the entire prostate in the next treatment image. Results: The experimental results on 330 images of 24 patients show the effectiveness of the authors’ proposed online update scheme in improving the accuracies of both landmark detection and prostate segmentation. Besides, compared to the other state-of-the-art prostate segmentation methods, the authors’ method achieves the best performance. Conclusions: By appropriate use of valuable patient-specific information contained in the previous treatment images, the authors’ proposed online update scheme can obtain satisfactory results for both landmark detection and prostate segmentation.
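The online-update idea (fold clinician-reviewed detections from each treatment day back into the training pool and refit the detector) can be sketched schematically as below; a plain scikit-learn random forest stands in for the paper's two-layer context-aware regression forests, and the features and labels are random placeholders.

```python
# Schematic of the online-update loop only: after each treatment day, reviewed
# landmark positions are added to the training pool and the detector is refit.
# A plain random forest stands in for the paper's two-layer regression forests;
# features and labels are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
train_X = rng.normal(size=(200, 16))      # population image features
train_y = rng.normal(size=(200, 3))       # landmark position targets (x, y, z)

for day in range(1, 4):                   # successive treatment days
    forest = RandomForestRegressor(n_estimators=50, random_state=0)
    forest.fit(train_X, train_y)
    new_X = rng.normal(size=(5, 16))      # features from today's treatment image
    reviewed_y = forest.predict(new_X)    # detections, assumed clinician-reviewed
    train_X = np.vstack([train_X, new_X]) # fold patient-specific data back in
    train_y = np.vstack([train_y, reviewed_y])
```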
Systemic autoimmune rheumatic disease prevalence in Canada: updated analyses across 7 provinces.
Broten, Laurel; Aviña-Zubieta, J Antonio; Lacaille, Diane; Joseph, Lawrence; Hanly, John G; Lix, Lisa; O'Donnell, Siobhan; Barnabe, Cheryl; Fortin, Paul R; Hudson, Marie; Jean, Sonia; Peschken, Christine; Edworthy, Steven M; Svenson, Larry; Pineau, Christian A; Clarke, Ann E; Smith, Mark; Bélisle, Patrick; Badley, Elizabeth M; Bergeron, Louise; Bernatsky, Sasha
2014-04-01
To estimate systemic autoimmune rheumatic disease (SARD) prevalence across 7 Canadian provinces using population-based administrative data evaluating both regional variations and the effects of age and sex. Using provincial physician billing and hospitalization data, cases of SARD (systemic lupus erythematosus, scleroderma, primary Sjögren syndrome, polymyositis/dermatomyositis) were ascertained. Three case definitions (rheumatology billing, 2-code physician billing, and hospital diagnosis) were combined to derive a SARD prevalence estimate for each province, categorized by age, sex, and rural/urban status. A hierarchical Bayesian latent class regression model was fit to account for the imperfect sensitivity and specificity of each case definition. The model also provided sensitivity estimates of different case definition approaches. Prevalence estimates for overall SARD ranged between 2 and 5 cases per 1000 residents across provinces. Similar demographic trends were evident across provinces, with greater prevalence in women and in persons over 45 years old. SARD prevalence in women over 45 was close to 1%. Overall sensitivity was poor, but estimates for each of the 3 case definitions improved within older populations and were slightly higher for men compared to women. Our results are consistent with previous estimates and other North American findings, and provide results from coast to coast, as well as useful information about the degree of regional and demographic variations that can be seen within a single country. Our work demonstrates the usefulness of using multiple data sources, adjusting for the error in each, and providing estimates of the sensitivity of different case definition approaches.
Cost of services provided by the National Breast and Cervical Cancer Early Detection Program.
Ekwueme, Donatus U; Subramanian, Sujha; Trogdon, Justin G; Miller, Jacqueline W; Royalty, Janet E; Li, Chunyu; Guy, Gery P; Crouse, Wesley; Thompson, Hope; Gardner, James G
2014-08-15
The National Breast and Cervical Cancer Early Detection Program (NBCCEDP) is the largest cancer screening program for low-income women in the United States. This study updates previous estimates of the costs of delivering preventive cancer screening services in the NBCCEDP. We developed a standardized web-based cost-assessment tool to collect annual activity-based cost data on screening for breast and cervical cancer in the NBCCEDP. Data were collected from 63 of the 66 programs that received funding from the Centers for Disease Control and Prevention during the 2006/2007 fiscal year. We used these data to calculate costs of delivering preventive public health services in the program. We estimated the total cost of all NBCCEDP services to be $296 (standard deviation [SD], $123) per woman served (including the estimated value of in-kind donations, which constituted approximately 15% of this total estimated cost). The estimated cost of screening and diagnostic services was $145 (SD, $38) per woman served, which represented 57.7% of the total cost excluding the value of in-kind donations. Including the value of in-kind donations, the weighted mean cost of screening a woman for breast cancer was $110 with an office visit and $88 without, the weighted mean cost of a diagnostic procedure was $401, and the weighted mean cost per breast cancer detected was $35,480. For cervical cancer, the corresponding cost estimates were $61, $21, $415, and $18,995, respectively. These NBCCEDP cost estimates may help policy makers plan for and estimate future costs of various potential changes to the program. © 2014 American Cancer Society.
Atlantis, Evan; Cheema, Birinder S
2015-03-01
Audience response system (ARS) technology is a recent innovation that is increasingly being used by health educators to improve learning outcomes. Equivocal results from previous systematic review research provide weak support for the use of ARS for improving learning outcomes in both the short and long term. This review sought to update and critically review the body of controlled experimental evidence on the effects of ARS technology on learning outcomes in health students and professionals. Using all identified keywords, this review searched electronic databases (CINAHL, Embase, ERIC, Medline, Science Direct, Scopus, and Web of Science) and the reference lists of retrieved articles to find relevant studies published from 2010 to April 2014. A descriptive synthesis of important study characteristics and effect estimates for learning outcomes was performed. Three controlled trials in 321 participants from the United States were included for review. ARS knowledge retention scores were lower than those of the control group in one study, higher than those of the control group (provided that immediate feedback was given about each question) in one study, and equivalent between intervention and control groups in another study. There is an absence of good quality evidence on the effectiveness of ARS technologies for improving learning outcomes in health students and professionals.
Model-free and model-based reward prediction errors in EEG.
Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy
2018-05-24
Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
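A toy model-free reward prediction error of the kind discussed above can be written in a few lines; the bandit task, learning rate, and reward probabilities below are arbitrary assumptions used only to show the update rule.

```python
# Toy illustration of a model-free reward prediction error: an action value is
# nudged by the difference between received and expected reward. Task, learning
# rate, and reward probabilities are arbitrary.
import numpy as np

rng = np.random.default_rng(4)
q = np.zeros(2)                 # action values for a two-armed bandit
alpha = 0.1                     # learning rate
p_reward = [0.8, 0.2]           # true reward probabilities

for _ in range(200):
    a = int(rng.random() < 0.5)                 # random action selection
    r = float(rng.random() < p_reward[a])       # sampled reward
    rpe = r - q[a]                              # model-free reward prediction error
    q[a] += alpha * rpe                         # value update

print(q.round(2))               # approaches the true reward probabilities
```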
Revised spatially distributed global livestock emissions
NASA Astrophysics Data System (ADS)
Asrar, G.; Wolf, J.; West, T. O.
2015-12-01
Livestock play an important role in agricultural carbon cycling through consumption of biomass and emissions of methane. Quantification and spatial distribution of methane and carbon dioxide produced by livestock is needed to develop bottom-up estimates for carbon monitoring. These estimates serve as stand-alone international emissions estimates, as input to global emissions modeling, and as comparisons or constraints to flux estimates from atmospheric inversion models. Recent results for the US suggest that the 2006 IPCC default coefficients may underestimate livestock methane emissions. In this project, revised coefficients were calculated for cattle and swine in all global regions, based on reported changes in body mass, quality and quantity of feed, milk production, and management of living animals and manure for these regions. New estimates of livestock methane and carbon dioxide emissions were calculated using the revised coefficients and global livestock population data. Spatial distribution of population data and associated fluxes was conducted using the MODIS Land Cover Type 5, version 5.1 (i.e. MCD12Q1 data product), and a previously published downscaling algorithm for reconciling inventory and satellite-based land cover data at 0.05 degree resolution. Preliminary results for 2013 indicate greater emissions than those calculated using the IPCC 2006 coefficients. Global total enteric fermentation methane increased by 6%, while manure management methane increased by 38%, with variation among species and regions resulting in improved spatial distributions of livestock emissions. These new estimates of total livestock methane are comparable to other recently reported studies for the entire US and the State of California. These new regional/global estimates will improve the ability to reconcile top-down and bottom-up estimates of methane production as well as provide updated global estimates for use in development and evaluation of Earth system models.
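The bottom-up accounting behind such estimates reduces to head counts multiplied by per-head emission coefficients, summed over species and regions, as in the sketch below; the populations and coefficients shown are illustrative placeholders, not the study's revised values.

```python
# Back-of-the-envelope sketch of bottom-up livestock emissions accounting:
# emissions = head count x per-head emission coefficient, summed over species
# and regions. All numbers are illustrative placeholders.
populations = {                     # head counts by (region, species)
    ("north_america", "cattle"): 95e6,
    ("north_america", "swine"): 75e6,
}
ch4_kg_per_head_yr = {              # enteric + manure methane coefficients (kg/head/yr)
    ("north_america", "cattle"): 70.0,
    ("north_america", "swine"): 12.0,
}

total_ch4_mt = sum(populations[k] * ch4_kg_per_head_yr[k] for k in populations) / 1e9
print(f"total CH4: {total_ch4_mt:.2f} Mt/yr")
```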
Saha, Dibakar; Alluri, Priyanka; Gan, Albert
2017-01-01
The Highway Safety Manual (HSM) presents statistical models to quantitatively estimate an agency's safety performance. The models were developed using data from only a few U.S. states. To account for the effects of the local attributes and temporal factors on crash occurrence, agencies are required to calibrate the HSM-default models for crash predictions. The manual suggests updating calibration factors every two to three years, or preferably on an annual basis. Given that the calibration process involves substantial time, effort, and resources, a comprehensive analysis of the required calibration factor update frequency is valuable to the agencies. Accordingly, the objective of this study is to evaluate the HSM's recommendation and determine the required frequency of calibration factor updates. A robust Bayesian estimation procedure is used to assess the variation between calibration factors computed annually, biennially, and triennially using data collected from over 2400 miles of segments and over 700 intersections on urban and suburban facilities in Florida. The Bayesian model yields a posterior distribution of the model parameters that provides credible information for inferring whether the difference between calibration factors computed at specified intervals is credibly different from the null value, which represents unaltered calibration factors between the comparison years, or in other words, a zero difference. The concept of the null value is extended to include the range of values that are practically equivalent to zero. Bayesian inference shows that calibration factors based on total crash frequency are required to be updated every two years in cases where the variations between calibration factors are not greater than 0.01. When the variations are between 0.01 and 0.05, calibration factors based on total crash frequency could be updated every three years. Copyright © 2016 Elsevier Ltd. All rights reserved.
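For context, the HSM-style calibration factor that the study re-estimates at different intervals is simply the ratio of observed to SPF-predicted crashes over the calibration sites, as in this sketch with invented counts.

```python
# Sketch of an HSM-style calibration factor:
# C = sum(observed crashes) / sum(SPF-predicted crashes) over calibration sites.
# The counts below are invented.
observed = [4, 2, 7, 1, 3, 5]               # crashes observed at each site
predicted = [3.1, 2.4, 5.8, 1.6, 2.9, 4.7]  # safety-performance-function predictions

calibration_factor = sum(observed) / sum(predicted)
print(f"C = {calibration_factor:.3f}")       # >1: more crashes than the default model predicts
```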
Kennedy, Paula L; Woodbury, Allan D
2002-01-01
In ground water flow and transport modeling, the heterogeneous nature of porous media has a considerable effect on the resulting flow and solute transport. Some method of generating the heterogeneous field from a limited dataset of uncertain measurements is required. Bayesian updating is one method that interpolates from an uncertain dataset using the statistics of the underlying probability distribution function. In this paper, Bayesian updating was used to determine the heterogeneous natural log transmissivity field for a carbonate and a sandstone aquifer in southern Manitoba. It was determined that the transmissivity in m2/sec followed a natural log normal distribution for both aquifers, with means of -7.2 and -8.0 for the carbonate and sandstone aquifers, respectively. The variograms were calculated using an estimator developed by Li and Lake (1994). Fractal nature was not evident in the variogram from either aquifer. The Bayesian updating heterogeneous field provided good results even in cases where little data were available. A large transmissivity zone in the sandstone aquifer was created by the Bayesian procedure, which is not a reflection of any deterministic consideration, but is a natural outcome of updating a prior probability distribution function with observations. The statistical model returns a result that is very reasonable; that is, it is homogeneous in regions where little or no information is available to alter the initial state. No long-range correlation trends or fractal behavior of the log-transmissivity field was observed in either aquifer over a distance of about 300 km.
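A simplified sketch of Bayesian updating of a Gaussian log-transmissivity field is shown below, conditioning a prior mean and covariance on a few observations in simple-kriging form; the exponential covariance model, distances, and observed values are illustrative assumptions, not the paper's variogram or data.

```python
# Simplified sketch of Bayesian updating of a Gaussian log-transmissivity field:
# prior mean/covariance conditioned on a few observations (simple kriging form).
# Covariance model and data are illustrative only.
import numpy as np

def exp_cov(d, sill=1.0, rng_km=50.0):
    return sill * np.exp(-d / rng_km)

obs_xy = np.array([[0.0, 0.0], [40.0, 10.0], [80.0, 60.0]])   # observation locations (km)
obs_lnT = np.array([-7.5, -6.9, -7.8])                        # observed ln(T), T in m2/s
target = np.array([30.0, 30.0])                               # unsampled location
prior_mean = -7.2                                             # prior mean ln(T)

C = exp_cov(np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=-1))
c0 = exp_cov(np.linalg.norm(obs_xy - target, axis=1))
w = np.linalg.solve(C, c0)

post_mean = prior_mean + w @ (obs_lnT - prior_mean)
post_var = exp_cov(0.0) - w @ c0
print(f"ln(T) at target: {post_mean:.2f} +/- {np.sqrt(post_var):.2f}")
```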
Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.
The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. Additionally, PNNL supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.
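Purely as an illustration of the accounting style such a model embodies (not BEAMS itself or its algorithms), the sketch below accumulates annual energy savings as units affected times per-unit savings, with a single factor standing in for lighting/HVAC interactive effects; every number is a placeholder.

```python
# Illustrative accounting sketch only (not BEAMS): annual energy savings
# accumulated as units-affected x per-unit savings, with a simple factor
# standing in for lighting/HVAC interactive effects. All numbers are placeholders.
years = range(2025, 2030)
units_adopted_per_year = 10_000            # e.g., efficient lighting retrofits per year
savings_per_unit_mmbtu = 3.5               # site energy saved per unit per year
hvac_interaction_factor = 0.9              # heating penalty partially offsets lighting savings

stock = 0
for year in years:
    stock += units_adopted_per_year        # cumulative installed base
    annual_savings = stock * savings_per_unit_mmbtu * hvac_interaction_factor
    print(year, f"{annual_savings:,.0f} MMBtu saved")
```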
DOT National Transportation Integrated Search
2001-07-01
This working paper has been prepared to provide new estimates of the costs to deploy Intelligent Transportation System (ITS) infrastructure elements in the largest metropolitan areas in the United States. It builds upon estimates that were distribute...
DOT National Transportation Integrated Search
2000-08-01
This working paper has been prepared to provide new estimates of the costs to deploy Intelligent Transportation System (ITS) infrastructure elements in the largest metropolitan areas in the United States. It builds upon estimates that were distribute...
Marino, Alexandria C.; Chun, Marvin M.
2011-01-01
During natural vision, eye movements can drastically alter the retinotopic (eye-centered) coordinates of locations and objects, yet the spatiotopic (world-centered) percept remains stable. Maintaining visuospatial attention in spatiotopic coordinates requires updating of attentional representations following each eye movement. However, this updating is not instantaneous; attentional facilitation temporarily lingers at the previous retinotopic location after a saccade, a phenomenon known as the retinotopic attentional trace. At various times after a saccade, we probed attention at an intermediate location between the retinotopic and spatiotopic locations to determine whether a single locus of attentional facilitation slides progressively from the previous retinotopic location to the appropriate spatiotopic location, or whether retinotopic facilitation decays while a new, independent spatiotopic locus concurrently becomes active. Facilitation at the intermediate location was not significant at any time, suggesting that top-down attention can result in enhancement of discrete retinotopic and spatiotopic locations without passing through intermediate locations. PMID:21258903
Himmelstoss, Emily A.; Kratzmann, Meredith G.; Thieler, E. Robert
2017-07-18
Long-term rates of shoreline change for the Gulf of Mexico and Southeast Atlantic regions of the United States have been updated as part of the U.S. Geological Survey’s National Assessment of Shoreline Change project. Additional shoreline position data were used to compute rates where the previous rate-of-change assessment only included four shoreline positions at a given location. The long-term shoreline change rates also incorporate the proxy-datum bias correction to account for the unidirectional onshore bias of the proxy-based high water line shorelines relative to the datum-based mean high water shorelines. The calculation of uncertainty associated with the long-term average rates has also been updated to match refined methods used in other study regions of the National Assessment project. The average rates reported here have a reduced amount of uncertainty relative to those presented in the previous assessments for these two regions.
French, Michael T.; Popovici, Ioana; Tapsell, Lauren
2008-01-01
Federal, State, and local government agencies require current and accurate cost information for publicly funded substance abuse treatment programs to guide program assessments and reimbursement decisions. The Center for Substance Abuse Treatment (CSAT) published a list of modality-specific cost bands for this purpose in 2002. However, the upper and lower values in these ranges are so wide that they offer little practical guidance for funding agencies. Thus, the dual purpose of this investigation was to assemble the most current and comprehensive set of economic cost estimates from the readily-available literature and then use these estimates to develop updated modality-specific cost bands for more reasonable reimbursement policies. Although cost estimates were scant for some modalities, the recommended cost bands are based on the best available economic research, and we believe these new ranges will be more useful and pertinent for all stakeholders of publicly-funded substance abuse treatment. PMID:18294803
Aerosol–climate interactions in the Norwegian Earth System Model – NorESM1-M
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirkevåg, A.; Iversen, T.; Seland, Ø.
2013-01-01
The objective of this study is to document and evaluate recent changes and updates to the module for aerosols and aerosol–cloud–radiation interactions in the atmospheric module CAM4-Oslo of the core version of the Norwegian Earth System Model (NorESM), NorESM1-M. Particular attention is paid to the role of natural organics, sea salt, and mineral dust in determining the gross aerosol properties as well as the anthropogenic contribution to these properties and the associated direct and indirect radiative forcing. The aerosol module is extended from earlier versions that have been published, and includes life-cycling of sea salt, mineral dust, particulate sulphate, black carbon, and primary and secondary organics. The impacts of most of the numerous changes since previous versions are thoroughly explored by sensitivity experiments. The most important changes are: modified prognostic sea salt emissions; updated treatment of precipitation scavenging and gravitational settling; inclusion of biogenic primary organics and methane sulphonic acid (MSA) from oceans; almost doubled production of land-based biogenic secondary organic aerosols (SOA); and increased ratio of organic matter to organic carbon (OM/OC) for biomass burning aerosols from 1.4 to 2.6. Compared with in situ measurements and remotely sensed data, the new treatments of sea salt and dust aerosols give smaller biases in near-surface mass concentrations and aerosol optical depth than in the earlier model version. The model biases for mass concentrations are approximately unchanged for sulphate and BC. The enhanced levels of modeled OM yield improved overall statistics, even though OM is still underestimated in Europe and overestimated in North America. The global anthropogenic aerosol direct radiative forcing (DRF) at the top of the atmosphere has changed from a small positive value to -0.08 W m-2 in CAM4-Oslo. The sensitivity tests suggest that this change can be attributed to the new treatment of biomass burning aerosols and gravitational settling. Although it has not been a goal in this study, the new DRF estimate is closer both to the median model estimate from the AeroCom intercomparison and the best estimate in IPCC AR4. Estimated DRF at the ground surface has increased by ca. 60%, to -1.89 W m-2. We show that this can be explained by new emission data and omitted mixing of constituents between updrafts and downdrafts in convective clouds. The increased abundance of natural OM and the introduction of a cloud droplet spectral dispersion formulation are the most important contributions to a considerably decreased estimate of the indirect radiative forcing (IndRF). The IndRF is also found to be sensitive to assumptions about the coating of insoluble aerosols by sulphate and OM. The IndRF of -1.2 W m-2, which is closer to the IPCC AR4 estimates than the previous estimate of -1.9 W m-2, has thus been obtained without imposing unrealistic artificial lower bounds on cloud droplet number concentrations.
ERIC Educational Resources Information Center
White, Margaret
2010-01-01
In March of each year, the ministry publishes the Operating Grants Manual showing estimated funding allocations for school districts for the upcoming school year. These estimates are based on enrolment projections. On September 30 of the new school year, enrolment is counted and the grants are recalculated based on actual enrolment. The ministry…
Updating visual memory across eye movements for ocular and arm motor control.
Thompson, Aidan A; Henriques, Denise Y P
2008-11-01
Remembered object locations are stored in an eye-fixed reference frame, so that every time the eyes move, spatial representations must be updated for the arm-motor system to reflect the target's new relative position. To date, studies have not investigated how the brain updates these spatial representations during other types of eye movements, such as smooth-pursuit. Further, it is unclear what information is used in spatial updating. To address these questions we investigated whether remembered locations of pointing targets are updated following smooth-pursuit eye movements, as they are following saccades, and also investigated the role of visual information in estimating eye-movement amplitude for updating spatial memory. Misestimates of eye-movement amplitude were induced when participants visually tracked stimuli presented with a background that moved in either the same or opposite direction of the eye before pointing or looking back to the remembered target location. We found that gaze-dependent pointing errors were similar following saccades and smooth-pursuit and that incongruent background motion did result in a misestimate of eye-movement amplitude. However, the background motion had no effect on spatial updating for pointing, but did when subjects made a return saccade, suggesting that the oculomotor and arm-motor systems may rely on different sources of information for spatial updating.
Medendorp, W. P.
2015-01-01
It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability in which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than based on single-frame updating mechanisms. PMID:26490289
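The reliability-weighted combination described above amounts to an inverse-variance weighting of the two updated estimates, as in this minimal sketch with arbitrary example values.

```python
# Minimal sketch of reliability-weighted (inverse-variance) combination of two
# updated location estimates. Estimates and variances are arbitrary examples.
eye_est, eye_var = 4.2, 1.0      # degrees; eye-centered estimate and its variance
body_est, body_var = 5.0, 2.5    # body-centered estimate and its variance

w_eye = (1 / eye_var) / (1 / eye_var + 1 / body_var)
combined = w_eye * eye_est + (1 - w_eye) * body_est
combined_var = 1 / (1 / eye_var + 1 / body_var)   # never larger than either input variance
print(f"combined: {combined:.2f} deg, variance {combined_var:.2f}")
```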
NASA Astrophysics Data System (ADS)
Emter, Thomas; Petereit, Janko
2014-05-01
An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while, concurrently, a localization within the 2D map established so far is estimated from the current LIDAR scan. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
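A schematic of asynchronous fusion of this kind (not the authors' framework) is sketched below for a 1-D case: high-rate wheel-odometry predictions are corrected by lower-rate GPS measurements in a simple Kalman filter; the rates and noise values are invented.

```python
# Schematic of asynchronous sensor fusion (not the authors' framework): high-rate
# wheel-odometry prediction with lower-rate GPS corrections in a 1-D Kalman
# filter. All rates and noise values are invented.
import numpy as np

x, P = 0.0, 1.0                       # position estimate and its variance
q_odo, r_gps = 0.05, 2.0              # odometry process noise, GPS measurement noise
rng = np.random.default_rng(5)
true_x = 0.0

for step in range(100):               # 100 Hz odometry loop, dt = 0.01 s
    v = 1.0                           # commanded velocity (m/s)
    true_x += v * 0.01
    x, P = x + v * 0.01, P + q_odo    # prediction from odometry
    if step % 20 == 0:                # 5 Hz GPS correction
        z = true_x + rng.normal(0, np.sqrt(r_gps))
        k = P / (P + r_gps)
        x, P = x + k * (z - x), (1 - k) * P

print(f"estimate {x:.2f} m, truth {true_x:.2f} m")
```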
APOSTLE: 11 TRANSIT OBSERVATIONS OF TrES-3b
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kundurthy, P.; Becker, A. C.; Agol, E.
2013-02-10
The Apache Point Survey of Transit Lightcurves of Exoplanets (APOSTLE) observed 11 transits of TrES-3b over two years in order to constrain system parameters and look for transit timing and depth variations. We describe an updated analysis protocol for APOSTLE data, including the reduction pipeline, transit model, and Markov Chain Monte Carlo analyzer. Our estimates of the system parameters for TrES-3b are consistent with previous estimates to within the 2σ confidence level. We improved the errors (by 10%-30%) on system parameters such as the orbital inclination (i_orb), impact parameter (b), and stellar density (ρ_*) compared to previous measurements. The near-grazing nature of the system, and incomplete sampling of some transits, limited our ability to place reliable uncertainties on individual transit depths and hence we do not report strong evidence for variability. Our analysis of the transit timing data shows no evidence for transit timing variations and our timing measurements are able to rule out super-Earth and gas giant companions in low-order mean motion resonance with TrES-3b.
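As a sketch of the kind of Markov Chain Monte Carlo parameter estimation such a pipeline performs, the toy Metropolis sampler below fits a single transit-depth parameter to synthetic in-transit flux; the real analysis fits a full transit model, and all values here are assumptions.

```python
# Toy Metropolis sampler illustrating MCMC parameter estimation: a single
# transit-depth parameter is fit to synthetic in-transit flux. The real analysis
# fits a full transit model; all values here are placeholders.
import numpy as np

rng = np.random.default_rng(6)
true_depth, sigma = 0.025, 0.002
flux = 1.0 - true_depth + rng.normal(0, sigma, 200)   # synthetic in-transit flux

def log_like(depth):
    return -0.5 * np.sum((flux - (1.0 - depth)) ** 2) / sigma**2

depth, samples = 0.02, []
for _ in range(5000):
    prop = depth + rng.normal(0, 0.001)               # symmetric random-walk proposal
    if np.log(rng.random()) < log_like(prop) - log_like(depth):
        depth = prop                                  # accept
    samples.append(depth)

print(f"depth = {np.mean(samples[1000:]):.4f} +/- {np.std(samples[1000:]):.4f}")
```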
Heuristic reusable dynamic programming: efficient updates of local sequence alignment.
Hong, Changjin; Tewfik, Ahmed H
2009-01-01
Recomputation of previously evaluated similarity results between biological sequences becomes inevitable when researchers find errors in their sequenced data or have to compare nearly similar sequences, e.g., in a family of proteins. We present an efficient scheme for updating local sequence alignments with an affine gap model. In principle, using the previous matching result between two amino acid sequences, we perform a forward-backward alignment to generate heuristic searching bands, which are bounded by a set of suboptimal paths. Given a correctly updated sequence, we initially predict a new score of the alignment path for each contour to select the best candidates among them. Then, we run the Smith-Waterman algorithm in this confined space. Furthermore, our heuristic alignment for an updated sequence shows that it can be further accelerated by using reusable dynamic programming (rDP), our prior work. In this study, we successfully validate the "relative node tolerance bound" (RNTB) in the pruned searching space. Furthermore, we improve the computational performance by quantifying the successful RNTB tolerance probability and switching to rDP on perturbation-resilient columns only. In our searching space, derived using a threshold of 90 percent of the optimal alignment score, we find that 98.3 percent of contours contain correctly updated paths. We also find that our method consumes only 25.36 percent of the runtime cost of the sparse dynamic programming (sDP) method, and only 2.55 percent of that of normal dynamic programming with the Smith-Waterman algorithm.
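For reference, the core Smith-Waterman recurrence that the heuristic banding confines can be written as below; a linear gap penalty is used for brevity, whereas the paper's scheme uses an affine gap model and restricts the computation to the heuristic search bands.

```python
# Basic Smith-Waterman local alignment score matrix (linear gap penalty for
# brevity; the paper's scheme uses an affine gap model and heuristic banding on
# top of this core recurrence). Scoring values are arbitrary.
import numpy as np

def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    H = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i, j] = max(0, diag, H[i - 1, j] + gap, H[i, j - 1] + gap)
    return H.max()        # best local alignment score

print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))
```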