Sample records for national average results

  1. Comparing State and District Test Results to National Norms: Interpretations of Scoring "Above the National Average."

    ERIC Educational Resources Information Center

    Linn, Robert L.; And Others

    Norm-referenced test results reported by states and school districts and factors related to those scores were studied through mail and telephone surveys of 35 states and a nationally representative sample of 153 school districts to determine the degree to which "above average" results were being reported. Part of the stimulus for this…

  2. 7 CFR 760.640 - National average market price.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false National average market price. 760.640 Section 760.640....640 National average market price. (a) The Deputy Administrator will establish the National Average Market Price (NAMP) using the best sources available, as determined by the Deputy Administrator, which...

  3. Update of Ireland's national average indoor radon concentration - Application of a new survey protocol.

    PubMed

    Dowdall, A; Murphy, P; Pollard, D; Fenton, D

    2017-04-01

    In 2002, a National Radon Survey (NRS) in Ireland established that the geographically weighted national average indoor radon concentration was 89 Bq m⁻³. Since then a number of developments have taken place which are likely to have impacted on the national average radon level. Key among these was the introduction of amending Building Regulations in 1998 requiring radon preventive measures in new buildings in High Radon Areas (HRAs). In 2014, the Irish Government adopted the National Radon Control Strategy (NRCS) for Ireland. A knowledge gap identified in the NRCS was to update the national average for Ireland given the developments since 2002. The updated national average would also be used as a baseline metric to assess the effectiveness of the NRCS over time. A new national survey protocol was required that would measure radon in a sample of homes representative of radon risk and geographical location. The design of the survey protocol took into account that it is not feasible to repeat the 11,319 measurements carried out for the 2002 NRS due to time and resource constraints. However, the existence of that comprehensive survey allowed for a new protocol to be developed, involving measurements carried out in an unbiased random selection of volunteer homes. This paper sets out the development and application of that survey protocol. The results of the 2015 survey showed that the current national average indoor radon concentration for homes in Ireland is 77 Bq m⁻³, a decrease from the 89 Bq m⁻³ reported in the 2002 NRS. Analysis of the results by build date demonstrates that the introduction of the amending Building Regulations in 1998 has led to a reduction in the average indoor radon level in Ireland. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. 42 CFR 423.279 - National average monthly bid amount.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... bid amounts for each prescription drug plan (not including fallbacks) and for each MA-PD plan...(h) of the Act. (b) Calculation of weighted average. (1) The national average monthly bid amount is a....258(c)(1) of this chapter) and the denominator equal to the total number of Part D eligible...

  5. 42 CFR 423.279 - National average monthly bid amount.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... bid amounts for each prescription drug plan (not including fallbacks) and for each MA-PD plan...(h) of the Act. (b) Calculation of weighted average. (1) The national average monthly bid amount is a....258(c)(1) of this chapter) and the denominator equal to the total number of Part D eligible...

  6. 42 CFR 423.279 - National average monthly bid amount.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... each MA-PD plan described in section 1851(a)(2)(A)(i) of the Act. The calculation does not include bids... section 1876(h) of the Act. (b) Calculation of weighted average. (1) The national average monthly bid... defined in § 422.258(c)(1) of this chapter) and the denominator equal to the total number of Part D...

  7. 42 CFR 423.279 - National average monthly bid amount.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... each MA-PD plan described in section 1851(a)(2)(A)(i) of the Act. The calculation does not include bids... section 1876(h) of the Act. (b) Calculation of weighted average. (1) The national average monthly bid... defined in § 422.258(c)(1) of this chapter) and the denominator equal to the total number of Part D...

  8. 42 CFR 423.279 - National average monthly bid amount.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... each MA-PD plan described in section 1851(a)(2)(A)(i) of the Act. The calculation does not include bids... section 1876(h) of the Act. (b) Calculation of weighted average. (1) The national average monthly bid... defined in § 422.258(c)(1) of this chapter) and the denominator equal to the total number of Part D...

  9. 47 CFR 36.622 - National and study area average unseparated loop costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... companies which did not make an update filing by the most recent filing date. (b) Study Area Average... 47 Telecommunication 2 2011-10-01 2011-10-01 false National and study area average unseparated... Universal Service Fund Calculation of Loop Costs for Expense Adjustment § 36.622 National and study area...

  10. 47 CFR 36.622 - National and study area average unseparated loop costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... companies which did not make an update filing by the most recent filing date. (b) Study Area Average... 47 Telecommunication 2 2010-10-01 2010-10-01 false National and study area average unseparated... Universal Service Fund Calculation of Loop Costs for Expense Adjustment § 36.622 National and study area...

  11. Mental health care and average happiness: strong effect in developed nations.

    PubMed

    Touburg, Giorgio; Veenhoven, Ruut

    2015-07-01

    Mental disorder is a main cause of unhappiness in modern society and investment in mental health care is therefore likely to add to average happiness. This prediction was checked in a comparison of 143 nations around 2005. Absolute investment in mental health care was measured using the per capita number of psychiatrists and psychologists working in mental health care. Relative investment was measured using the share of mental health care in the total health budget. Average happiness in nations was measured with responses to survey questions about life-satisfaction. Average happiness appeared to be higher in countries that invest more in mental health care, both absolutely and relative to investment in somatic medicine. A data split by level of development shows that this difference exists only among developed nations. Among these nations the link between mental health care and happiness is quite strong, both in an absolute sense and compared to other known societal determinants of happiness. The correlation between happiness and share of mental health care in the total health budget is twice as strong as the correlation between happiness and size of the health budget. A causal effect is likely, but cannot be proved in this cross-sectional analysis.

  12. 23 CFR Appendix D to Part 1240 - Determination of National Average Seat Belt Use Rate

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Determination of National Average Seat Belt Use Rate D Appendix D to Part 1240 Highways NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION AND FEDERAL HIGHWAY... BASED ON SEAT BELT USE RATES Pt. 1240, App. D Appendix D to Part 1240—Determination of National Average...

  13. National Highway Traffic Safety Administration Corporate Average Fuel Economy (CAFE) Standards

    DOT National Transportation Integrated Search

    2003-01-01

    The National Highway Traffic Safety Administration (NHTSA) must set Corporate Average Fuel Economy (CAFE) standards for light trucks. This was authorized by the Energy Policy and Conservation Act, which added Title V: Improving Automotive Fuel Effici...

  14. 78 FR 45178 - National School Lunch, Special Milk, and School Breakfast Programs, National Average Payments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... ``national average payments,'' the amount of money the Federal Government provides States for lunches... institutions with pricing programs that elect to serve milk free to eligible children continue to receive the... during the second preceding school year were served free or at a reduced price. The higher payment level...

  15. 76 FR 43256 - National School Lunch, Special Milk, and School Breakfast Programs, National Average Payments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... ``national average payments,'' the amount of money the Federal Government provides States for lunches... institutions with pricing programs that elect to serve milk free to eligible children continue to receive the... during the second preceding school year were served free or at a reduced price. The higher payment level...

  16. 77 FR 43232 - National School Lunch, Special Milk, and School Breakfast Programs, National Average Payments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... ``national average payments,'' the amount of money the Federal Government provides States for lunches... with pricing programs that elect to serve milk free to eligible children continue to receive the... during the second preceding school year were served free or at a reduced price. The higher payment level...

  17. 78 FR 956 - National Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-07

    ... Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health Insurance Policy The... average cost of a health insurance policy as it relates to the National Vaccine Injury Compensation... revised amounts of an average cost of a health insurance policy, as determined by the Secretary, are to be...

  18. 75 FR 2551 - National Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health Insurance Policy The... average cost of a health insurance policy as it relates to the National Vaccine Injury Compensation... revised amounts of an average cost of a health insurance policy, as determined by the Secretary, are to be...

  19. 77 FR 801 - National Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-06

    ... Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health Insurance Policy The... average cost of a health insurance policy as it relates to the National Vaccine Injury Compensation... revised amounts of an average cost of a health insurance policy, as determined by the Secretary, are to be...

  20. 76 FR 5180 - National Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-28

    ... Vaccine Injury Compensation Program: Revised Amount of the Average Cost of a Health Insurance Policy The... average cost of a health insurance policy as it relates to the National Vaccine Injury Compensation... revised amounts of an average cost of a health insurance policy, as determined by the Secretary, are to be...

  1. Alternatives to national average income data as eligibility criteria for international subsidies: a social justice perspective.

    PubMed

    Shebaya, Sirine; Sutherland, Andrea; Levine, Orin; Faden, Ruth

    2010-12-01

    Current strategies to address global inequities in access to life-saving vaccines use averaged national income data to determine eligibility. While largely successful in the lowest income countries, we argue that this approach could lead to significant inefficiencies from the standpoint of justice if applied to middle-income countries, where income inequalities are large and lead to national averages that obscure truly needy populations. Instead, we suggest alternative indicators more sensitive to social justice concerns that merit consideration by policy-makers developing new initiatives to redress health inequities in middle-income countries. © 2009 Blackwell Publishing Ltd.

  2. Using National Data to Estimate Average Cost Effectiveness of EFNEP Outcomes by State/Territory

    ERIC Educational Resources Information Center

    Baral, Ranju; Davis, George C.; Blake, Stephanie; You, Wen; Serrano, Elena

    2013-01-01

    This report demonstrates how existing national data can be used to first calculate upper limits on the average cost per participant and per outcome per state/territory for the Expanded Food and Nutrition Education Program (EFNEP). These upper limits can then be used by state EFNEP administrators to obtain more precise estimates for their states,…

  3. Beyond Horse Race Comparisons of National Performance Averages: Math Performance Variation within and between Classrooms in 38 Countries

    ERIC Educational Resources Information Center

    Huang, Min-Hsiung

    2009-01-01

    Reports of international studies of student achievement often receive public attention worldwide. However, this attention overly focuses on the national rankings of average student performance. To move beyond the simplistic comparison of national mean scores, this study investigates (a) country differences in the measures of variability as well as…

  4. National Assessment of Educational Progress. 1969-1970 Writing: National Results.

    ERIC Educational Resources Information Center

    Norris, Eleanor L.; And Others

    National results for writing, one of the three subject areas assessed during the first year of data collection by National Assessment, are presented in this volume. National results for the other two subjects, science and citizenship, are presented in separate volumes. The purpose of this project is to explore whether an assessment of educational…

  5. 76 FR 44573 - Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-26

    ... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service Payment Rates, and Administrative Reimbursement Rates for Sponsoring Organizations of Day Care Homes for the Period July 1, 2011 Through June 30, 2012...

  6. 78 FR 45176 - Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service Payment Rates, and Administrative Reimbursement Rates for Sponsoring Organizations of Day Care Homes for the Period July 1, 2013 Through June 30, 2014...

  7. 76 FR 43254 - Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service Payment Rates, and Administrative Reimbursement Rates for Sponsoring Organizations of Day Care Homes for the Period July 1, 2011 Through June 30, 2012...

  8. Results of external quality-assurance program for the National Atmospheric Deposition Program and National Trends Network during 1985

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1988-01-01

    External quality assurance monitoring of the National Atmospheric Deposition Program (NADP) and National Trends Network (NTN) was performed by the U.S. Geological Survey during 1985. The monitoring consisted of three primary programs: (1) an intersite comparison program designed to assess the precision and accuracy of onsite pH and specific conductance measurements made by NADP and NTN site operators; (2) a blind audit sample program designed to assess the effect of routine field handling on the precision and bias of NADP and NTN wet deposition data; and (3) an interlaboratory comparison program designed to compare analytical data from the laboratory processing NADP and NTN samples with data produced by other laboratories routinely analyzing wet deposition samples and to provide estimates of individual laboratory precision. An average of 94% of the site operators participated in the four voluntary intersite comparisons during 1985. A larger percentage of participating site operators met the accuracy goal for specific conductance measurements (average, 87%) than for pH measurements (average, 67%). Overall precision was dependent on the actual specific conductance of the test solution and independent of the pH of the test solution. Data for the blind audit sample program indicated slight positive biases resulting from routine field handling for all analytes except specific conductance. These biases were not large enough to be significant for most data users. Data for the blind audit sample program also indicated that decreases in hydrogen ion concentration were accompanied by decreases in specific conductance. Precision estimates derived from the blind audit sample program indicate that the major source of uncertainty in wet deposition data is the routine field handling that each wet deposition sample receives. Results of the interlaboratory comparison program were similar to results of previous years' evaluations, indicating that the participating laboratories

  9. 75 FR 41793 - Child and Adult Care Food Program: National Average Payment Rates, Day Care Home Food Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-19

    ... adjustments to the national average payment rates for meals and snacks served in child care centers, outside... payment rates for meals and snacks served in day care homes; and the administrative reimbursement rates...] Lunch and Centers Breakfast supper \\1\\ Snack Contiguous States: Paid 0.26 0.26 0.06 Reduced Price 1.18 2...

  10. Computer usage and national energy consumption: Results from a field-metering study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean of 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated
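
    As a rough cross-check of the figures above, the national estimates and the per-unit annual energy consumption (AEC) values are linked by national use = installed stock × unit AEC. The short Python sketch below back-calculates the installed base implied by the rounded numbers quoted in the abstract; these implied stock figures are derived illustrations, not values reported by the study.

        # Back-of-the-envelope check of national_use = stock * unit_AEC, using only
        # the rounded figures quoted in the abstract above.  The implied installed
        # bases are derived illustrations, not values reported by the study.
        desktop_aec_kwh = 194        # mean unit annual energy consumption (kWh/yr)
        laptop_aec_kwh = 75
        desktop_national_twh = 20    # estimated national annual consumption (TWh/yr)
        laptop_national_twh = 11

        kwh_per_twh = 1e9            # 1 TWh = 1e9 kWh
        implied_desktops = desktop_national_twh * kwh_per_twh / desktop_aec_kwh
        implied_laptops = laptop_national_twh * kwh_per_twh / laptop_aec_kwh
        print(f"implied desktop stock ~ {implied_desktops / 1e6:.0f} million units")
        print(f"implied laptop stock  ~ {implied_laptops / 1e6:.0f} million units")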

  11. Concussion Incidence and Return-to-Play Time in National Basketball Association Players: Results From 2006 to 2014.

    PubMed

    Padaki, Ajay S; Cole, Brian J; Ahmad, Christopher S

    2016-09-01

    Various research efforts have studied concussions in the National Football League, Major League Baseball, and the National Hockey League. However, no study has investigated the incidence and return-to-play trends in the National Basketball Association (NBA), which this study aims to do. Increased media scrutiny and public awareness, in addition to the institution of a league-wide concussion protocol, may have resulted in more conservative return-to-play practices. Descriptive epidemiology study. All concussions to NBA players that were publicly reported in the media from the beginning of the 2006 NBA season to the end of the 2014 season were included. The incidence and return-to-play statistics were generated by synthesizing information from publicly available records. There were 134 publicly reported concussions to NBA players from the beginning of the 2006 season to the conclusion of the 2014 season, resulting in an average of 14.9 concussions per season. The incidence has not changed significantly during this time span. The average games missed after a concussion from 2006 to 2010 was 1.6, significantly less than the 5.0 games missed from 2011 to 2014, following the institution of the NBA concussion protocol (P = .023). Although the incidence of publicly reported concussions in the NBA has not changed appreciably over the past 9 seasons, the time missed after a concussion has. While players often returned in the same game in the 2006 season, the combination of implemented policy, national coverage, medical staff awareness, and player education may have contributed to players now missing an average of 4 to 6 games after a concussion. A multitude of factors has resulted in more conservative return-to-play practices for NBA players after concussions. © 2016 The Author(s).

  12. Estimation of national forest visitor spending averages from National Visitor Use Monitoring: round 2

    Treesearch

    Eric M. White; Darren B. Goodding; Daniel J. Stynes

    2013-01-01

    The economic linkages between national forests and surrounding communities have become increasingly important in recent years. One way national forests contribute to the economies of surrounding communities is by attracting recreation visitors who, as part of their trip, spend money in communities on the periphery of the national forest. We use survey data collected...

  13. Nasal potential difference: Best or average result for CFTR function as diagnostic criteria for cystic fibrosis?

    PubMed

    Keenan, Katherine; Avolio, Julie; Rueckes-Nilges, Claudia; Tullis, Elizabeth; Gonska, Tanja; Naehrlich, Lutz

    2015-05-01

    The current practice of averaging the nasal potential difference (NPD) results of right and left nostril measurements reduces inter-individual variability but may underestimate individual CFTR function. The best NPD response to Cl(-)-free and isoproterenol perfusion (i.e., the largest ΔPD(0Cl/Iso)) from the right and left nostril was compared to the average result in 13 cystic fibrosis (CF) patients, 78 query-CF patients, and 22 healthy controls from 2 cohorts. Despite moderate to good correlation (p<0.001) between right and left measured ΔPD(0Cl/Iso), we observed large differences in some individuals. A comparison of average versus best ΔPD(0Cl/Iso) showed only moderate agreement (Giessen κ=0.538; Toronto κ=0.607). Averaging ΔPD(0Cl/Iso) showed a lower composite chloride response compared to best ΔPD(0Cl/Iso) and altered diagnostic NPD interpretation in 30 of 113 (27%) subjects. The current practice of averaging the NPD results of right and left nostril measurements leads to an underestimation of the individual CFTR function and should be reconsidered. Copyright © 2014 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  14. Virtual Averaging Making Nonframe-Averaged Optical Coherence Tomography Images Comparable to Frame-Averaged Images

    PubMed Central

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Kagemann, Larry; Schuman, Joel S.

    2016-01-01

    Purpose: Developing a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Methods: Twenty-one eyes of 21 healthy volunteers were scanned with a noneye-tracking nonframe-averaged OCT device and an active eye-tracking frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise (SNR), contrast-to-noise ratios (CNR), and the distance between the end of visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. Results: All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. Signal-to-noise and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that were no longer significant after processing. Conclusion: The virtual averaging method successfully improved nontracking nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Translational Relevance: Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects. PMID:26835180
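
    As loosely described in the Methods above, virtual averaging builds multiple virtual frames from a single non-frame-averaged B-scan by resampling voxels and perturbing the amplitude, then averages them. The Python sketch below is only a rough interpretation of that idea; the jitter and noise magnitudes, the function name, and the use of scipy's map_coordinates are assumptions for illustration, not the paper's implementation.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def virtual_average(bscan, repeats=15, jitter_px=0.5, amp_sigma=0.05, seed=0):
            """Average `repeats` virtual frames built from one B-scan by resampling
            at slightly jittered voxel positions and perturbing the amplitude.
            All parameter values are illustrative assumptions."""
            rng = np.random.default_rng(seed)
            rows, cols = np.indices(bscan.shape).astype(float)
            frames = []
            for _ in range(repeats):
                dr = rng.normal(scale=jitter_px, size=bscan.shape)
                dc = rng.normal(scale=jitter_px, size=bscan.shape)
                resampled = map_coordinates(bscan, [rows + dr, cols + dc],
                                            order=1, mode='nearest')
                noisy = resampled * (1.0 + rng.normal(scale=amp_sigma, size=bscan.shape))
                frames.append(noisy)
            return np.mean(frames, axis=0)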

  15. [Analysis of results of Assessment on National Parasitic Disease Control and Prevention Techniques in 2015].

    PubMed

    Yao, Ruan; Li-Ying, Wang; Ting-Jun, Zhu; Men-Bao, Qian; Chun-Li, Cao; Yu-Wan, Hao; Tian, Tian; Shi-Zhu, Li

    2017-03-01

    To assess the theoretical knowledge and practical skills regarding parasitic diseases among technicians from disease control and prevention institutions. The Assessment on National Parasitic Disease Control and Prevention Techniques was organized in September 2015. In total, 124 subjects from disease control and prevention institutions at province, prefecture or county levels in 31 provinces joined the assessment. A database was built consisting of subjects' basic information and assessment scores. Statistical analysis was used to analyze the scores by gender, age, professional title, institutions and places of participants. The average total score of all the subjects was 123.3, with a passing rate of 57.3%. The average scores of male subjects (48 subjects) and female subjects (76 subjects) were 125.9 and 121.7 respectively; the average scores of the subjects aged under 30 years (57 subjects), between 30 and 40 years (61 subjects) and above 40 years (6 subjects) were 119.6, 128.1 and 111.2 respectively; the average scores of persons with junior (94 subjects), intermediate (28 subjects) and senior (2 subjects) professional titles were 119.2, 135.9 and 140.5 respectively. The average theoretical assessment score of all the subjects was 61.9, with a passing rate of 62.9%. The average practical skill assessment score of all the subjects was 61.4, with a passing rate of 58.1%. The theoretical assessment scores ranged widely. The theoretical knowledge results of technicians from disease control and prevention institutions were generally low. Therefore, targeted training based on daily work needs to be strengthened.

  16. Quaternion Averaging

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Cheng, Yang; Crassidis, John L.; Oshman, Yaakov

    2007-01-01

    Many applications require an algorithm that averages quaternions in an optimal manner. For example, when combining the quaternion outputs of multiple star trackers having this output capability, it is desirable to properly average the quaternions without recomputing the attitude from the raw star tracker data. Other applications requiring some sort of optimal quaternion averaging include particle filtering and multiple-model adaptive estimation, where weighted quaternions are used to determine the quaternion estimate. For spacecraft attitude estimation applications, prior work derives an optimal averaging scheme to compute the average of a set of weighted attitude matrices using the singular value decomposition method. Focusing on a 4-dimensional quaternion Gaussian distribution on the unit hypersphere, related work provides an approach to computing the average quaternion by minimizing a quaternion cost function that is equivalent to the attitude matrix cost function. Motivated by and extending its results, this Note derives an algorithm that determines an optimal average quaternion from a set of scalar- or matrix-weighted quaternions. Furthermore, a sufficient condition for the uniqueness of the average quaternion, and the equivalence of the minimization problem, stated herein, to maximum likelihood estimation, are shown.
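
    For scalar weights, the optimal average described above reduces to taking the eigenvector associated with the largest eigenvalue of the weighted sum of quaternion outer products. The numpy sketch below follows that eigenvalue formulation; the function name and example values are illustrative and not taken from the Note.

        import numpy as np

        def average_quaternions(quats, weights=None):
            """Weighted average of unit quaternions (one quaternion per row).
            Returns the eigenvector of M = sum_i w_i * q_i q_i^T with the largest
            eigenvalue, which is insensitive to the sign ambiguity q ~ -q."""
            quats = np.asarray(quats, dtype=float)
            if weights is None:
                weights = np.ones(len(quats))
            # 4x4 weighted sum of outer products.
            M = np.einsum('i,ij,ik->jk', np.asarray(weights, dtype=float), quats, quats)
            eigvals, eigvecs = np.linalg.eigh(M)      # ascending eigenvalues
            q_avg = eigvecs[:, -1]                    # eigenvector of the largest one
            return q_avg / np.linalg.norm(q_avg)

        # Example: two nearly identical attitudes written with opposite quaternion
        # signs still average to a sensible unit quaternion.
        q1 = np.array([0.0, 0.0, 0.0, 1.0])
        q2 = -np.array([0.0, 0.0, np.sin(0.01), np.cos(0.01)])
        print(average_quaternions([q1, q2]))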

  17. Monthly average polar sea-ice concentration

    USGS Publications Warehouse

    Schweitzer, Peter N.

    1995-01-01

    The data contained in this CD-ROM depict monthly averages of sea-ice concentration in the modern polar oceans. These averages were derived from the Scanning Multichannel Microwave Radiometer (SMMR) and Special Sensor Microwave/Imager (SSM/I) instruments aboard satellites of the U.S. Air Force Defense Meteorological Satellite Program from 1978 through 1992. The data are provided as 8-bit images using the Hierarchical Data Format (HDF) developed by the National Center for Supercomputing Applications.

  18. External quality-assurance results for the National Atmospheric Deposition Program and the National Trends Network during 1986

    USGS Publications Warehouse

    See, Randolph B.; Schroder, LeRoy J.; Willoughby, Timothy C.

    1988-01-01

    During 1986, the U.S. Geological Survey operated three programs to provide external quality-assurance monitoring of the National Atmospheric Deposition Program and National Trends Network. An intersite-comparison program was used to assess the accuracy of onsite pH and specific-conductance determinations at quarterly intervals. The blind-audit program was used to assess the effect of routine sample handling on the precision and bias of program and network wet-deposition data. Analytical results from four laboratories, which routinely analyze wet-deposition samples, were examined to determine if differences existed between laboratory analytical results and to provide estimates of the analytical precision of each laboratory. An average of 78 and 89 percent of the site operators participating in the intersite-comparison met the network goals for pH and specific conductance. A comparison of analytical values versus actual values for samples submitted as part of the blind-audit program indicated that analytical values were slightly but significantly (α = 0.01) larger than actual values for pH, magnesium, sodium, and sulfate; analytical values for specific conductance were slightly less than actual values. The decreased precision in the analyses of blind-audit samples when compared to interlaboratory studies indicates that a large amount of uncertainty in network deposition data may be a result of routine field operations. The results of the interlaboratory comparison study indicated that the magnitude of the difference between laboratory analyses was small for all analytes. Analyses of deionized, distilled water blanks by participating laboratories indicated that the laboratories had difficulty measuring analyte concentrations near their reported detection limits. (USGS)

  19. 75 FR 41796 - National School Lunch, Special Milk, and School Breakfast Programs, National Average Payments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-19

    ..., afterschool snacks and breakfasts served to children participating in the National School Lunch and School... Factors and to the maximum Federal reimbursement rates for lunches and afterschool snacks served to... afterschool snacks served under the National School Lunch Program are rounded down to the nearest whole cent...

  20. Focus on Teacher Salaries: What Teacher Salary Averages Don't Show.

    ERIC Educational Resources Information Center

    Gaines, Gale

    Traditional comparisons of teacher salary averages fail to consider factors beyond pay raises that affect those averages. Salary averages do not show: regional and national variations among states' average salaries; the variation of salaries within an individual state; variations in the cost of living; the highest degree earned by teachers and the…

  1. Manufacturer's Policies Concerning Average Fuel Economy Standards

    DOT National Transportation Integrated Search

    1979-01-01

    The National Highway Traffic Safety Administration (NHTSA) has been given the responsibility for implementing the average fuel economy standards for passenger automobiles mandated by the Energy Policy and Conservation Act (P.L. 94-163). The standards...

  2. Vegetable and Fruit Intakes of On-Reserve First Nations Schoolchildren Compared to Canadian Averages and Current Recommendations

    PubMed Central

    Gates, Allison; Hanning, Rhona M.; Gates, Michelle; Skinner, Kelly; Martin, Ian D.; Tsuji, Leonard J. S.

    2012-01-01

    This study investigated, in on-reserve First Nations (FN) youth in Ontario, Canada, the following: (a) the intakes of vegetable and fruit, “other” foods and relevant nutrients as compared to current recommendations and national averages, (b) current prevalence rates of overweight and obesity and (c) the relationship between latitude and dietary intakes. Twenty-four-hour diet recalls were collected via the Waterloo Web-Based Eating Behaviour Questionnaire (WEB-Q) (n = 443). Heights and weights of participants were self reported using measured values and Body Mass Index was categorized using the International Obesity Task Force cutoffs. Food group and nutrient intakes were compared to current standards, Southern Ontario Food Behaviour data and the Canadian Community Health Survey, Cycle 2.2, using descriptive statistics. Mean vegetable and fruit, fibre and folate intakes were less than current recommendations. Girls aged 14–18 years had mean intakes of vitamin A below current recommendations for this sub-group; for all sub-groups, mean intakes of vegetables and fruit were below Canadian averages. All sub-groups also had intakes of all nutrients and food groups investigated that were less than those observed in non-FN youth from Southern Ontario, with the exception of “other” foods in boys 12–18 years. Prevalence rates of overweight and obesity were 31.8% and 19.6%, respectively, exceeding rates in the general population. Dietary intakes did not vary consistently by latitude (n = 248), as revealed by ANOVA. This study provided a unique investigation of the dietary intakes of on-reserve FN youth in Ontario and revealed poor intakes of vegetables and fruit and related nutrients and high intakes of “other” foods. Prevalence rates of overweight and obesity exceed those of the general population. PMID:22690200

  3. Average probability that a "cold hit" in a DNA database search results in an erroneous attribution.

    PubMed

    Song, Yun S; Patil, Anand; Murphy, Erin E; Slatkin, Montgomery

    2009-01-01

    We consider a hypothetical series of cases in which the DNA profile of a crime-scene sample is found to match a known profile in a DNA database (i.e., a "cold hit"), resulting in the identification of a suspect based only on genetic evidence. We show that the average probability that there is another person in the population whose profile matches the crime-scene sample but who is not in the database is approximately 2(N - d)p(A), where N is the number of individuals in the population, d is the number of profiles in the database, and p(A) is the average match probability (AMP) for the population. The AMP is estimated by computing the average of the probabilities that two individuals in the population have the same profile. We show further that if a priori each individual in the population is equally likely to have left the crime-scene sample, then the average probability that the database search attributes the crime-scene sample to a wrong person is (N - d)p(A).
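
    To make the quoted expressions concrete, the short sketch below simply evaluates them for hypothetical round numbers; N, d, and the average match probability p_A are placeholders, not values from the study.

        # Evaluate the expressions quoted in the abstract above with hypothetical
        # inputs (population size N, database size d, average match probability p_A).
        N = 10_000_000     # individuals in the population (hypothetical)
        d = 500_000        # profiles in the database (hypothetical)
        p_A = 1e-9         # average match probability (hypothetical)

        p_other_match = 2 * (N - d) * p_A      # someone not in the database also matches
        p_wrong_attribution = (N - d) * p_A    # sample attributed to the wrong person

        print(f"P(unprofiled person also matches) ~ {p_other_match:.2e}")
        print(f"P(wrong attribution, uniform prior) ~ {p_wrong_attribution:.2e}")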

  4. National wildlife refuge visitor survey results: 2010/2011

    USGS Publications Warehouse

    Sexton, Natalie R.; Dietsch, Alia M.; Don Carolos, Andrew W.; Miller, Holly M.; Koontz, Lynne M.; Solomon, Adam N.

    2012-01-01

    The U.S. Fish and Wildlife Service (Service) collaborated with the U.S. Geological Survey to conduct a national survey of visitors regarding their experiences on national wildlife refuges. The survey was conducted to better understand visitor needs and experiences and to design programs and facilities that respond to those needs. The survey results will inform Service performance planning, budget, and communications goals. Results will also inform Comprehensive Conservation Plan (CCPs), Visitor Services, and Transportation Planning processes. The survey was conducted on 53 refuges across the National Wildlife Refuge System (Refuge System) to better understand visitor needs and experiences and to design programs and facilities that respond to those needs. A total of 14,832 visitors agreed to participate in the survey between July 2010 and November 2011. In all, 10,233 visitors completed the survey for a 71% response rate. This report provides a summary of visitor and trip characteristics; visitor opinions about refuges and their offerings; and visitor opinions about alternative transportation and climate change, two Refuge System topics of interest. The Refuge System, established in 1903 and managed by the Service, is the leading network of protected lands and waters in the world dedicated to the conservation of fish, wildlife and their habitats. There are 556 National Wildlife Refuges and 38 wetland management districts nationwide, encompassing more than 150 million acres. The Refuge System attracts more than 45 million visitors annually, including 25 million people per year to observe and photograph wildlife, over 9 million to hunt and fish, and more than 10 million to participate in educational and interpretation programs. Understanding visitors and characterizing their experiences on national wildlife refuges are critical elements of managing these lands and meeting the goals of the Refuge System. These combined results are based on surveying at 53 participating

  5. Focus on Teacher Salaries: An Update on Average Salaries and Recent Legislative Actions in the SREB States.

    ERIC Educational Resources Information Center

    Gaines, Gale F.

    Focused state efforts have helped teacher salaries in Southern Regional Education Board (SREB) states move toward the national average. Preliminary 2000-01 estimates put SREB's average teacher salary at its highest point in 22 years compared to the national average. The SREB average teacher salary is approximately 90 percent of the national…

  6. Virtual Averaging Making Nonframe-Averaged Optical Coherence Tomography Images Comparable to Frame-Averaged Images.

    PubMed

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A; Kagemann, Larry; Schuman, Joel S

    2016-01-01

    Developing a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Twenty-one eyes of 21 healthy volunteers were scanned with a noneye-tracking nonframe-averaged OCT device and an active eye-tracking frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise (SNR), contrast-to-noise ratios (CNR), and the distance between the end of visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. Signal-to-noise and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that were no longer significant after processing. The virtual averaging method successfully improved nontracking nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects.

  7. National wildlife refuge visitor survey 2012--Individual refuge results

    USGS Publications Warehouse

    Dietsch, Alia M.; Sexton, Natalie R.; Koontz, Lynne M.; Conk, Shannon J.

    2013-01-01

    The National Wildlife Refuge System (Refuge System), established in 1903 and managed by the U.S. Fish and Wildlife Service (Service), is the leading network of protected lands and waters in the world dedicated to the conservation of fish, wildlife and their habitats. There are 560 national wildlife refuges and 38 wetland management districts nationwide, encompassing more than 150 million acres. The Refuge System attracts nearly 45 million visitors annually, including 34.8 million people who observe and photograph wildlife, 9.6 million who hunt and fish, and nearly 675,000 teachers and students who use refuges as outdoor classrooms. Understanding visitor perceptions of refuges and characterizing their experiences on refuges are critical elements of managing these lands and meeting the goals of the Refuge System. The Service collaborated with the U.S. Geological Survey to conduct a national survey of visitors regarding their experiences on national wildlife refuges. The purpose of the survey was to better understand visitor experiences and trip characteristics, to gauge visitors’ levels of satisfaction with existing recreational opportunities, and to garner feedback to inform the design of programs and facilities. The survey results will inform performance, planning, budget, and communications goals. Results will also inform Comprehensive Conservation Plans (CCPs), visitor services, and transportation planning processes. This Data Series consists of 25 separate data files. Each file describes the results of the survey for an individual refuge and contains the following information: • Introduction: An overview of the Refuge System and the goals of the national surveying effort. • Methods: The procedures for the national surveying effort, including selecting refuges, developing the survey instrument, contacting visitors, and guidance for interpreting the results. • Refuge Description: A brief description of the refuge location, acreage, purpose, recreational

  8. Alternatives to the Moving Average

    Treesearch

    Paul C. van Deusen

    2001-01-01

    There are many possible estimators that could be used with annual inventory data. The 5-year moving average has been selected as a default estimator to provide initial results for states having available annual inventory data. User objectives for these estimates are discussed. The characteristics of a moving average are outlined. It is shown that moving average...
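
    For reference, a minimal sketch of the plain (unweighted) 5-year moving-average estimator that the abstract treats as the default is shown below; the annual values are hypothetical and the code is not taken from the cited report.

        import numpy as np

        def moving_average(annual_values, window=5):
            """Unweighted moving average over `window` consecutive annual estimates.
            An estimate is produced only once a full window of years is available."""
            kernel = np.ones(window) / window
            return np.convolve(np.asarray(annual_values, dtype=float), kernel, mode='valid')

        # Hypothetical annual inventory estimates (e.g., a growing-stock volume index).
        annual = [102, 98, 105, 110, 107, 111, 115]
        print(moving_average(annual))   # averages of years 1-5, 2-6, and 3-7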

  9. Test-to-Test Repeatability of Results From a Subsonic Wing-Body Configuration in the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Pendergraft, Odis C., Jr.

    2000-01-01

    Results from three wind tunnel tests in the National Transonic Facility of a model of an advanced-technology, subsonic-transport wing-body configuration have been analyzed to assess the test-to-test repeatability of several aerodynamic parameters. The scatter, as measured by the prediction interval, in the longitudinal force and moment coefficients increases as the Mach number increases. Residual errors with and without the ESP tubes installed suggest a bias leading to lower drag with the tubes installed. Residual errors as well as average values of the longitudinal force and moment coefficients show that there are small bias errors between the different tests.

  10. National wildlife refuge visitor survey 2010/2011: Individual refuge results

    USGS Publications Warehouse

    Sexton, Natalie R.; Dietsch, Alia M.; Don Carlos, Andrew W.; Koontz, Lynne M.; Solomon, Adam N.; Miller, Holly M.

    2012-01-01

    The National Wildlife Refuge System (Refuge System), established in 1903 and managed by the U.S. Fish and Wildlife Service (Service), is the leading network of protected lands and waters in the world dedicated to the conservation of fish, wildlife and their habitats. There are 556 national wildlife refuges and 38 wetland management districts nationwide, encompassing more than 150 million acres. The Refuge System attracts more than 45 million visitors annually, including 25 million people per year to observe and photograph wildlife, over 9 million to hunt and fish, and more than 10 million to participate in educational and interpretation programs. Understanding visitors and characterizing their experiences on national wildlife refuges are critical elements of managing these lands and meeting the goals of the Refuge System. The Service collaborated with the U.S. Geological Survey to conduct a national survey of visitors regarding their experiences on national wildlife refuges. The survey was conducted to better understand visitor needs and experiences and to design programs and facilities that respond to those needs. The survey results will inform Service performance planning, budget, and communications goals. Results will also inform Comprehensive Conservation Plan (CCPs), Visitor Services, and Transportation Planning processes. This data series consists of 53 separate data files. Each file describes the results of the survey for an individual refuge and contains the following information: * Introduction: An overview of the Refuge System and the goals of the national surveying effort. * Methods: The procedures for the national surveying effort, including selecting refuges, developing the survey instrument, contacting visitors, and guidance for interpreting the results. * Refuge Description: A brief description of the refuge location, acreage, purpose, recreational activities, and visitation statistics, including a map (where available) and refuge website link

  11. 40 CFR 63.5710 - How do I demonstrate compliance using emissions averaging?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants for Boat Manufacturing Standards for Open... section to compute the weighted-average MACT model point value for each open molding resin and gel coat...

  12. Instantaneous, phase-averaged, and time-averaged pressure from particle image velocimetry

    NASA Astrophysics Data System (ADS)

    de Kat, Roeland

    2015-11-01

    Recent work on pressure determination using velocity data from particle image velocimetry (PIV) resulted in approaches that allow for instantaneous and volumetric pressure determination. However, applying these approaches is not always feasible (e.g. due to resolution, access, or other constraints) or desired. In those cases pressure determination approaches using phase-averaged or time-averaged velocity provide an alternative. To assess the performance of these different pressure determination approaches against one another, they are applied to a single data set and their results are compared with each other and with surface pressure measurements. For this assessment, the data set of a flow around a square cylinder (de Kat & van Oudheusden, 2012, Exp. Fluids 52:1089-1106) is used. RdK is supported by a Leverhulme Trust Early Career Fellowship.
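
    For context, pressure determination from PIV velocity data generally starts from the incompressible Navier-Stokes momentum equation, or from the pressure Poisson equation obtained by taking its divergence; the standard textbook forms (not specific to the data set above) are

        \nabla p = -\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right) + \mu\nabla^{2}\mathbf{u},
        \qquad
        \nabla^{2} p = -\rho\,\nabla\cdot\big[(\mathbf{u}\cdot\nabla)\mathbf{u}\big].

    In general, phase-averaged and time-averaged formulations carry additional terms involving the divergence of the corresponding Reynolds stresses, which is one reason the approaches compared above can give different results.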

  13. Results from the 2010 National Survey on Drug Use and Health: Summary of National Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2011

    2011-01-01

    This report presents a first look at results from the 2010 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. The report presents national estimates of rates of use, numbers of users, and other measures related to illicit drugs, alcohol,…

  14. Results of Ponseti Brasil Program: Multicentric Study in 1621 Feet: Preliminary Results.

    PubMed

    Nogueira, Monica P; Queiroz, Ana C D B F; Melanda, Alessandro G; Tedesco, Ana P; Brandão, Antonio L G; Beling, Claudio; Violante, Francisco H; Brandão, Gilberto F; Ferreira, Laura F A; Brambila, Leandro S; Leite, Leopoldina M; Zabeu, Jose L; Kim, Jung H; Fernandes, Kalyana E; Arima, Marcia A S; Aguilar, Maria D P Q; Farias Filho, Orlando C D; Oliveira Filho, Oscar B D A; Pinho, Solange D S; Moulin, Paulo; Volpi, Reinaldo; Fox, Mark; Greenwald, Miles F; Lyle, Brandon; Morcuende, Jose A

    The Ponseti method has been shown to be the most effective treatment for congenital clubfoot. The current challenge is to establish sustainable national clubfoot treatment programs that utilize the Ponseti method and integrate it within a nation's governmental health system. The Brazilian Ponseti Program (Programa Ponseti Brasil) has increased awareness of the utility of the Ponseti method and has trained >500 Brazilian orthopaedic surgeons in it. A group of 18 of those surgeons had been able to reproduce the Ponseti clubfoot treatment, and compiled their initial results through a structured spreadsheet. The study compiled 1040 patients for a total of 1621 feet. The average follow-up time was 2.3 years with an average correction time of approximately 3 months. Patients required an average of 6.40 casts to achieve correction. This study demonstrates that good initial correction rates are reproducible after training; of the 1040 patients, only 1.4% required a posteromedial release. Level IV.

  15. National School-Age Child Care Alliance (NSACCA): National Survey Results. Draft Report.

    ERIC Educational Resources Information Center

    Marx, Fern

    Presenting preliminary results of a National School-Age Child Care Alliance study of child care providers, this report is an initial analysis of 250 out of 427 questionnaires received as of April, 1993, representing practitioners in 40 states and 180 cities. Tables present data from responses to 16 items on the questionnaire soliciting information…

  16. Threaded average temperature thermocouple

    NASA Technical Reports Server (NTRS)

    Ward, Stanley W. (Inventor)

    1990-01-01

    A threaded average temperature thermocouple 11 is provided to measure the average temperature of a test situs of a test material 30. A ceramic insulator rod 15 with two parallel holes 17 and 18 through the length thereof is securely fitted in a cylinder 16, which is bored along the longitudinal axis of symmetry of threaded bolt 12. Threaded bolt 12 is composed of material having thermal properties similar to those of test material 30. Leads of a thermocouple wire 20 leading from a remotely situated temperature sensing device 35 are each fed through one of the holes 17 or 18, secured at head end 13 of ceramic insulator rod 15, and exit at tip end 14. Each lead of thermocouple wire 20 is bent into and secured in an opposite radial groove 25 in tip end 14 of threaded bolt 12. Resulting threaded average temperature thermocouple 11 is ready to be inserted into cylindrical receptacle 32. The tip end 14 of the threaded average temperature thermocouple 11 is in intimate contact with receptacle 32. A jam nut 36 secures the threaded average temperature thermocouple 11 to test material 30.

  17. Improving consensus structure by eliminating averaging artifacts

    PubMed Central

    KC, Dukka B

    2009-01-01

    Background: Common structural biology methods (i.e., NMR and molecular dynamics) often produce ensembles of molecular structures. Consequently, averaging of 3D coordinates of molecular structures (proteins and RNA) is a frequent approach to obtain a consensus structure that is representative of the ensemble. However, when the structures are averaged, artifacts can result in unrealistic local geometries, including unphysical bond lengths and angles. Results: Herein, we describe a method to derive representative structures while limiting the number of artifacts. Our approach is based on a Monte Carlo simulation technique that drives a starting structure (an extended or a 'close-by' structure) towards the 'averaged structure' using a harmonic pseudo energy function. To assess the performance of the algorithm, we applied our approach to Cα models of 1364 proteins generated by the TASSER structure prediction algorithm. The average RMSD of the refined model from the native structure for the set becomes worse by a mere 0.08 Å compared to the average RMSD of the averaged structures from the native structure (3.28 Å for refined structures and 3.36 Å for the averaged structures). However, the percentage of atoms involved in clashes is greatly reduced (from 63% to 1%); in fact, the majority of the refined proteins had zero clashes. Moreover, a small number (38) of refined structures resulted in lower RMSD to the native protein versus the averaged structure. Finally, compared to PULCHRA [1], our approach produces representative structures of similar RMSD quality, but with far fewer clashes. Conclusion: The benchmarking results demonstrate that our approach for removing averaging artifacts can be very beneficial for the structural biology community. Furthermore, the same approach can be applied to almost any problem where averaging of 3D coordinates is performed. Namely, structure averaging is also commonly performed in RNA secondary structure prediction [2], which could also benefit
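
    The core idea above (a Monte Carlo search driven by a harmonic pseudo-energy toward the averaged coordinates) can be sketched in a few lines, as below. The two-term energy (a harmonic pull toward the averaged structure plus a Cα-Cα bond restraint to keep local geometry physical) and all constants are illustrative assumptions, not the paper's actual energy function.

        import numpy as np

        def refine_toward_average(start, averaged, k_pull=1.0, k_bond=10.0, ca_ca=3.8,
                                  steps=20000, step_size=0.05, temperature=1.0, seed=0):
            """Metropolis Monte Carlo sketch: drive a starting C-alpha trace toward
            the averaged coordinates while keeping consecutive C-alpha distances
            near 3.8 A.  Parameters and functional form are illustrative only."""
            rng = np.random.default_rng(seed)
            coords = np.array(start, dtype=float)
            target = np.asarray(averaged, dtype=float)

            def energy(x):
                pull = k_pull * np.sum((x - target) ** 2)        # harmonic pull to average
                d = np.linalg.norm(np.diff(x, axis=0), axis=1)   # consecutive Ca-Ca distances
                bond = k_bond * np.sum((d - ca_ca) ** 2)         # keep bond lengths physical
                return pull + bond

            e = energy(coords)
            for _ in range(steps):
                i = rng.integers(len(coords))                    # perturb one residue at a time
                trial = coords.copy()
                trial[i] += rng.normal(scale=step_size, size=3)
                e_trial = energy(trial)
                if e_trial < e or rng.random() < np.exp((e - e_trial) / temperature):
                    coords, e = trial, e_trial                   # Metropolis acceptance
            return coords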

  18. Dissociating Averageness and Attractiveness: Attractive Faces Are Not Always Average

    ERIC Educational Resources Information Center

    DeBruine, Lisa M.; Jones, Benedict C.; Unger, Layla; Little, Anthony C.; Feinberg, David R.

    2007-01-01

    Although the averageness hypothesis of facial attractiveness proposes that the attractiveness of faces is mostly a consequence of their averageness, 1 study has shown that caricaturing highly attractive faces makes them mathematically less average but more attractive. Here the authors systematically test the averageness hypothesis in 5 experiments…

  19. National mortality rates: the impact of inequality?

    PubMed

    Wilkinson, R G

    1992-08-01

    Although health is closely associated with income differences within each country there is, at best, only a weak link between national mortality rates and average income among the developed countries. On the other hand, there is evidence of a strong relationship between national mortality rates and the scale of income differences within each society. These three elements are coherent if health is affected less by changes in absolute material standards across affluent populations than it is by relative income or the scale of income differences and the resulting sense of disadvantage within each society. Rather than socioeconomic mortality differentials representing a distribution around given national average mortality rates, it is likely that the degree of income inequality indicates the burden of relative deprivation on national mortality rates.

  20. Determining GPS average performance metrics

    NASA Technical Reports Server (NTRS)

    Moore, G. V.

    1995-01-01

    Analytic and semi-analytic methods are used to show that users of the GPS constellation can expect performance variations based on their location. Specifically, performance is shown to be a function of both altitude and latitude. These results stem from the fact that the GPS constellation is itself non-uniform. For example, GPS satellites are over four times as likely to be directly over Tierra del Fuego as over Hawaii or Singapore. Inevitable performance variations due to user location occur for ground, sea, air and space GPS users. These performance variations can be studied in an average relative sense. A semi-analytic tool which symmetrically allocates GPS satellite latitude belt dwell times among longitude points is used to compute average performance metrics. These metrics include average number of GPS vehicles visible, relative average accuracies in the radial, intrack and crosstrack (or radial, north/south, east/west) directions, and relative average PDOP or GDOP. The tool can be quickly changed to incorporate various user antenna obscuration models and various GPS constellation designs. Among other applications, tool results can be used in studies to: predict locations and geometries of best/worst case performance, design GPS constellations, determine optimal user antenna location and understand performance trends among various users.
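
    The constellation non-uniformity behind these averages can be reproduced with a few lines of Python. This is a sketch under the standard assumption of circular GPS orbits at roughly 55° inclination; the latitude values used for Hawaii and Tierra del Fuego are approximate, and it is the dwell-time density of the ground track, not any constellation file, that produces the disparity.

      import numpy as np

      INCLINATION = np.radians(55.0)  # nominal GPS orbital inclination
      # argument of latitude is uniform in time for a circular orbit
      u = np.linspace(0.0, 2.0 * np.pi, 1_000_000, endpoint=False)
      lat = np.degrees(np.arcsin(np.sin(INCLINATION) * np.sin(u)))  # sub-satellite latitude

      def dwell_fraction(lat_deg, center, half_width=2.5):
          """Fraction of an orbit spent within +/- half_width degrees of a given latitude."""
          return np.mean(np.abs(lat_deg - center) < half_width)

      hawaii = dwell_fraction(lat, 21.0)             # roughly Hawaii's latitude
      tierra_del_fuego = dwell_fraction(lat, 54.0)   # roughly Tierra del Fuego's latitude
      print(f"dwell ratio (54 deg / 21 deg): {tierra_del_fuego / hawaii:.1f}")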

  1. Parents' Reactions to Finding Out That Their Children Have Average or above Average IQ Scores.

    ERIC Educational Resources Information Center

    Dirks, Jean; And Others

    1983-01-01

    Parents of 41 children who had been given an individually-administered intelligence test were contacted 19 months after testing. Parents of average IQ children were less accurate in their memory of test results. Children with above average IQ experienced extremely low frequencies of sibling rivalry, conceit or pressure. (Author/HLM)

  2. Numerical and experimental research on pentagonal cross-section of the averaging Pitot tube

    NASA Astrophysics Data System (ADS)

    Zhang, Jili; Li, Wei; Liang, Ruobing; Zhao, Tianyi; Liu, Yacheng; Liu, Mingsheng

    2017-07-01

    Averaging Pitot tubes have been widely used in many fields because of their simple structure and stable performance. This paper introduces a new shape of the cross-section of an averaging Pitot tube. Firstly, the structure of the averaging Pitot tube and the distribution of pressure taps are given. Then, a mathematical model of the airflow around it is formulated. After that, a series of numerical simulations are carried out to optimize the geometry of the tube. The distribution of the streamline and pressures around the tube are given. To test its performance, a test platform was constructed in accordance with the relevant national standards and is described in this paper. Curves are provided, linking the values of flow coefficient with the values of Reynolds number. With a maximum deviation of only  ±3%, the results of the flow coefficient obtained from the numerical simulations were in agreement with those obtained from experimental methods. The proposed tube has a stable flow coefficient and favorable metrological characteristics.
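
    A flow coefficient of the kind reported above relates the reference flow rate to the flow implied by the averaged differential pressure. The sketch below is illustrative only: the duct area, air density, and calibration readings are made-up values, not data from the paper.

      import numpy as np

      RHO = 1.2     # air density, kg/m^3 (illustrative)
      AREA = 0.09   # duct cross-sectional area, m^2 (illustrative 300 mm x 300 mm duct)

      def flow_coefficient(q_reference, delta_p):
          """K = Q_ref / (A * sqrt(2*dp/rho)); the indicated velocity comes from the averaged differential pressure."""
          v_indicated = np.sqrt(2.0 * delta_p / RHO)
          return q_reference / (AREA * v_indicated)

      # made-up calibration points: reference flow (m^3/s) and averaged differential pressure (Pa)
      q_ref   = np.array([0.27, 0.45, 0.63, 0.81])
      delta_p = np.array([14.0, 39.0, 76.0, 126.0])

      k = flow_coefficient(q_ref, delta_p)
      deviation = (k - k.mean()) / k.mean() * 100.0
      print("flow coefficients:", np.round(k, 3))
      print("deviation from mean (%):", np.round(deviation, 2))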

  3. AVERAGE ANNUAL SOLAR UV DOSE OF THE CONTINENTAL US CITIZEN

    EPA Science Inventory

    The average annual solar UV dose of US citizens is not known, but is required for relative risk assessments of skin cancer from UV-emitting devices. We solved this problem using a novel approach. The EPA's "National Human Activity Pattern Survey" recorded the daily ou...

  4. 49 CFR 525.11 - Termination of exemption; amendment of alternative average fuel economy standard.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Termination of exemption; amendment of alternative average fuel economy standard. 525.11 Section 525.11 Transportation Other Regulations Relating to Transportation (Continued) NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EXEMPTIONS FROM AVERAGE FUEL ECONOMY STANDARDS...

  5. Grade Point Average: What's Wrong and What's the Alternative?

    ERIC Educational Resources Information Center

    Soh, Kay Cheng

    2011-01-01

    Grade point average (GPA) has been around for more than two centuries. However, it has created a great deal of confusion, frustration, and anxiety for GPA producers and users alike, especially when used across nations for different purposes. This paper looks into the reasons for such a state of affairs from the perspective of educational measurement. It…

  6. National mortality rates: the impact of inequality?

    PubMed Central

    Wilkinson, R G

    1992-01-01

    Although health is closely associated with income differences within each country there is, at best, only a weak link between national mortality rates and average income among the developed countries. On the other hand, there is evidence of a strong relationship between national mortality rates and the scale of income differences within each society. These three elements are coherent if health is affected less by changes in absolute material standards across affluent populations than it is by relative income or the scale of income differences and the resulting sense of disadvantage within each society. Rather than socioeconomic mortality differentials representing a distribution around given national average mortality rates, it is likely that the degree of income inequality indicates the burden of relative deprivation on national mortality rates. PMID:1636827

  7. State and national household concentrations of PM2.5 from solid cookfuel use: Results from measurements and modeling in India for estimation of the global burden of disease

    PubMed Central

    2013-01-01

    Background Previous global burden of disease (GBD) estimates for household air pollution (HAP) from solid cookfuel use were based on categorical indicators of exposure. Recent progress in GBD methodologies that use integrated–exposure–response (IER) curves for combustion particles required the development of models to quantitatively estimate average HAP levels experienced by large populations. Such models can also serve to inform public health intervention efforts. Thus, we developed a model to estimate national household concentrations of PM2.5 from solid cookfuel use in India, together with estimates for 29 states. Methods We monitored 24-hr household concentrations of PM2.5 in 617 rural households from 4 states in India on a cross-sectional basis between November 2004 and March 2005. We then developed log-linear regression models that predict household concentrations as a function of multiple, independent household level variables available in national household surveys and generated national and state estimates using the Indian National Family Health Survey (NFHS 2005). Results The measured mean 24-hr concentration of PM2.5 in solid cookfuel using households ranged from 163 μg/m3 (95% CI: 143,183; median 106; IQR: 191) in the living area to 609 μg/m3 (95% CI: 547,671; median: 472; IQR: 734) in the kitchen area. Fuel type, kitchen type, ventilation, geographical location and cooking duration were found to be significant predictors of PM2.5 concentrations in the household model. k-fold cross validation showed a fair degree of correlation (r = 0.56) between modeled and measured values. Extrapolation of the household results by state to all solid cookfuel-using households in India, covered by NFHS 2005, resulted in a modeled estimate of 450 μg/m3 (95% CI: 318,640) and 113 μg/m3 (95% CI: 102,127), for national average 24-hr PM2.5 concentrations in the kitchen and living areas, respectively. Conclusions The model affords substantial improvement
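
    A hedged sketch of the modeling step described in the abstract: an ordinary least-squares fit of log(PM2.5) on categorical household predictors, followed by back-transformation to predict concentrations. The predictor names, categories, coefficients, and synthetic data below are illustrative assumptions, not the published model.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 500

      # synthetic household predictors (1 = solid fuel, enclosed kitchen, poor ventilation)
      solid_fuel = rng.integers(0, 2, n)
      enclosed   = rng.integers(0, 2, n)
      poor_vent  = rng.integers(0, 2, n)

      # synthetic "measured" 24-hr kitchen PM2.5, generated from an assumed log-linear truth
      log_pm25 = 4.5 + 0.9 * solid_fuel + 0.5 * enclosed + 0.4 * poor_vent + rng.normal(0, 0.5, n)

      # design matrix with an intercept, fit by ordinary least squares on the log scale
      X = np.column_stack([np.ones(n), solid_fuel, enclosed, poor_vent])
      beta, *_ = np.linalg.lstsq(X, log_pm25, rcond=None)

      def predict_pm25(solid, encl, vent):
          """Back-transform the log-linear prediction to a concentration in ug/m^3."""
          return float(np.exp(beta @ np.array([1.0, solid, encl, vent])))

      print("predicted PM2.5, solid fuel + enclosed kitchen + poor ventilation:", round(predict_pm25(1, 1, 1), 1))
      print("predicted PM2.5, clean fuel + open, ventilated kitchen:", round(predict_pm25(0, 0, 0), 1))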

  8. What's in a country average? Wealth, gender, and regional inequalities in immunization in India.

    PubMed

    Pande, Rohini P; Yazbeck, Abdo S

    2003-12-01

    Recent attention to Millennium Development Goals by the international development community has led to the formation of targets to measure country-level achievements, including achievements on health status indicators such as childhood immunization. Using the example of immunization in India, this paper demonstrates the importance of disaggregating national averages for a better understanding of social disparities in health. Specifically, the paper uses data from the India National Family Health Survey 1992-93 to analyze socioeconomic, gender, urban-rural and regional inequalities in immunization in India for each of the 17 largest states. Results show that, on average, southern states have better immunization levels and lower immunization inequalities than many northern states. Wealth and regional inequalities are correlated with overall levels of immunization in a non-linear fashion. Gender inequalities persist in most states, including in the south, and seem unrelated to overall immunization or the levels of other inequalities measured here. This suggests that the gender differentials reflect deep-seated societal factors rather than health system issues per se. The disaggregated information and analysis used in this paper allows for setting more meaningful targets than country averages. Additionally, it helps policy makers and planners to understand programmatic constraints and needs by identifying disparities between sub-groups of the population, including strong and weak performers at the state and regional levels.

  9. Applying national survey results for strategic planning and program improvement: the National Diabetes Education Program.

    PubMed

    Griffey, Susan; Piccinino, Linda; Gallivan, Joanne; Lotenberg, Lynne Doner; Tuncer, Diane

    2015-02-01

    Since the 1970s, the federal government has spearheaded major national education programs to reduce the burden of chronic diseases in the United States. These prevention and disease management programs communicate critical information to the public, those affected by the disease, and health care providers. The National Diabetes Education Program (NDEP), the leading federal program on diabetes sponsored by the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC), uses primary and secondary quantitative data and qualitative audience research to guide program planning and evaluation. Since 2006, the NDEP has filled the gaps in existing quantitative data sources by conducting its own population-based survey, the NDEP National Diabetes Survey (NNDS). The NNDS is conducted every 2–3 years and tracks changes in knowledge, attitudes and practice indicators in key target audiences. This article describes how the NDEP has used the NNDS as a key component of its evaluation framework and how it applies the survey results for strategic planning and program improvement. The NDEP's use of the NNDS illustrates how a program evaluation framework that includes periodic population-based surveys can serve as an evaluation model for similar national health education programs.

  10. EPA's National Dioxin Air Monitoring Network (NDAMN): Design, implementation, and final results

    NASA Astrophysics Data System (ADS)

    Lorber, Matthew; Ferrario, Joseph; Byrne, Christian

    2013-10-01

    The U.S. Environmental Protection Agency (U.S. EPA) established the National Dioxin Air Monitoring Network (NDAMN) in June of 1998, and operated it until November of 2004. The objective of NDAMN was to determine background air concentrations of polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), and dioxin-like polychlorinated biphenyls (dl-PCBs). NDAMN started with 10 sampling sites, adding more over time until the final count of 34 sites was reached by the beginning of 2003. Samples were taken quarterly, and the final sample count was 685. All samples were measured for 17 PCDD/PCDF congeners, 8 PCDD/PCDF homologue groups, and 7 dl-PCBs (note: 5 additional dl-PCBs were added for samples starting in the summer of 2002; 317 samples had measurements of 12 dl-PCBs). The overall average total toxic equivalent (TEQ) concentration in the United States was 11.2 fg TEQ m-3 with dl-PCBs contributing 0.8 fg TEQ m-3 (7%) to this total. The archetype dioxin and furan background air congener profile was seen in the survey averages and in most individual samples. This archetype profile is characterized by low and similar concentrations for tetra - through hexa PCDD/PCDF congeners, with elevations in four congeners - a hepta dioxin and furan congener, and both octa congeners. Sites were generally categorized as urban (4 sites), rural (23 sites), or remote (7 sites). The average TEQ concentrations over all sites and samples within these categories were: urban = 15.9 fg TEQ m-3, rural = 13.9 fg TEQ m-3, and remote = 1.2 fg TEQ m-3. Rural sites showed elevations during the fall or winter months when compared to the spring or summer months, and the same might be said for urban sites, but the remote sites appear to show little variation over time. The four highest individual moment measurements were 847, 292, 241, and 132 fg TEQ m-3. For the 847 and 292 fg TEQ m-3 samples, the concentrations of all congeners were elevated over their site averages, but for
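
    The TEQ figures quoted above are weighted sums of congener concentrations, each multiplied by its toxic equivalency factor (TEF). The sketch below uses a handful of WHO-2005-style TEFs and invented concentrations; a real calculation would use the full congener list and the official TEF tables.

      # toxic equivalency factors for a few congeners (WHO 2005 scheme; verify against the official tables)
      TEF = {
          "2,3,7,8-TCDD": 1.0,
          "1,2,3,7,8-PeCDD": 1.0,
          "OCDD": 0.0003,
          "2,3,7,8-TCDF": 0.1,
          "PCB-126": 0.1,
      }

      # invented congener concentrations for one air sample, fg/m^3
      sample = {
          "2,3,7,8-TCDD": 2.0,
          "1,2,3,7,8-PeCDD": 4.0,
          "OCDD": 900.0,
          "2,3,7,8-TCDF": 6.0,
          "PCB-126": 5.0,
      }

      teq = sum(conc * TEF[name] for name, conc in sample.items())
      print(f"total TEQ: {teq:.1f} fg TEQ per cubic metre")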

  11. It's not just average faces that are attractive: computer-manipulated averageness makes birds, fish, and automobiles attractive.

    PubMed

    Halberstadt, Jamin; Rhodes, Gillian

    2003-03-01

    Average faces are attractive. We sought to distinguish whether this preference is an adaptation for finding high-quality mates (the direct selection account) or whether it reflects more general information-processing mechanisms. In three experiments, we examined the attractiveness of birds, fish, and automobiles whose averageness had been manipulated using digital image manipulation techniques common in research on facial attractiveness. Both manipulated averageness and rated averageness were strongly associated with attractiveness in all three stimulus categories. In addition, for birds and fish, but not for automobiles, the correlation between subjective averageness and attractiveness remained significant when the effect of subjective familiarity was partialled out. The results suggest that at least two mechanisms contribute to the attractiveness of average exemplars. One is a general preference for familiar stimuli, which contributes to the appeal of averageness in all three categories. The other is a preference for averageness per se, which was found for birds and fish, but not for automobiles, and may reflect a preference for features signaling genetic quality in living organisms, including conspecifics.

  12. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study

    NASA Astrophysics Data System (ADS)

    Flintoft, I. D.; Robinson, M. P.; Melia, G. C. R.; Marvin, A. C.; Dawson, J. F.

    2014-07-01

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m2; for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human exposure

  13. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study.

    PubMed

    Flintoft, I D; Robinson, M P; Melia, G C R; Marvin, A C; Dawson, J F

    2014-07-07

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m(2); for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human

  14. National forest visitor spending averages and the influence of trip-type and recreation activity.

    Treesearch

    Eric M. White; Daniel I. Stynes

    2008-01-01

    Estimates of national forest recreation visitor spending serve as inputs to regional economic analyses and help to identify the economic linkages between national forest recreation use and local forest communities. When completing recreation-related analyses, managers, planners, and researchers frequently think of visitors in terms of recreation activity. When...

  15. Lack of a standardised UK care pathway resulting in national variations in management and outcomes of paediatric small area scalds.

    PubMed

    Trevatt, Alexander E J; Kirkham, Emily N; Allix, Bradley; Greenwood, Rosemary; Coy, Karen; Hollén, Linda I; Young, Amber E R

    2016-09-01

    There is a paucity of evidence guiding management of small area partial thickness paediatric scalds. This has prevented the development of national management guidelines for these injuries. This research aimed to investigate whether a lack of evidence for national guidelines has resulted in variations in both management and outcomes of paediatric small area scalds across England and Wales (E&W). A national survey of initial management of paediatric scalds ≤5% Total Body Surface Area (%TBSA) was sent to 14 burns services in E&W. Skin graft rates of anonymised burns services over seven years were collected from the international Burns Injury Database (iBID). Average skin grafting rates across services were compared. Length of stay and proportion of patients receiving general anaesthesia for dressing application at each service were also compared. All 14 burns services responded to the survey. Only 50% of services had a protocol in place for the management of small area burns. All protocols varied in how partial thickness paediatric scalds ≤5% TBSA should be managed. There was no consensus as to which scalds should be treated using biosynthetic dressings. Data from iBID for 11,917 patients showed that the average reported skin grafting rate across all burns services was 2.3% (95% CI 2.1, 2.6) but varied from 0.3% to 7.1% (P<0.001). Service provider remained associated with likelihood of skin grafting when variations in the %TBSA case mix seen by each service were controlled for (χ(2)=87.3, P<0.001). The use of general anaesthetics across services varied between 0.6% and 35.5% (P<0.001). The median length of stay across services varied from 1 to 3 days (P<0.001). A lack of evidence guiding management of small-area paediatric scalds has resulted in variation in management of these injuries across E&W. There is also significant variation in outcomes for these injuries. Further research is indicated to determine if care pathways and outcomes are linked. An evidence

  16. National Lakes Assessment 2007 Results

    EPA Pesticide Factsheets

    The National Lakes Assessment samples over 1,000 lakes, ponds and reservoirs across the country. Key findings from this assessment in 2007 include the biological condition and most widespread stressors of these waterbodies.

  17. General analytic results on averaging Lemaître-Tolman-Bondi models

    NASA Astrophysics Data System (ADS)

    Sussman, Roberto A.

    2010-12-01

    An effective acceleration, which mimics the effect of dark energy, may arise in the context of Buchert's scalar averaging formalism. We examine the conditions for such an acceleration to occur in the asymptotic radial range in generic spherically symmetric Lemaître-Tolman-Bondi (LTB) dust models. By looking at the behavior of covariant scalars along space slices orthogonal to the 4-velocity, we show that this effective acceleration occurs in a class of models with negative spatial curvature that are asymptotically convergent to sections of Minkowski spacetime. As a consequence, the boundary conditions that favor LTB models with an effective acceleration are not a void inhomogeneity embedded in a homogeneous FLRW background (Swiss cheese models), but a local void or clump embedded in a large cosmic void region represented by asymptotically Minkowski conditions.

  18. Averaging Models: Parameters Estimation with the R-Average Procedure

    ERIC Educational Resources Information Center

    Vidotto, G.; Massidda, D.; Noventa, S.

    2010-01-01

    The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…

  19. Estimation of average annual streamflows and power potentials for Alaska and Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdin, Kristine L.

    2004-05-01

    This paper describes the work done to develop average annual streamflow estimates and power potential for the states of Alaska and Hawaii. The Elevation Derivatives for National Applications (EDNA) database was used, along with climatic datasets, to develop flow and power estimates for every stream reach in the EDNA database. Estimates of average annual streamflows were derived using state-specific regression equations, which were functions of average annual precipitation, precipitation intensity, drainage area, and other elevation-derived parameters. Power potential was calculated through the use of the average annual streamflow and the hydraulic head of each reach, which is calculated from the EDNA digital elevation model. In all, estimates of streamflow and power potential were calculated for over 170,000 stream segments in the Alaskan and Hawaiian datasets.
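
    The power potential of a reach follows from the standard hydropower relation P = ρgQH applied to the estimated average annual streamflow and hydraulic head. A short sketch follows; the example flows, heads, and the optional efficiency factor are assumptions for illustration, not values from the dataset.

      RHO_WATER = 1000.0   # kg/m^3
      G = 9.81             # m/s^2

      def reach_power_kw(flow_m3s, head_m, efficiency=1.0):
          """Gross (or, with efficiency < 1, net) hydropower potential of one stream reach, in kilowatts."""
          return RHO_WATER * G * flow_m3s * head_m * efficiency / 1000.0

      # illustrative reaches: (average annual streamflow in m^3/s, hydraulic head in m)
      reaches = [(12.0, 8.0), (3.5, 25.0), (0.8, 60.0)]
      total_kw = sum(reach_power_kw(q, h) for q, h in reaches)
      print(f"total potential: {total_kw:.0f} kW")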

  20. Department of Transportation, National Highway Traffic Safety Administration : light truck average fuel economy standard, model year 1999

    DOT National Transportation Integrated Search

    1997-04-18

    Section 32902(a) of title 49, United States Code, requires the Secretary of Transportation to prescribe by regulation, at least 18 months in advance of each model year, average fuel economy standards (known as "Corporate Average Fuel Economy" or "CAF...

  1. Average volume of alcohol consumption and all-cause mortality in African Americans: the NHEFS cohort.

    PubMed

    Sempos, Christopher T; Rehm, Jürgen; Wu, Tiejian; Crespo, Carlos J; Trevisan, Maurizio

    2003-01-01

    To analyze the relationship between average volume of alcohol consumption and all-cause mortality in African Americans. Prospective cohort study--the NHANES Epidemiologic Follow-Up Study (NHEFS)--with baseline data collected 1971 through 1975 as part of the first National Health and Nutrition Examination Survey (NHANES I) and follow-up through 1992. The analytic data set consisted of 2054 African American men (n = 768) and women (n = 1,286), 25 to 75 years of age, who were followed for approximately 19 years. Alcohol was measured with a quantity-frequency measure at baseline. All-cause mortality. No J-shaped curve was found in the relationship between average volume of alcohol consumption and mortality for male or female African Americans. Instead, no beneficial effect appeared and mortality increased with increasing average consumption for more than one drink a day. The reason for not finding the J-shape in African Americans may be the result of the more detrimental drinking patterns in this ethnicity and consequently the lack of protective effects of alcohol on coronary heart disease. Taking into account sampling design did not substantially change the results from the models, which assumed a simple random sample. If this result can be confirmed in other samples, alcohol policy, especially prevention, should better incorporate patterns of drinking into programs.

  2. INVITED SESSION: THE 2011 NATIONAL WETLAND CONDITION ASSESSMENT: TECHNICAL UNDERPINNINGS AND RESULTS

    EPA Science Inventory

    The first-ever National Wetland Condition Assessment (NWCA) was conducted in 2011 by the US Environmental Protection Agency (USEPA) and its federal and state partners, using a survey design allowing extrapolation of results to national and regional scales. At each of 1138 locatio...

  3. Properties of model-averaged BMDLs: a study of model averaging in dichotomous response risk estimation.

    PubMed

    Wheeler, Matthew W; Bailer, A John

    2007-06-01

    Model averaging (MA) has been proposed as a method of accounting for model uncertainty in benchmark dose (BMD) estimation. The technique has been used to average BMD dose estimates derived from dichotomous dose-response experiments, microbial dose-response experiments, as well as observational epidemiological studies. While MA is a promising tool for the risk assessor, a previous study suggested that the simple strategy of averaging individual models' BMD lower limits did not yield interval estimators that met nominal coverage levels in certain situations, and this performance was very sensitive to the underlying model space chosen. We present a different, more computationally intensive, approach in which the BMD is estimated using the average dose-response model and the corresponding benchmark dose lower bound (BMDL) is computed by bootstrapping. This method is illustrated with TiO(2) dose-response rat lung cancer data, and then systematically studied through an extensive Monte Carlo simulation. The results of this study suggest that the MA-BMD, estimated using this technique, performs better, in terms of bias and coverage, than the previous MA methodology. Further, the MA-BMDL achieves nominal coverage in most cases, and is superior to picking the "best fitting model" when estimating the benchmark dose. Although these results show utility of MA for benchmark dose risk estimation, they continue to highlight the importance of choosing an adequate model space as well as proper model fit diagnostics.
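
    A minimal sketch of the approach described, not the authors' implementation: fit two dichotomous dose-response models, weight them by AIC, read the benchmark dose off the averaged curve, and bootstrap the response counts to obtain a lower bound. The quantal-linear and Weibull forms, the AIC-style weights, and the synthetic data are all illustrative assumptions.

      import numpy as np
      from scipy.optimize import minimize, brentq

      # synthetic dichotomous dose-response data: dose, animals per group, responders (illustrative)
      doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
      n     = np.array([50,  50,  50,  50,   50])
      y     = np.array([ 2,   4,   8,  20,   40])

      def p_qlinear(d, theta):   # quantal-linear: g + (1-g)(1 - exp(-b d))
          g, b = theta
          return g + (1.0 - g) * (1.0 - np.exp(-b * d))

      def p_weibull(d, theta):   # Weibull: g + (1-g)(1 - exp(-b d**k))
          g, b, k = theta
          return g + (1.0 - g) * (1.0 - np.exp(-b * d ** k))

      def neg_loglik(theta, model, d, n, y):
          p = np.clip(model(d, theta), 1e-9, 1 - 1e-9)
          return -np.sum(y * np.log(p) + (n - y) * np.log(1.0 - p))

      def fit(model, x0, bnds, d, n, y):
          res = minimize(neg_loglik, x0, args=(model, d, n, y), bounds=bnds, method="L-BFGS-B")
          aic = 2 * len(x0) + 2 * res.fun
          return res.x, aic

      def averaged_bmd(d, n, y, bmr=0.10):
          th1, aic1 = fit(p_qlinear, [0.05, 0.05], [(1e-6, 0.5), (1e-6, 5.0)], d, n, y)
          th2, aic2 = fit(p_weibull, [0.05, 0.05, 1.0], [(1e-6, 0.5), (1e-6, 5.0), (0.2, 5.0)], d, n, y)
          w = np.exp(-0.5 * (np.array([aic1, aic2]) - min(aic1, aic2)))   # AIC weights
          w /= w.sum()

          def extra_risk(dose):   # extra risk of the model-averaged curve
              p0 = w[0] * p_qlinear(0.0, th1) + w[1] * p_weibull(0.0, th2)
              pd = w[0] * p_qlinear(dose, th1) + w[1] * p_weibull(dose, th2)
              return (pd - p0) / (1.0 - p0)

          return brentq(lambda dose: extra_risk(dose) - bmr, 1e-6, d.max())

      rng = np.random.default_rng(2)
      bmd = averaged_bmd(doses, n, y)
      boot = [averaged_bmd(doses, n, rng.binomial(n, y / n)) for _ in range(200)]
      bmdl = np.percentile(boot, 5)   # lower bound from the bootstrap distribution
      print(f"model-averaged BMD: {bmd:.2f}, bootstrap BMDL: {bmdl:.2f}")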

  4. National Alliance of Business Sales Techniques and Results (STAR).

    ERIC Educational Resources Information Center

    Golightly, Steven J.

    This paper presents an overview of the Sales Techniques and Results (STAR) training program developed by the National Alliance of Business in conjunction with IBM. The STAR training program can be used to help vocational directors, teachers, and counselors to be better salespersons for cooperative education or job placement programs. The paper…

  5. Results of the National CT Colonography Trial: Questions and Answers

    Cancer.gov

    Learn the results of the National Computerized Tomographic Colonography (CTC) clinical trial, which evaluated how well CTC identifies participants with at least one significantly large polyp using colonoscopy as the gold (or reference) standard.

  6. Estimating Gestational Age With Sonography: Regression-Derived Formula Versus the Fetal Biometric Average.

    PubMed

    Cawyer, Chase R; Anderson, Sarah B; Szychowski, Jeff M; Neely, Cherry; Owen, John

    2018-03-01

    To compare the accuracy of a new regression-derived formula developed from the National Fetal Growth Studies data to the common alternative method that uses the average of the gestational ages (GAs) calculated for each fetal biometric measurement (biparietal diameter, head circumference, abdominal circumference, and femur length). This retrospective cross-sectional study identified nonanomalous singleton pregnancies that had a crown-rump length plus at least 1 additional sonographic examination with complete fetal biometric measurements. With the use of the crown-rump length to establish the referent estimated date of delivery, each method's error (National Institute of Child Health and Human Development regression versus Hadlock average [Radiology 1984; 152:497-501]) at every examination was computed. Error, defined as the difference between the crown-rump length-derived GA and each method's predicted GA (weeks), was compared in 3 GA intervals: 1 (14 weeks-20 weeks 6 days), 2 (21 weeks-28 weeks 6 days), and 3 (≥29 weeks). In addition, the proportion of each method's examinations that had errors outside prespecified (±) day ranges was computed by using odds ratios. A total of 16,904 sonograms were identified. The overall and prespecified GA range subset mean errors were significantly smaller for the regression compared to the average (P < .01), and the regression had significantly lower odds of observing examinations outside the specified range of error in GA intervals 2 (odds ratio, 1.15; 95% confidence interval, 1.01-1.31) and 3 (odds ratio, 1.24; 95% confidence interval, 1.17-1.32) than the average method. In a contemporary unselected population of women dated by a crown-rump length-derived GA, the National Institute of Child Health and Human Development regression formula produced fewer estimates outside a prespecified margin of error than the commonly used Hadlock average; the differences were most pronounced for GA estimates at 29 weeks and later.
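
    The "fetal biometric average" competitor is simply the mean of the gestational ages implied by the four measurements, compared against the crown-rump-length reference. A tiny sketch of how that comparison metric is formed; the per-measurement GA values are placeholders, not the Hadlock or NICHD formulas.

      from statistics import mean

      def composite_ga_weeks(ga_bpd, ga_hc, ga_ac, ga_fl):
          """Average-of-biometrics estimate: mean of the GAs implied by BPD, HC, AC and FL."""
          return mean([ga_bpd, ga_hc, ga_ac, ga_fl])

      def error_weeks(ga_reference, ga_estimate):
          """Signed error against the CRL-derived reference GA, in weeks."""
          return ga_estimate - ga_reference

      # placeholder per-measurement GA estimates for one examination (weeks)
      estimate = composite_ga_weeks(24.1, 24.6, 23.8, 24.3)
      print(round(estimate, 2), round(error_weeks(24.0, estimate), 2))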

  7. Breastfeeding in Mexico was stable, on average, but deteriorated among the poor, whereas complementary feeding improved: results from the 1999 to 2006 National Health and Nutrition Surveys.

    PubMed

    González de Cossío, Teresita; Escobar-Zaragoza, Leticia; González-Castell, Dinorah; Reyes-Vázquez, Horacio; Rivera-Dommarco, Juan A

    2013-05-01

    We present: 1) indicators of infant and young child feeding practices (IYCFP) and median age of introduction of foods analyzed by geographic and socioeconomic variables for the 2006 national probabilistic Health Nutrition Survey (ENSANUT-2006); and 2) changes in IYCFP indicators between the 1999 national probabilistic Nutrition Survey and ENSANUT-2006, analyzed by the same variables. Participants were women 12-49 y and their <2-y-old children (2953 in 2006 and 3191 in 1999). Indicators were estimated with the status quo method. The median age of introduction of foods was calculated by the Kaplan-Meier method using recall data. The national median duration of breastfeeding was similar in both surveys, 9.7 mo in 1999 and 10.4 mo in 2006, but decreased in the vulnerable population. In 1999 indigenous women breastfed 20.8 mo but did so for only 13.0 mo in 2006. The national percentage of those exclusively breastfeeding <6 mo also remained stable: 20% in 1999 and 22.3% in 2006. Nevertheless, exclusively breastfeeding <6 mo changed within the indigenous population, from 46% in 1999 to 34.5% in 2006. Between surveys, most breastfeeding indicators had lower values in vulnerable populations than in those better-off. Complementary feeding, however, improved overall. Complementary feeding was inadequately timed: median age of introduction of plain water was 3 mo, formula and non-human milk was 5 mo, and cereals, legumes, and animal foods was 5 mo. Late introduction of animal foods occurred among vulnerable indigenous population when 50% consumed these products at 8 mo. Mexican IYCFP indicate that public policy must protect breastfeeding while promoting the timely introduction of complementary feeding.

  8. Results from the Fourth Mathematics Assessment of the National Assessment of Educational Progress.

    ERIC Educational Resources Information Center

    Lindquist, Mary Montgomery, Ed.

    The National Assessment of Educational Progress (NAEP) completed its fourth mathematics assessment during the 1985-86 school year and finished the analyses of the results in 1988. This monograph, prepared by an interpretive team of the National Council of Teachers of Mathematics, represents a comprehensive discussion of the results of the fourth…

  9. NREL: News - Nationally Renowned Architect Panel Announces Judging Results

    Science.gov Websites

    Solar Decathlon, Sunday, September 29, 2002: Design, Livability Results Important to Competing University Teams. One university team had taken first place in the Design and Livability contest at the Solar Village on the National Mall, with the University of Texas at Austin third. Design and Livability is one of ten contests in the Solar Decathlon, which runs

  10. Is cepstrum averaging applicable to circularly polarized electric-field data?

    NASA Astrophysics Data System (ADS)

    Tunnell, T.

    1990-04-01

    In FY 1988 a cepstrum averaging technique was developed to eliminate the ground reflections from charged particle beam (CPB) electromagnetic pulse (EMP) data. The work was done for the Los Alamos National Laboratory Project DEWPOINT at SST-7. The technique averages the cepstra of horizontally and vertically polarized electric field data (i.e., linearly polarized electric field data). This cepstrum averaging technique was programmed into the FORTRAN codes CEP and CEPSIM. Steve Knox, the principal investigator for Project DEWPOINT, asked the authors to determine if the cepstrum averaging technique could be applied to circularly polarized electric field data. The answer is, Yes, but some modifications may be necessary. There are two aspects to this answer that we need to address, namely, the Yes and the modifications. First, regarding the Yes, the technique is applicable to elliptically polarized electric field data in general: circular polarization is a special case of elliptical polarization. Secondly, regarding the modifications, greater care may be required in computing the phase in the calculation of the complex logarithm. The calculation of the complex logarithm is the most critical step in cepstrum-based analysis. This memorandum documents these findings.
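
    A hedged sketch of cepstrum averaging for two field components, written as a generic complex-cepstrum computation in NumPy with phase unwrapping for the complex logarithm (the step the memo flags as critical). It is not the CEP/CEPSIM code, and the pulse, reflection delay, and gains are invented.

      import numpy as np

      def complex_cepstrum(x):
          """Complex cepstrum: inverse FFT of the complex logarithm of the spectrum.
          The phase is unwrapped before taking the logarithm of the spectrum."""
          spectrum = np.fft.fft(x)
          log_mag = np.log(np.abs(spectrum) + 1e-12)
          phase = np.unwrap(np.angle(spectrum))
          return np.real(np.fft.ifft(log_mag + 1j * phase))

      # two synthetic field components: a pulse plus a delayed, attenuated ground reflection
      t = np.arange(1024)
      pulse = np.exp(-0.01 * t) * np.sin(0.2 * t)
      delay = 120

      def with_reflection(signal, gain):
          out = signal.copy()
          out[delay:] += gain * signal[:-delay]
          return out

      e_horizontal = with_reflection(pulse, 0.6)
      e_vertical   = with_reflection(pulse, 0.4)

      # cepstrum averaging: the reflection shows up as a spike near the delay in both cepstra
      avg_cepstrum = 0.5 * (complex_cepstrum(e_horizontal) + complex_cepstrum(e_vertical))
      print("largest late-time cepstral peak at sample:", int(np.argmax(np.abs(avg_cepstrum[50:512])) + 50))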

  11. Writing: National Results--Writing Mechanics.

    ERIC Educational Resources Information Center

    Education Commission of the States, Denver, CO.

    This is the third National Assessment report on the writing of children aged 9, 13, and 17, and young adults. The three exercises used in the writing assessment were: Age 9: The Forest Fire Exercise; Age 13: The Famous Person Exercise; and Age 17: The Famous Person Exercise. An exercise for young adults (Adults: The Commissioner Stroud Letter)…

  12. Trait valence and the better-than-average effect.

    PubMed

    Gold, Ron S; Brown, Mark G

    2011-12-01

    People tend to regard themselves as having superior personality traits compared to their average peer. To test whether this "better-than-average effect" varies with trait valence, participants (N = 154 students) rated both themselves and the average student on traits constituting either positive or negative poles of five trait dimensions. In each case, the better-than-average effect was found, but trait valence had no effect. Results were discussed in terms of Kahneman and Tversky's prospect theory.

  13. Bull Market Helped Endowments Earn Average of 17.2% in 1996.

    ERIC Educational Resources Information Center

    Nicklin, Julie L.

    1997-01-01

    The National Association of College and University Business Officers' annual survey of 1996 college endowment performance found the rate of return up 15.5% from the previous year, the best since 1986. The average institution had 51.6% of endowment in domestic stocks, 25.5% in domestic fixed-income investments, 9.5% in foreign stock, 5.4% in cash…

  14. Average spectral efficiency analysis of FSO links over turbulence channel with adaptive transmissions and aperture averaging

    NASA Astrophysics Data System (ADS)

    Aarthi, G.; Ramachandra Reddy, G.

    2018-03-01

    In our paper, the impact of adaptive transmission schemes: (i) optimal rate adaptation (ORA) and (ii) channel inversion with fixed rate (CIFR) on the average spectral efficiency (ASE) is explored for free-space optical (FSO) communications with On-Off Keying (OOK), Polarization shift keying (POLSK), and Coherent optical wireless communication (Coherent OWC) systems under different turbulence regimes. Further, to enhance the ASE we have incorporated aperture averaging effects along with the above adaptive schemes. The results indicate that the ORA adaptation scheme has the advantage of improving the ASE performance compared with CIFR under moderate and strong turbulence regimes. The coherent OWC system with ORA excels the other modulation schemes and could achieve an ASE performance of 49.8 bits/s/Hz at the average transmitted optical power of 6 dBm under strong turbulence. By adding the aperture averaging effect we could achieve an ASE of 50.5 bits/s/Hz under the same conditions. This makes ORA with Coherent OWC modulation a favorable candidate for improving the ASE of the FSO communication system.

  15. Lagrangian averages, averaged Lagrangians, and the mean effects of fluctuations in fluid dynamics.

    PubMed

    Holm, Darryl D.

    2002-06-01

    We begin by placing the generalized Lagrangian mean (GLM) equations for a compressible adiabatic fluid into the Euler-Poincare (EP) variational framework of fluid dynamics, for an averaged Lagrangian. This is the Lagrangian averaged Euler-Poincare (LAEP) theorem. Next, we derive a set of approximate small amplitude GLM equations (glm equations) at second order in the fluctuating displacement of a Lagrangian trajectory from its mean position. These equations express the linear and nonlinear back-reaction effects on the Eulerian mean fluid quantities by the fluctuating displacements of the Lagrangian trajectories in terms of their Eulerian second moments. The derivation of the glm equations uses the linearized relations between Eulerian and Lagrangian fluctuations, in the tradition of Lagrangian stability analysis for fluids. The glm derivation also uses the method of averaged Lagrangians, in the tradition of wave, mean flow interaction. Next, the new glm EP motion equations for incompressible ideal fluids are compared with the Euler-alpha turbulence closure equations. An alpha model is a GLM (or glm) fluid theory with a Taylor hypothesis closure. Such closures are based on the linearized fluctuation relations that determine the dynamics of the Lagrangian statistical quantities in the Euler-alpha equations. Thus, by using the LAEP theorem, we bridge between the GLM equations and the Euler-alpha closure equations, through the small-amplitude glm approximation in the EP variational framework. We conclude by highlighting a new application of the GLM, glm, and alpha-model results for Lagrangian averaged ideal magnetohydrodynamics. (c) 2002 American Institute of Physics.

  16. Does Nationality Matter in the B2C Environment? Results from a Two Nation Study

    NASA Astrophysics Data System (ADS)

    Peikari, Hamid Reza

    Different studies have explored the relations between different dimensions of e-commerce transactions, and many models and findings have been proposed to the academic and business worlds. However, there is doubt about the application and generalization of such models and findings across different countries and nations. In other words, this study argues that the relations among the variables of a model may differ across countries, which raises questions about the findings of researchers who collect data in one country to test their hypotheses. This study intends to examine whether different nations have different perceptions of the elements of Website interface, security and purchase intention on the Internet. Moreover, a simple model was developed to investigate whether the independent variables of the model are equally important in different nations and significantly influence the dependent variable in those nations. Since the majority of studies in the context of e-commerce have focused on developed countries with high e-readiness indices and overall ranks, two developing countries with different e-readiness indices and ranks were selected for the data collection. The results showed that the samples had significantly different perceptions of security and of some of the Website interface factors. Moreover, it was found that the significance of the relations between the independent variables and the dependent variable differs between the samples, which calls into question the findings of researchers testing their models and hypotheses only on data collected in one country.

  17. 47 CFR 1.959 - Computation of average terrain elevation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Section 1.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Wireless..., average terrain elevation must be calculated by computer using elevations from a 30 second point or better..., if the results differ significantly from the computer derived averages. (a) Radial average terrain...

  18. 47 CFR 1.959 - Computation of average terrain elevation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Section 1.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Wireless..., average terrain elevation must be calculated by computer using elevations from a 30 second point or better..., if the results differ significantly from the computer derived averages. (a) Radial average terrain...

  19. Modern average global sea-surface temperature

    USGS Publications Warehouse

    Schweitzer, Peter N.

    1993-01-01

    The data contained in this data set are derived from the NOAA Advanced Very High Resolution Radiometer Multichannel Sea Surface Temperature data (AVHRR MCSST), which are obtainable from the Distributed Active Archive Center at the Jet Propulsion Laboratory (JPL) in Pasadena, Calif. The JPL tapes contain weekly images of SST from October 1981 through December 1990 in nine regions of the world ocean: North Atlantic, Eastern North Atlantic, South Atlantic, Agulhas, Indian, Southeast Pacific, Southwest Pacific, Northeast Pacific, and Northwest Pacific. This data set represents the results of calculations carried out on the NOAA data and also contains the source code of the programs that made the calculations. The objective was to derive the average sea-surface temperature of each month and week throughout the whole 10-year series, meaning, for example, that data from January of each year would be averaged together. The result is 12 monthly and 52 weekly images for each of the oceanic regions. Averaging the images in this way tends to reduce the number of grid cells that lack valid data and to suppress interannual variability.
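
    The month-of-year averaging described here (all Januaries averaged together, and so on) is a reduction over a stack of weekly grids; a small sketch using NaN for cells without valid data. Array shapes and the synthetic grids are illustrative, not the AVHRR MCSST data layout.

      import numpy as np

      def monthly_climatology(weekly_sst, week_months):
          """Average weekly SST grids by calendar month across all years.

          weekly_sst  : array of shape (n_weeks, ny, nx), NaN where a cell has no valid data
          week_months : length n_weeks array of month numbers 1..12 for each weekly image
          Returns an array of shape (12, ny, nx); averaging across years suppresses
          interannual variability and fills many cells that single weeks leave empty.
          """
          out = np.full((12,) + weekly_sst.shape[1:], np.nan)
          for month in range(1, 13):
              sel = weekly_sst[week_months == month]
              if len(sel):
                  out[month - 1] = np.nanmean(sel, axis=0)
          return out

      # toy example: 10 years x 52 weeks of 4x4 grids with some missing cells
      rng = np.random.default_rng(3)
      weeks = 520
      sst = 15.0 + 5.0 * rng.standard_normal((weeks, 4, 4))
      sst[rng.random(sst.shape) < 0.2] = np.nan          # ~20% missing data
      months = np.clip(((np.arange(weeks) % 52) // 4.34).astype(int) + 1, 1, 12)
      climatology = monthly_climatology(sst, months)
      print(climatology.shape)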

  20. Reproducing multi-model ensemble average with Ensemble-averaged Reconstructed Forcings (ERF) in regional climate modeling

    NASA Astrophysics Data System (ADS)

    Erfanian, A.; Fomenko, L.; Wang, G.

    2016-12-01

    Multi-model ensemble (MME) average is considered the most reliable for simulating both present-day and future climates. It has been a primary reference for making conclusions in major coordinated studies, i.e., IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes with tremendous computational cost, which is especially inhibiting for regional climate modeling as model uncertainties can originate from both RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling that achieves a similar level of bias reduction at a fraction of the cost compared with the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs to conduct a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost. The ERF output is an unaltered solution of the RCM as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions with the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: Multi-model ensemble, ensemble analysis, ERF, regional climate modeling
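
    The core of the ERF idea is to average the driving fields across GCMs before the regional model sees them, so only one RCM integration is needed. A schematic sketch follows; the variable names, grid, and values are placeholders, and real IBCs would come from each GCM's initial and lateral-boundary files regridded to a common grid.

      import numpy as np

      def ensemble_reconstructed_forcing(gcm_fields):
          """Average each boundary/initial-condition variable across GCMs.

          gcm_fields : list of dicts, one per GCM, mapping variable name -> ndarray
          Assumes the fields are already on a common grid. Returns one dict of
          ensemble-averaged fields to drive a single RCM run, instead of one RCM
          run per GCM as in a conventional multi-model ensemble.
          """
          names = gcm_fields[0].keys()
          return {name: np.mean([g[name] for g in gcm_fields], axis=0) for name in names}

      # placeholder IBCs from three GCMs: temperature and zonal wind on a small grid
      rng = np.random.default_rng(4)
      gcms = [{"ta": 280 + rng.standard_normal((10, 10)),
               "ua": 5 + rng.standard_normal((10, 10))} for _ in range(3)]
      erf_ibc = ensemble_reconstructed_forcing(gcms)
      print(sorted(erf_ibc), erf_ibc["ta"].shape)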

  1. Compensatory Education: An Analysis of National Versus State and Local Evaluations.

    ERIC Educational Resources Information Center

    Stickney, Benjamin D.

    Discrepancies between the pessimistic national evaluations and the more encouraging state and local evaluations of compensatory education programs are explored in this paper. A conventional explanation for the discrepancies, namely, that the averaging of test scores from a large national sample "cancels out" the results of successful…

  2. Books average previous decade of economic misery.

    PubMed

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20(th) century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
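
    The key computation is a trailing decade average of the economic misery index correlated against a yearly literary misery series. A compact sketch with an 11-year window, matching the reported best fit; the two series below are random placeholders, not the published indices.

      import numpy as np

      def trailing_average(series, window=11):
          """Moving average over the previous `window` years: output value t
          summarizes the window of years ending at year t."""
          kernel = np.ones(window) / window
          return np.convolve(series, kernel, mode="valid")

      rng = np.random.default_rng(5)
      years = np.arange(1930, 2001)
      economic_misery = rng.normal(10.0, 3.0, len(years))   # inflation + unemployment, synthetic
      literary_misery = rng.normal(0.0, 1.0, len(years))    # synthetic index derived from book text

      window = 11
      smoothed = trailing_average(economic_misery, window)
      aligned_literary = literary_misery[window - 1:]       # literary value for the year ending each window
      r = np.corrcoef(smoothed, aligned_literary)[0, 1]
      print(f"correlation between literary misery and prior-decade economic misery: {r:.2f}")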

  3. Thomson scattering in the average-atom approximation.

    PubMed

    Johnson, W R; Nilsen, J; Cheng, K T

    2012-09-01

    The average-atom model is applied to study Thomson scattering of x-rays from warm dense matter with emphasis on scattering by bound electrons. Parameters needed to evaluate the dynamic structure function (chemical potential, average ionic charge, free electron density, bound and continuum wave functions, and occupation numbers) are obtained from the average-atom model. The resulting analysis provides a relatively simple diagnostic for use in connection with x-ray scattering measurements. Applications are given to dense hydrogen, beryllium, aluminum, and titanium plasmas. In the case of titanium, bound states are predicted to modify the spectrum significantly.

  4. Scalable Robust Principal Component Analysis Using Grassmann Averages.

    PubMed

    Hauberg, Søren; Feragen, Aasa; Enficiaud, Raffi; Black, Michael J

    2016-11-01

    In large datasets, manual data verification is impossible, and we must expect the number of outliers to increase with data size. While principal component analysis (PCA) can reduce data size, and scalable solutions exist, it is well-known that outliers can arbitrarily corrupt the results. Unfortunately, state-of-the-art approaches for robust PCA are not scalable. We note that in a zero-mean dataset, each observation spans a one-dimensional subspace, giving a point on the Grassmann manifold. We show that the average subspace corresponds to the leading principal component for Gaussian data. We provide a simple algorithm for computing this Grassmann Average (GA), and show that the subspace estimate is less sensitive to outliers than PCA for general distributions. Because averages can be efficiently computed, we immediately gain scalability. We exploit robust averaging to formulate the Robust Grassmann Average (RGA) as a form of robust PCA. The resulting Trimmed Grassmann Average (TGA) is appropriate for computer vision because it is robust to pixel outliers. The algorithm has linear computational complexity and minimal memory requirements. We demonstrate TGA for background modeling, video restoration, and shadow removal. We show scalability by performing robust PCA on the entire Star Wars IV movie, a task beyond any current method. Source code is available online.
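
    A minimal sketch of the sign-alignment iteration behind the Grassmann Average, assuming zero-mean data; the trimming that yields TGA and the full convergence machinery of the paper are simplified away here.

      import numpy as np

      def grassmann_average(X, n_iter=100, seed=0):
          """Leading 'average subspace' of zero-mean observations X (n_samples x n_features).

          Each observation spans a 1-D subspace; the average is found by repeatedly
          flipping observation signs to align with the current estimate, averaging,
          and renormalizing. For well-behaved Gaussian data this tracks the leading
          principal component; a robust variant would trim the averaging step.
          """
          rng = np.random.default_rng(seed)
          q = rng.standard_normal(X.shape[1])
          q /= np.linalg.norm(q)
          weights = np.linalg.norm(X, axis=1)              # subspace weights = observation lengths
          U = X / np.maximum(weights[:, None], 1e-12)      # unit directions
          for _ in range(n_iter):
              signs = np.sign(U @ q)
              signs[signs == 0] = 1.0
              q_new = (weights * signs) @ U / weights.sum()  # weighted average of aligned directions
              q_new /= np.linalg.norm(q_new)
              if np.allclose(q_new, q):
                  break
              q = q_new
          return q

      # toy check against ordinary PCA on well-behaved data
      rng = np.random.default_rng(6)
      X = rng.standard_normal((2000, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])
      X -= X.mean(axis=0)
      q = grassmann_average(X)
      pc1 = np.linalg.svd(X, full_matrices=False)[2][0]
      print("alignment with PCA's first component:", round(abs(q @ pc1), 3))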

  5. Beliefs about Obedience Levels in Studies Conducted within the Milgram Paradigm: Better than Average Effect and Comparisons of Typical Behaviors by Residents of Various Nations.

    PubMed

    Grzyb, Tomasz; Dolinski, Dariusz

    2017-01-01

    The article presents studies examining whether the better than average (BTA) effect appears in opinions regarding obedience of individuals participating in an experiment conducted in the Milgram paradigm. Participants are presented with a detailed description of the experiment, asked to declare at what moment an average participant would cease their participation in the study, and then asked to declare at what moment they themselves would quit the experiment. It turned out that the participants demonstrated a strong BTA effect. This effect also concerned those who had known the results of the Milgram experiment prior to the study. Interestingly, those individuals-in contrast to naive participants-judged that the average person would remain obedient for longer, but at the same time prior familiarity with the Milgram experiment did not impact convictions as to own obedience. By the same token, the BTA effect size was larger among those who had previously heard of the Milgram experiment than those who had not. Additionally, study participants were asked to estimate the behavior of the average resident of their country (Poland), as well as of average residents of several other European countries. It turned out that in participants' judgment the average Pole would withdraw from the experiment quicker than the average Russian and average German, but later than average residents of France and England.

  6. Beliefs about Obedience Levels in Studies Conducted within the Milgram Paradigm: Better than Average Effect and Comparisons of Typical Behaviors by Residents of Various Nations

    PubMed Central

    Grzyb, Tomasz; Dolinski, Dariusz

    2017-01-01

    The article presents studies examining whether the better than average (BTA) effect appears in opinions regarding obedience of individuals participating in an experiment conducted in the Milgram paradigm. Participants are presented with a detailed description of the experiment, asked to declare at what moment an average participant would cease their participation in the study, and then asked to declare at what moment they themselves would quit the experiment. It turned out that the participants demonstrated a strong BTA effect. This effect also concerned those who had known the results of the Milgram experiment prior to the study. Interestingly, those individuals—in contrast to naive participants—judged that the average person would remain obedient for longer, but at the same time prior familiarity with the Milgram experiment did not impact convictions as to own obedience. By the same token, the BTA effect size was larger among those who had previously heard of the Milgram experiment than those who had not. Additionally, study participants were asked to estimate the behavior of the average resident of their country (Poland), as well as of average residents of several other European countries. It turned out that in participants’ judgment the average Pole would withdraw from the experiment quicker than the average Russian and average German, but later than average residents of France and England. PMID:28979232

  7. The Nation's Report Card Reading 2011 State Snapshot Report. Oregon. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  8. The Nation's Report Card Reading 2011 State Snapshot Report. Oregon. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  9. The Nation's Report Card Mathematics 2011 State Snapshot Report. Oregon. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  10. The Nation's Report Card Mathematics 2011 State Snapshot Report. Oregon. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  11. The Nation's Report Card Reading 2011 State Snapshot Report. Minnesota. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  12. The Nation's Report Card Mathematics 2011 State Snapshot Report. Minnesota. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  13. The Nation's Report Card Mathematics 2011 State Snapshot Report. Minnesota. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  14. The Nation's Report Card Reading 2011 State Snapshot Report. Minnesota. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  15. The Nation's Report Card Reading 2011 State Snapshot Report. Hawaii. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  16. The Nation's Report Card Reading 2009 State Snapshot Report. Hawaii. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  17. The Nation's Report Card Mathematics 2011 State Snapshot Report. Hawaii. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  18. The Nation's Report Card Reading 2009 State Snapshot Report. Hawaii. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  19. The Nation's Report Card Reading 2011 State Snapshot Report. Hawaii. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  20. The Nation's Report Card Mathematics 2011 State Snapshot Report. Hawaii. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  1. The Nation's Report Card Reading 2009 State Snapshot Report. Idaho. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  2. The Nation's Report Card Mathematics 2011 State Snapshot Report. Idaho. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  3. The Nation's Report Card Reading 2009 State Snapshot Report. Idaho. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  4. The Nation's Report Card Reading 2011 State Snapshot Report. Idaho. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  5. The Nation's Report Card Reading 2011 State Snapshot Report. Idaho. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  6. The Nation's Report Card Mathematics 2011 State Snapshot Report. Idaho. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  7. Tennessee advanced practice nurse compensation survey results 2006-2007.

    PubMed

    Arnold, Kimberly

    2007-01-01

    In 2006, representatives from Middle Tennessee Advanced Practice Nurses (MTAPN), Greater Memphis Area Advanced Practice Nurses (GMAAPN), and Northeast Tennessee Nurse Practitioners Association (NETNPA) decided to poll APNs in Tennessee to compare data with the most recent results from the Advance for Nurse Practitioners national NP survey. Every other year, Advance for Nurse Practitioners publishes its salary survey results. Most recently, in January 2006, an average nationwide salary for all APNs was reported at $74,812, with Tennessee's average at $71,068.

  8. Optimal Budget Allocation for Sample Average Approximation

    DTIC Science & Technology

    2011-06-01

    an optimization algorithm applied to the sample average problem. We examine the convergence rate of the estimator as the computing budget tends to...regime for the optimization algorithm. Sample average approximation (SAA) is a frequently used approach to solving stochastic programs...appealing due to its simplicity and the fact that a large number of standard optimization algorithms are often available to optimize the resulting sample
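
    As a concrete illustration of the technique named in this record (not of the report's budget-allocation analysis), the following minimal Python sketch applies sample average approximation to a toy stochastic program; the gamma "demand" model, the objective, and the sample sizes are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal sample average approximation (SAA) sketch on a toy stochastic program:
    # minimize E[(x - xi)^2] over x, where xi is a random "demand". The exact
    # minimizer is E[xi]; SAA replaces the expectation with a sample mean, and the
    # SAA solution converges to the true optimum as the sample size grows.

    rng = np.random.default_rng(0)


    def saa_solution(n_samples: int) -> float:
        """Solve the SAA problem built from n_samples draws of xi."""
        xi = rng.gamma(shape=2.0, scale=5.0, size=n_samples)  # hypothetical demand model
        # The sample-average objective (1/n) * sum_i (x - xi_i)^2 is minimized at mean(xi).
        return float(xi.mean())


    true_optimum = 2.0 * 5.0  # E[xi] for the gamma(2, 5) model above
    for n in (10, 100, 1_000, 10_000):
        x_hat = saa_solution(n)
        print(f"n={n:>6}  SAA solution={x_hat:7.3f}  error={abs(x_hat - true_optimum):.3f}")
    ```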

  9. Books Average Previous Decade of Economic Misery

    PubMed Central

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159
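
    The trailing moving-average construction described in this abstract is easy to reproduce; the sketch below uses synthetic inflation, unemployment, and literary-index series (not the study's data) to show how an 11-year trailing average of an economic misery index would be correlated against another series.

    ```python
    import numpy as np

    # Sketch of the trailing moving-average comparison described above, on synthetic
    # series (not the study's data): misery[t] = inflation[t] + unemployment[t],
    # and a literary index is compared with an 11-year trailing average of it.

    rng = np.random.default_rng(1)
    years = np.arange(1930, 2001)
    inflation = rng.normal(3.0, 2.0, size=years.size)
    unemployment = rng.normal(6.0, 2.0, size=years.size)
    economic_misery = inflation + unemployment

    window = 11  # trailing window length in years
    trailing_avg = np.convolve(economic_misery, np.ones(window) / window, mode="valid")

    # Hypothetical literary misery index aligned with the end of each trailing window.
    literary_misery = trailing_avg + rng.normal(0.0, 1.0, size=trailing_avg.size)

    r = np.corrcoef(trailing_avg, literary_misery)[0, 1]
    print(f"correlation with an {window}-year trailing average: r = {r:.2f}")
    ```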

  10. 49 CFR 537.9 - Determination of fuel economy values and average fuel economy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Determination of fuel economy values and average fuel economy. 537.9 Section 537.9 Transportation Other Regulations Relating to Transportation (Continued) NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AUTOMOTIVE FUEL ECONOMY REPORTS § 537.9 Determination of fuel...

  11. Average luminosity distance in inhomogeneous universes

    NASA Astrophysics Data System (ADS)

    Kostov, Valentin Angelov

    Using numerical ray tracing, the paper studies how the average distance modulus in an inhomogeneous universe differs from its homogeneous counterpart. The averaging is over all directions from a fixed observer not over all possible observers (cosmic), thus it is more directly applicable to our observations. Unlike previous studies, the averaging is exact, non-perturbative, an includes all possible non-linear effects. The inhomogeneous universes are represented by Sweese-cheese models containing random and simple cubic lattices of mass- compensated voids. The Earth observer is in the homogeneous cheese which has an Einstein - de Sitter metric. For the first time, the averaging is widened to include the supernovas inside the voids by assuming the probability for supernova emission from any comoving volume is proportional to the rest mass in it. For voids aligned in a certain direction, there is a cumulative gravitational lensing correction to the distance modulus that increases with redshift. That correction is present even for small voids and depends on the density contrast of the voids, not on their radius. Averaging over all directions destroys the cumulative correction even in a non-randomized simple cubic lattice of voids. Despite the well known argument for photon flux conservation, the average distance modulus correction at low redshifts is not zero due to the peculiar velocities. A formula for the maximum possible average correction as a function of redshift is derived and shown to be in excellent agreement with the numerical results. The formula applies to voids of any size that: (1) have approximately constant densities in their interior and walls, (2) are not in a deep nonlinear regime. The actual average correction calculated in random and simple cubic void lattices is severely damped below the predicted maximum. That is traced to cancelations between the corrections coming from the fronts and backs of different voids at the same redshift from the

  12. National Assessment of Educational Progress, Report 1--Science: National Results. Observations and Commentary of a Panel of Reviewers.

    ERIC Educational Resources Information Center

    National Assessment of Educational Progress, Ann Arbor, MI.

    Presented are five reviews of the National Assessment of Educational Progress results in science. Dr. Mildred Ballou discusses the objectives of the assessment by age level with concern over explanations for responses, social implications, and validity of testing exercises. Wilmer Cooksey comments on the results as viewed by the classroom teacher…

  13. Calculating High Speed Centrifugal Compressor Performance from Averaged Measurements

    NASA Astrophysics Data System (ADS)

    Lou, Fangyuan; Fleming, Ryan; Key, Nicole L.

    2012-12-01

    To improve the understanding of high performance centrifugal compressors found in modern aircraft engines, the aerodynamics through these machines must be experimentally studied. To accurately capture the complex flow phenomena through these devices, research facilities that can accurately simulate these flows are necessary. One such facility has been recently developed, and it is used in this paper to explore the effects of averaging total pressure and total temperature measurements to calculate compressor performance. Different averaging techniques (including area averaging, mass averaging, and work averaging) have been applied to the data. Results show that there is a negligible difference in both the calculated total pressure ratio and efficiency for the different techniques employed. However, the uncertainty in the performance parameters calculated with the different averaging techniques is significantly different, with area averaging providing the least uncertainty.
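
    The difference between the averaging techniques mentioned in this abstract can be illustrated with a short calculation; the probe pressures, areas, and mass fluxes below are invented values, not data from the cited facility.

    ```python
    import numpy as np

    # Area- vs mass-averaged total pressure from a radial rake, with invented
    # probe values (not data from the cited facility).

    p_total = np.array([201.0, 203.5, 204.2, 202.8, 200.1])    # total pressure per probe, kPa
    area = np.array([0.8, 1.0, 1.2, 1.0, 0.8])                  # annulus area per probe, m^2
    mass_flux = np.array([11.0, 13.5, 14.0, 13.0, 10.5])         # local rho*V, kg/(m^2 s)

    area_avg = np.sum(p_total * area) / np.sum(area)
    mass_avg = np.sum(p_total * mass_flux * area) / np.sum(mass_flux * area)

    print(f"area-averaged Pt: {area_avg:.2f} kPa")
    print(f"mass-averaged Pt: {mass_avg:.2f} kPa")
    ```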

  14. 2013-2014 National Roadside Study of alcohol and drug use by drivers : alcohol results.

    DOT National Transportation Integrated Search

    2016-12-01

    This report describes the alcohol results from the 2013-2014 National Roadside Survey (NRS), a national field study to estimate the prevalence of alcohol-, drug-, and alcohol-plus-drug-involved driving, primarily among nighttime weekend drivers...

  15. The national forest inventory in China: History, results, international context

    Treesearch

    WeiSheng Zeng; Erkki Tomppo; Sean P. Healey; Klaus V. Gadow

    2015-01-01

    Main results and important changes in China’s NFI are documented, both to support continued trend analysis and to provide data users with historical perspective. New technologies and data needs ensure that the Chinese NFI, like the national inventories in other countries, will continue to evolve. Within the context of historical change and current conditions, likely...

  16. Financial performance among adult day centers: results of a national demonstration program.

    PubMed

    Reifler, B V; Henry, R S; Rushing, J; Yates, M K; Cox, N J; Bradham, D D; McFarlane, M

    1997-02-01

    This paper describes the financial performance (defined as percent of total expenses covered by net operating revenue) of 16 adult day centers participating in a national demonstration program on day services for people with dementia, including examination of possible predictors of financial performance. Participating sites submitted quarterly financial and utilization reports to the National Program Office. Descriptive statistics summarize the factors believed to influence financial performance. Sites averaged meeting 35% of expenses from self-pay and 29% from government (mainly Medicaid) revenue, totaling 64% of all (cash plus in-kind) expenses met by operating revenue. Examination of center characteristics suggests that factors related to meeting consumer needs, such as being open a full day (i.e., 7:30 am to 6:00 pm) rather than shorter hours, and providing transportation, may be related to improved utilization and, thus, improved financial performance. Higher fees were not related to lower enrollment, census, or revenue. Adult day centers are able to achieve financial viability through a combination of operating (i.e., fee-for-service) and non-operating revenue. Operating revenue is enhanced by placing emphasis on consumer responsiveness, such as being open a full day. Because higher fees were not related to lower utilization, centers should set fees to reflect actual costs. The figure of 64% of expenses met by operating revenue is conservative inasmuch as sites included in-kind revenue as expenses in their budgeting calculations, and percent of cash expenses met by operating revenue would be higher (approximately 75% for this group of centers).

  17. The Nation's Report Card Mathematics 2011 State Snapshot Report. West Virginia. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  18. The Nation's Report Card Reading 2011 State Snapshot Report. West Virginia. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  19. The Nation's Report Card Reading 2009 State Snapshot Report. West Virginia. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  20. The Nation's Report Card Mathematics 2011 State Snapshot Report. West Virginia. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  1. The Nation's Report Card Reading 2009 State Snapshot Report. West Virginia. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  2. The Nation's Report Card Reading 2011 State Snapshot Report. West Virginia. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  3. The Nation's Report Card Reading 2011 State Snapshot Report. New Hampshire. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  4. The Nation's Report Card Reading 2011 State Snapshot Report. New Hampshire. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  5. The Nation's Report Card Mathematics 2011 State Snapshot Report. New Hampshire. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  6. The Nation's Report Card Reading 2009 State Snapshot Report. New Hampshire. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  7. The Nation's Report Card Mathematics 2011 State Snapshot Report. New Hampshire. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  8. The Nation's Report Card Reading 2009 State Snapshot Report. New Hampshire. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  9. The Nation's Report Card Reading 2011 State Snapshot Report. DoDEA. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  10. The Nation's Report Card Mathematics 2011 State Snapshot Report. DoDEA. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  11. The Nation's Report Card Reading 2011 State Snapshot Report. DoDEA. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to other…

  12. The Nation's Report Card Reading 2009 State Snapshot Report. DoDEA. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  13. The Nation's Report Card Mathematics 2011 State Snapshot Report. DoDEA. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2011 mathematics assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2011 to…

  14. The Nation's Report Card Reading 2009 State Snapshot Report. DoDEA. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2010

    2010-01-01

    Each state and jurisdiction that participated in the National Assessment of Educational Progress (NAEP) 2009 reading assessment receives a one-page snapshot report that presents key findings and trends in a condensed format. Overall results, achievement level percentages and average score results, comparison of the average score in 2009 to other…

  15. Results from the 2002 National Survey on Drug Use and Health: National Findings.

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration (DHHS/PHS), Rockville, MD. Office of Applied Studies.

    This report presents the first information from the 2002 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. Prior to 2002, the survey was called the National Household Survey on Drug Abuse (NHSDA). This initial report on the 2002 data…

  16. 75 FR 21155 - National Equal Pay Day, 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-23

    ... America A Proclamation Throughout our Nation's history, extraordinary women have broken barriers to... Equal Pay Day symbolizes the day when an average American woman's earnings finally match what an average... celebrate the strength and vibrancy women add to our economy. Our Nation's workforce includes more women...

  17. The 1988 CDA National Survey Results.

    ERIC Educational Resources Information Center

    Council for Early Childhood Professional Recognition, Washington, DC.

    A 44-item questionnaire was sent to 11,000 Child Development Associates in the fall of 1988 in an effort to provide an updated view of the constituency served by the Child Development Associate (CDA) National Credentialing Program. The questionnaire covered four categories: (1) background information; (2) education and experience; (3) training…

  18. Average of delta: a new quality control tool for clinical laboratories.

    PubMed

    Jones, Graham R D

    2016-01-01

    Average of normals is a tool used to control assay performance using the average of a series of results from patients' samples. Delta checking is a process of identifying errors in individual patient results by reviewing the difference from previous results of the same patient. This paper introduces a novel alternate approach, average of delta, which combines these concepts to use the average of a number of sequential delta values to identify changes in assay performance. Models for average of delta and average of normals were developed in a spreadsheet application. The model assessed the expected scatter of average of delta and average of normals functions and the effect of assay bias for different values of analytical imprecision and within- and between-subject biological variation and the number of samples included in the calculations. The final assessment was the number of patients' samples required to identify an added bias with 90% certainty. The model demonstrated that with larger numbers of delta values, the average of delta function was tighter (lower coefficient of variation). The optimal number of samples for bias detection with average of delta was likely to be between 5 and 20 for most settings and that average of delta outperformed average of normals when the within-subject biological variation was small relative to the between-subject variation. Average of delta provides a possible additional assay quality control tool which theoretical modelling predicts may be more valuable than average of normals for analytes where the group biological variation is wide compared with within-subject variation and where there is a high rate of repeat testing in the laboratory patient population. © The Author(s) 2015.
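
    A minimal sketch of the average-of-delta idea described above, assuming a fixed window of recent deltas and an illustrative alert threshold (the paper derives the appropriate number of samples from analytical and biological variation; the window size, threshold, and simulated results here are placeholders):

    ```python
    from collections import deque

    # Sketch of the "average of delta" idea: for each repeat-tested patient,
    # delta = current result - previous result; a running mean of the most recent
    # deltas drifts away from zero when assay bias appears. The window size and
    # alert threshold below are illustrative assumptions, not values from the paper.

    WINDOW = 10
    ALERT = 0.5  # flag if |mean delta| exceeds this (analyte units)

    previous_result = {}             # last result seen per patient id
    recent_deltas = deque(maxlen=WINDOW)


    def process(patient_id: str, result: float) -> None:
        """Update the average-of-delta signal with one new patient result."""
        if patient_id in previous_result:
            recent_deltas.append(result - previous_result[patient_id])
            if len(recent_deltas) == WINDOW:
                mean_delta = sum(recent_deltas) / WINDOW
                if abs(mean_delta) > ALERT:
                    print(f"possible assay drift: mean delta = {mean_delta:+.2f}")
        previous_result[patient_id] = result


    # Example: stable results, then a +1.0 shift applied to every subsequent result.
    for i in range(40):
        process(f"pt{i % 8}", 5.0)
    for i in range(40, 80):
        process(f"pt{i % 8}", 6.0)
    ```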

  19. The causal meaning of Fisher’s average effect

    PubMed Central

    LEE, JAMES J.; CHOW, CARSON C.

    2013-01-01

    Summary In order to formulate the Fundamental Theorem of Natural Selection, Fisher defined the average excess and average effect of a gene substitution. Finding these notions to be somewhat opaque, some authors have recommended reformulating Fisher’s ideas in terms of covariance and regression, which are classical concepts of statistics. We argue that Fisher intended his two averages to express a distinction between correlation and causation. On this view, the average effect is a specific weighted average of the actual phenotypic changes that result from physically changing the allelic states of homologous genes. We show that the statistical and causal conceptions of the average effect, perceived as inconsistent by Falconer, can be reconciled if certain relationships between the genotype frequencies and non-additive residuals are conserved. There are certain theory-internal considerations favouring Fisher’s original formulation in terms of causality; for example, the frequency-weighted mean of the average effects equaling zero at each locus becomes a derivable consequence rather than an arbitrary constraint. More broadly, Fisher’s distinction between correlation and causation is of critical importance to gene-trait mapping studies and the foundations of evolutionary biology. PMID:23938113

  20. Video game console usage and US national energy consumption: Results from a field-metering study

    DOE PAGES

    Desroches, Louis-Benoit; Greenblatt, Jeffery B.; Pratt, Stacy; ...

    2014-10-23

    There has been increased attention placed on the energy consumption of miscellaneous electronic loads in buildings by energy analysts and policymakers in recent years. The share of electricity consumed by consumer electronics in US households has increased in the last decade. Many devices, however, lack robust energy use data, making energy consumption estimates difficult and uncertain. Video game consoles are high-performance machines present in approximately half of all households and can consume a considerable amount of power. The precise usage of game consoles has significant uncertainty, however, leading to a wide range of recent national energy consumption estimates. We present here an analysis based on field-metered usage data, collected as part of a larger field metering study in the USA. This larger study collected data from 880 households in 2012 on a variety of devices, including 113 game consoles (the majority of which are Generation 7 consoles). From our metering, we find that although some consoles are left on nearly 24 h/day, the overall average usage is lower than many other studies have assumed, leading to a US national energy consumption estimate of 7.1 TWh in 2012. Nevertheless, there is an opportunity to reduce energy use with proper game console power management, as a substantial amount of game console usage occurs with the television turned off. The emergence of Generation 8 consoles may increase national energy consumption.

  1. Video game console usage and US national energy consumption: Results from a field-metering study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desroches, Louis-Benoit; Greenblatt, Jeffery B.; Pratt, Stacy

    There has been increased attention placed on the energy consumption of miscellaneous electronic loads in buildings by energy analysts and policymakers in recent years. The share of electricity consumed by consumer electronics in US households has increased in the last decade. Many devices, however, lack robust energy use data, making energy consumption estimates difficult and uncertain. Video game consoles are high-performance machines present in approximately half of all households and can consume a considerable amount of power. The precise usage of game consoles has significant uncertainty, however, leading to a wide range of recent national energy consumption estimates. We present here an analysis based on field-metered usage data, collected as part of a larger field metering study in the USA. This larger study collected data from 880 households in 2012 on a variety of devices, including 113 game consoles (the majority of which are Generation 7 consoles). From our metering, we find that although some consoles are left on nearly 24 h/day, the overall average usage is lower than many other studies have assumed, leading to a US national energy consumption estimate of 7.1 TWh in 2012. Nevertheless, there is an opportunity to reduce energy use with proper game console power management, as a substantial amount of game console usage occurs with the television turned off. The emergence of Generation 8 consoles may increase national energy consumption.

  2. Community College Economics Instruction: Results from a National Science Foundation Project

    ERIC Educational Resources Information Center

    Maier, Mark; Chi, W. Edward

    2016-01-01

    The principal investigator of a National Science Foundation project, "Economics at Community Colleges," surveyed community college economics faculty and organized workshops, webinars, and regional meetings to address community college faculty isolation from new ideas in economics and economics instruction. Survey results, combined with…

  3. Coffee, Alcohol, Smoking, Physical Activity and QT Interval Duration: Results from the Third National Health and Nutrition Examination Survey

    PubMed Central

    Zhang, Yiyi; Post, Wendy S.; Dalal, Darshan; Blasco-Colmenares, Elena; Tomaselli, Gordon F.; Guallar, Eliseo

    2011-01-01

    Background Abnormalities in the electrocardiographic QT interval duration have been associated with an increased risk of ventricular arrhythmias and sudden cardiac death. However, there is substantial uncertainty about the effect of modifiable factors such as coffee intake, cigarette smoking, alcohol consumption, and physical activity on QT interval duration. Methods We studied 7795 men and women from the Third National Health and Nutrition Examination Survey (NHANES III, 1988–1994). Baseline QT interval was measured from the standard 12-lead electrocardiogram. Coffee and tea intake, alcohol consumption, leisure-time physical activities over the past month, and lifetime smoking habits were determined using validated questionnaires during the home interview. Results In the fully adjusted model, the average differences in QT interval comparing participants drinking ≥6 cups/day to those who did not drink any were −1.2 ms (95% CI −4.4 to 2.0) for coffee, and −2.0 ms (−11.2 to 7.3) for tea, respectively. The average difference in QT interval duration comparing current to never smokers was 1.2 ms (−0.6 to 2.9), while the average difference in QT interval duration comparing participants drinking ≥7 drinks/week to non-drinkers was 1.8 ms (−0.5 to 4.0). The age, race/ethnicity, and RR-interval adjusted differences in average QT interval duration comparing men with binge drinking episodes to non-drinkers or drinkers without binge drinking were 2.8 ms (0.4 to 5.3) and 4.0 ms (1.6 to 6.4), respectively. The corresponding differences in women were 1.1 ms (−2.9 to 5.2) and 1.7 ms (−2.3 to 5.7). Finally, the average difference in QT interval comparing the highest vs. the lowest categories of total physical activity was −0.8 ms (−3.0 to 1.4). Conclusion Binge drinking was associated with longer QT interval in men but not in women. QT interval duration was not associated with other modifiable factors including coffee and tea intake, smoking, and physical activity. PMID

  4. National Air Toxic Assessments (NATA) Results

    EPA Pesticide Factsheets

    The National Air Toxics Assessment was conducted by EPA in 2002 to assess air toxics emissions in order to identify and prioritize air toxics, emission source types and locations which are of greatest potential concern in terms of contributing to population risk. This data source provides downloadable information on emissions at the state, county and census tract level.

  5. Factors That Predict Marijuana Use and Grade Point Average among Undergraduate College Students

    ERIC Educational Resources Information Center

    Coco, Marlena B.

    2017-01-01

    The purpose of this study was to analyze factors that predict marijuana use and grade point average among undergraduate college students using the Core Institute national database. The Core Alcohol and Drug Survey was used to collect data on students' attitudes, beliefs, and experiences related to substance use in college. The sample used in this…

  6. Aerodynamic surface stress intermittency and conditionally averaged turbulence statistics

    NASA Astrophysics Data System (ADS)

    Anderson, William; Lanigan, David

    2015-11-01

    Aeolian erosion is induced by aerodynamic stress imposed by atmospheric winds. Erosion models prescribe that sediment flux, Q, scales with aerodynamic stress raised to an exponent n, where n > 1. Since stress (in fully rough, inertia-dominated flows) scales with incoming velocity squared, u^2, it follows that Q ~ u^(2n) (where u is some relevant component of the flow). Thus, even small (turbulent) deviations of u from its time-mean may be important for aeolian activity. This rationale is augmented given that surface layer turbulence exhibits maximum Reynolds stresses in the fluid immediately above the landscape. To illustrate the importance of stress intermittency, we have used conditional averaging predicated on stress during large-eddy simulation of atmospheric boundary layer flow over an arid, bare landscape. Conditional averaging provides an ensemble-mean visualization of flow structures responsible for erosion 'events'. Preliminary evidence indicates that surface stress peaks are associated with the passage of inclined, high-momentum regions flanked by adjacent low-momentum regions. We characterize geometric attributes of such structures and explore streamwise and vertical vorticity distribution within the conditionally averaged flow field. This work was supported by the National Sci. Foundation, Phys. and Dynamic Meteorology Program (PM: Drs. N. Anderson, C. Lu, and E. Bensman) under Grant # 1500224. Computational resources were provided by the Texas Adv. Comp. Center at the Univ. of Texas.
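
    Conditional averaging predicated on a stress threshold, as described above, can be sketched on synthetic time series; the velocity and stress signals and the mean-plus-two-standard-deviations threshold below are illustrative assumptions, not output from the cited large-eddy simulations.

    ```python
    import numpy as np

    # Conditional averaging predicated on surface stress: average a flow quantity
    # only over samples where the stress exceeds a threshold (here mean + 2 std).
    # Both signals are synthetic surrogates, not LES output.

    rng = np.random.default_rng(2)
    n = 100_000
    u = 8.0 + rng.normal(0.0, 1.5, size=n)                # streamwise velocity, m/s
    stress = 0.05 * u**2 + rng.normal(0.0, 0.5, size=n)   # crude surrogate surface stress

    threshold = stress.mean() + 2.0 * stress.std()
    events = stress > threshold                            # "erosion event" samples

    print(f"unconditional mean u: {u.mean():.2f} m/s")
    print(f"conditionally averaged u (stress events): {u[events].mean():.2f} m/s")
    ```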

  7. Usual coffee intake in Brazil: results from the National Dietary Survey 2008-9.

    PubMed

    Sousa, Alessandra Gaspar; da Costa, Teresa Helena Macedo

    2015-05-28

    Coffee is central to the economy of many developing countries, as well as to the world economy. However, despite the widespread consumption of coffee, there are very few available data showing the usual intake of this beverage. Surveying usual coffee intake is a way of monitoring one aspect of a population's usual dietary intake. Thus, the present study aimed to characterise the usual daily coffee intake in the Brazilian population. We used data from the National Dietary Survey collected in 2008-9 from a probabilistic sample of 34,003 Brazilians aged 10 years and older. The National Cancer Institute method was applied to obtain the usual intake based on two nonconsecutive food diaries, and descriptive statistical analyses were performed by age and sex for Brazil and its regions. The estimated average usual daily coffee intake of the Brazilian population was 163 (SE 2.8) ml. The comparison by sex showed that males had a 12% greater usual coffee intake than females. In addition, the highest intake was recorded among older males. Among the five regions surveyed, the North-East had the highest usual coffee intake (175 ml). The most common method of brewing coffee was filtered/instant coffee (71%), and the main method of sweetening beverages was with sugar (87%). In Brazil, the mean usual coffee intake corresponds to 163 ml, or 1.5 cups/d. Differences in usual coffee intake according to sex and age differed among the five Brazilian regions.

  8. Passenger vehicle driver cell phone use : results from the fall 2000 National Occupant Protection Use Survey

    DOT National Transportation Integrated Search

    2001-07-01

    The National Highway Traffic Safety Administration's National Occupant Protection Use Survey (NOPUS) expanded its data collection protocols during October and November 2000 to obtain national estimates of driver cell phone use. The results of NOPUS f...

  9. Perceptual Averaging in Individuals with Autism Spectrum Disorder.

    PubMed

    Corbett, Jennifer E; Venuti, Paola; Melcher, David

    2016-01-01

    There is mounting evidence that observers rely on statistical summaries of visual information to maintain stable and coherent perception. Sensitivity to the mean (or other prototypical value) of a visual feature (e.g., mean size) appears to be a pervasive process in human visual perception. Previous studies in individuals diagnosed with Autism Spectrum Disorder (ASD) have uncovered characteristic patterns of visual processing that suggest they may rely more on enhanced local representations of individual objects instead of computing such perceptual averages. To further explore the fundamental nature of abstract statistical representation in visual perception, we investigated perceptual averaging of mean size in a group of 12 high-functioning individuals diagnosed with ASD using simplified versions of two identification and adaptation tasks that elicited characteristic perceptual averaging effects in a control group of neurotypical participants. In Experiment 1, participants performed with above chance accuracy in recalling the mean size of a set of circles (mean task) despite poor accuracy in recalling individual circle sizes (member task). In Experiment 2, their judgments of single circle size were biased by mean size adaptation. Overall, these results suggest that individuals with ASD perceptually average information about sets of objects in the surrounding environment. Our results underscore the fundamental nature of perceptual averaging in vision, and further our understanding of how autistic individuals make sense of the external environment.

  10. Expanding the g-Nexus: Further Evidence Regarding the Relations among National IQ, Religiosity and National Health Outcomes

    ERIC Educational Resources Information Center

    Reeve, Charlie L.

    2009-01-01

    The current study seeks to better understand how religiosity and health are positioned within the g-nexus. Specifically, the degree to which differences in average IQ across nations is associated with differences in national religiosity (i.e., belief rate) and national health statistics independent of differences in national wealth is examined.…

  11. Authoritarian parenting and youth depression: Results from a national study.

    PubMed

    King, Keith A; Vidourek, Rebecca A; Merianos, Ashley L

    2016-01-01

    Depression is a prevalent illness affecting youth across the nation. The study purpose was to examine depression and authoritarian parenting among youth from 12 to 17 years of age. A secondary data analysis of the National Survey on Drug Use and Health was performed in the present study. All participants in the present study were youth (N = 17,399) nationwide. The results revealed that 80.6% of youth participants reported having five or more depressive symptoms. Parenting styles based on depression significantly differed among males, females, 12-13-year-olds, 14-15-year-olds, and 16-17-year-olds. Specifically, those who reported experiencing authoritarian parenting practices were more likely to report depressive symptoms compared to their counterparts who experienced authoritative parenting practices. Emphasizing the role of the parents and teaching positive parenting practices and authoritative parenting styles may increase success of prevention programs.

  12. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Dallas Public Schools. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  13. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Dallas Public Schools. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  14. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Dallas Public Schools. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  15. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Dallas Public Schools. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  16. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Albuquerque Public Schools. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  17. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Albuquerque Public Schools. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  18. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Albuquerque Public Schools. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  19. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Albuquerque Public Schools. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  20. 47 CFR 1.959 - Computation of average terrain elevation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Section 1.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by...) of this chapter, average terrain elevation must be calculated by computer using elevations from a 30... also be done manually, if the results differ significantly from the computer derived averages. (a...

  1. 47 CFR 1.959 - Computation of average terrain elevation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Section 1.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by...) of this chapter, average terrain elevation must be calculated by computer using elevations from a 30... also be done manually, if the results differ significantly from the computer derived averages. (a...

  2. 47 CFR 1.959 - Computation of average terrain elevation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Section 1.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by...) of this chapter, average terrain elevation must be calculated by computer using elevations from a 30... also be done manually, if the results differ significantly from the computer derived averages. (a...

  3. Utilization of breast cancer screening methods in a developing nation: results from a nationally representative sample of Malaysian households.

    PubMed

    Dunn, Richard A; Tan, Andrew K G

    2011-01-01

    As is the case in many developing nations, previous studies of breast cancer screening behavior in Malaysia have used relatively small samples that are not nationally representative, thereby limiting the generalizability of results. Therefore, this study uses nationally representative data from the Malaysia Non-Communicable Disease Surveillance-1 to investigate the role of socio-economic status on breast cancer screening behavior in Malaysia, particularly differences in screening behaviour between ethnic groups. The decisions of 816 women above age 40 in Malaysia to screen for breast cancer using mammography, clinical breast exams (CBE), and breast self-exams (BSE) are modeled using logistic regression. Results indicate that after adjusting for differences in age, education, household income, marital status, and residential location, Malay women are less likely than Chinese and Indian women to utilize mammography, but more likely to perform BSE. Education level and urban residence are positively associated with utilization of each method, but these relationships vary across ethnicity. Higher education levels are strongly related to using each screening method among Chinese women, but have no statistically significant relationship to screening among Malays. © 2011 Wiley Periodicals, Inc.

  4. An improved moving average technical trading rule

    NASA Astrophysics Data System (ADS)

    Papailias, Fotis; Thomakos, Dimitrios D.

    2015-06-01

    This paper proposes a modified version of the widely used price and moving average cross-over trading strategies. The suggested approach (presented in its 'long only' version) is a combination of cross-over 'buy' signals and a dynamic threshold value which acts as a dynamic trailing stop. The trading behaviour and performance from this modified strategy are different from the standard approach with results showing that, on average, the proposed modification increases the cumulative return and the Sharpe ratio of the investor while exhibiting smaller maximum drawdown and smaller drawdown duration than the standard strategy.
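
    A minimal sketch of the kind of rule described above: a long-only moving-average cross-over entry combined with a dynamic trailing-stop exit. The window length, the stop fraction, and the exact form of the dynamic threshold are illustrative assumptions, not the calibrated rule from the paper.

```python
import numpy as np
import pandas as pd

def ma_cross_with_trailing_stop(prices: pd.Series, window: int = 50,
                                stop_frac: float = 0.95) -> pd.Series:
    """Long-only positions: enter when price crosses above its moving average,
    exit when price drops below a trailing stop (stop_frac * highest price
    since entry).  The stop rule here is illustrative, not the paper's exact one."""
    ma = prices.rolling(window).mean()
    position = pd.Series(0, index=prices.index)
    in_trade, peak = False, np.nan
    for t in range(window, len(prices)):
        p = prices.iloc[t]
        if not in_trade and p > ma.iloc[t]:
            in_trade, peak = True, p          # cross-over 'buy' signal
        elif in_trade:
            peak = max(peak, p)               # update the trailing high
            if p < stop_frac * peak:          # dynamic trailing stop
                in_trade = False
        position.iloc[t] = int(in_trade)
    return position

# Example on a synthetic random-walk price series.
rng = np.random.default_rng(1)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000))))
pos = ma_cross_with_trailing_stop(prices)
cumulative_return = (pos.shift(1) * prices.pct_change()).sum()
print(f"cumulative return: {cumulative_return:.3f}")
```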

  5. RHIC BPM system average orbit calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michnoff, R.; Cerniglia, P.; Degen, C.

    2009-05-04

    RHIC beam position monitor (BPM) system average orbit was originally calculated by averaging positions of 10000 consecutive turns for a single selected bunch. Known perturbations in RHIC particle trajectories, with multiple frequencies around 10 Hz, contribute to observed average orbit fluctuations. In 2006, the number of turns for average orbit calculations was made programmable; this was used to explore averaging over single periods near 10 Hz. Although this has provided an average orbit signal quality improvement, an average over many periods would further improve the accuracy of the measured closed orbit. A new continuous average orbit calculation was developed just prior to the 2009 RHIC run and was made operational in March 2009. This paper discusses the new algorithm and performance with beam.
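
    The record above contrasts a fixed 10000-turn average with a programmable, continuous average spanning many periods of the ~10 Hz perturbation. The sketch below illustrates that idea on synthetic turn-by-turn data; the revolution frequency, perturbation amplitude, and window length are assumptions, not RHIC parameters or the operational algorithm.

```python
import numpy as np

# Hypothetical turn-by-turn BPM readings: a closed orbit plus a ~10 Hz
# perturbation plus noise (all values are illustrative assumptions, not RHIC data).
rev_freq = 78e3                                   # assumed revolution frequency, Hz
turns = np.arange(400_000)
t = turns / rev_freq
rng = np.random.default_rng(2)
positions = 0.5 + 0.1 * np.sin(2 * np.pi * 10.0 * t) + 0.02 * rng.normal(size=turns.size)

# (a) the original fixed scheme: one average over 10000 consecutive turns
block_avg = positions[:10_000].mean()

# (b) a continuous running average whose window spans many 10 Hz periods,
#     computed efficiently with a cumulative sum
window = int(rev_freq / 10.0 * 20)                # roughly 20 perturbation periods
csum = np.cumsum(positions)
running_avg = (csum[window:] - csum[:-window]) / window

print(f"10000-turn average: {block_avg:.4f} (still carries some 10 Hz ripple)")
print(f"running-average spread: {running_avg.std():.5f} (ripple largely averaged out)")
```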

  6. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... are defined as follows: (1) Eligible engines rated at or above 19 kW, other than marine diesel engines, constitute an averaging set. (2) Eligible engines rated under 19 kW, other than marine diesel engines, constitute an averaging set. (3) Marine diesel engines rated at or above 19 kW constitute an averaging set...

  7. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... are defined as follows: (1) Eligible engines rated at or above 19 kW, other than marine diesel engines, constitute an averaging set. (2) Eligible engines rated under 19 kW, other than marine diesel engines, constitute an averaging set. (3) Marine diesel engines rated at or above 19 kW constitute an averaging set...

  8. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... are defined as follows: (1) Eligible engines rated at or above 19 kW, other than marine diesel engines, constitute an averaging set. (2) Eligible engines rated under 19 kW, other than marine diesel engines, constitute an averaging set. (3) Marine diesel engines rated at or above 19 kW constitute an averaging set...

  9. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... are defined as follows: (1) Eligible engines rated at or above 19 kW, other than marine diesel engines, constitute an averaging set. (2) Eligible engines rated under 19 kW, other than marine diesel engines, constitute an averaging set. (3) Marine diesel engines rated at or above 19 kW constitute an averaging set...

  10. 42 CFR 412.212 - National rate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... discharge classified within a DRG, the national rate equals the product of— (1) The national average... under § 412.60(b)) for that DRG. (d) Adjusting for different area wage levels. CMS adjusts the...

  11. Results from the 2010 National Survey on Drug Use and Health: Mental Health Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2012

    2012-01-01

    This report presents results pertaining to mental health from the 2010 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. This report presents national estimates of the prevalence of past year mental disorders and past year mental health…

  12. Long-term-average spectrum characteristics of country singers during speaking and singing.

    PubMed

    Cleveland, T F; Sundberg, J; Stone, R E

    2001-03-01

    Five premier male country singers involved in our previous studies spoke and sang the words of both the national anthem and a country song of their choice. Long-term-average spectra were made of the spoken and sung material of each singer. The spectral characteristics of country singers' speech and singing were similar. A prominent peak in the upper part of the spectrum, previously described as the "speaker's formant," was found in the country singers' speech and singing. The singer's formant, a strong spectral peak near 2.8 kHz, an important part of the spectrum of classically trained singers, was not found in the spectra of the country singers. The results support the conclusion that the resonance characteristics in speech and singing are similar in country singing and that country singing is not characterized by a singer's formant.
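
    Long-term-average spectra of the kind described above are commonly computed by averaging short-time power spectra over an entire recording. Below is a small sketch using Welch's method from SciPy; the sampling rate, window length, and the use of Welch averaging are assumptions for illustration, not the authors' exact analysis settings.

```python
import numpy as np
from scipy.signal import welch

def long_term_average_spectrum(signal: np.ndarray, fs: float, nperseg: int = 2048):
    """Average the power spectra of overlapping frames over the whole recording
    and return the result in dB (an LTAS-style estimate via Welch's method)."""
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    return freqs, 10.0 * np.log10(psd + 1e-20)

# Example on a synthetic 'voice-like' signal: a harmonic series plus noise.
fs = 16_000
t = np.arange(0, 5.0, 1.0 / fs)
rng = np.random.default_rng(8)
voice = sum(np.sin(2 * np.pi * 150 * k * t) / k for k in range(1, 12)) + 0.05 * rng.normal(size=t.size)
freqs, ltas_db = long_term_average_spectrum(voice, fs)
# A peak in ltas_db near 2.8 kHz would correspond to the classical singer's formant.
```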

  13. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. School District of Philadelphia. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  14. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Austin Independent School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  15. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. Austin Independent School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  16. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Austin Independent School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  17. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Austin Independent School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  18. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. Cleveland Metropolitan School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  19. States' Average College Tuition.

    ERIC Educational Resources Information Center

    Eglin, Joseph J., Jr.; And Others

    This report presents statistical data on trends in tuition costs from 1980-81 through 1995-96. The average tuition for in-state undergraduate students of 4-year public colleges and universities for academic year 1995-96 was approximately 8.9 percent of median household income. This figure was obtained by dividing the students' average annual…

  20. Warp-averaging event-related potentials.

    PubMed

    Wang, K; Begleiter, H; Porjesz, B

    2001-10-01

    To align the repeated single trials of the event-related potential (ERP) in order to get an improved estimate of the ERP. A new implementation of the dynamic time warping is applied to compute a warp-average of the single trials. The trilinear modeling method is applied to filter the single trials prior to alignment. Alignment is based on normalized signals and their estimated derivatives. These features reduce the misalignment due to aligning the random alpha waves, to interpreting amplitude differences as latency differences, or to the seemingly small amplitudes of some components. Simulations and applications to visually evoked potentials show significant improvement over some commonly used methods. The new implementation of the dynamic time warping can be used to align the major components (P1, N1, P2, N2, P3) of the repeated single trials. The average of the aligned single trials is an improved estimate of the ERP. This could lead to more accurate results in subsequent analysis.

  1. Average Temperatures in the Southwestern United States, 2000-2015 Versus Long-Term Average

    EPA Pesticide Factsheets

    This indicator shows how the average air temperature from 2000 to 2015 has differed from the long-term average (1895–2015). To provide more detailed information, each state has been divided into climate divisions, which are zones that share similar climate features. For more information: www.epa.gov/climatechange/science/indicators

  2. Averaging processes in granular flows driven by gravity

    NASA Astrophysics Data System (ADS)

    Rossi, Giulia; Armanini, Aronne

    2016-04-01

    One of the most promising theoretical frameworks for analysing two-phase granular flows is offered by the similarity of their rheology with the kinetic theory of gases [1]. Granular flows can be considered a macroscopic equivalent of the molecular case: the collisions among molecules are compared to the collisions among grains at a macroscopic scale [2,3]. However, there are important statistical differences between the two applications. In two-phase fluid mechanics there are two main types of average: the phasic average and the mass-weighted average [4]. The kinetic theories assume that atoms are so small that the number of molecules in a control volume is infinite. Under this assumption the concentration (number of particles n) does not change during the averaging process and the two definitions of average coincide. This hypothesis no longer holds in granular flows: contrary to gases, the size of a single particle becomes comparable to that of the control volume. For this reason, in a single realization the number of grains is constant and the two averages coincide; on the contrary, over more than one realization n is no longer constant and the two types of average lead to different results. Therefore, the ensemble average used in the standard kinetic theory (usually the phasic average) is suitable for a single realization, but not for several realizations, as already pointed out in [5,6]. In the literature, three main length scales have been identified [7]: the smallest is the particle size, the intermediate corresponds to local averaging (needed to describe instability phenomena or secondary circulation), and the largest arises from phenomena such as large eddies in turbulence. Our aim is to resolve the intermediate scale by applying the mass-weighted average when dealing with more than one realization. This statistical approach leads to additional diffusive terms in the continuity equation: starting from experimental
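
    To make the distinction above concrete, the short sketch below computes a phasic (plain ensemble) average and a mass-weighted average of a grain velocity over several realizations in which the number of grains n varies; the synthetic numbers and the correlation between n and velocity are assumptions chosen only to show that the two averages differ once n is not constant.

```python
import numpy as np

rng = np.random.default_rng(3)
n_real = 50                                    # number of realizations of the control volume

# For each realization: number of grains n_k and grain-averaged velocity u_k
# (synthetic values; in a gas n_k would be effectively constant, here it is not).
n = rng.integers(5, 30, n_real).astype(float)
u = rng.normal(1.0, 0.3, n_real) + 0.02 * n    # correlation between n and u makes the averages differ

phasic_avg = u.mean()                          # each realization weighted equally
mass_weighted_avg = np.sum(n * u) / np.sum(n)  # Favre-like average, weighted by grain number

print(f"phasic average:        {phasic_avg:.4f}")
print(f"mass-weighted average: {mass_weighted_avg:.4f}")
```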

  3. Analysis and comparison of safety models using average daily, average hourly, and microscopic traffic.

    PubMed

    Wang, Ling; Abdel-Aty, Mohamed; Wang, Xuesong; Yu, Rongjie

    2018-02-01

    There have been many traffic safety studies based on average daily traffic (ADT), average hourly traffic (AHT), or microscopic traffic at 5 min intervals. Nevertheless, little research has compared the performance of these three types of safety studies, and few previous studies have examined whether the results of one type are transferable to the other two. First, this study built three models: a Bayesian Poisson-lognormal model to estimate daily crash frequency using ADT, a Bayesian Poisson-lognormal model to estimate hourly crash frequency using AHT, and a Bayesian logistic regression model for real-time safety analysis using microscopic traffic. The model results showed that the crash contributing factors found by the different models were comparable but not identical. Four variables, i.e., the logarithm of volume, the standard deviation of speed, the logarithm of segment length, and the existence of a diverge segment, were positively significant in all three models. Additionally, weaving segments experienced higher daily and hourly crash frequencies than merge and basic segments. Each of the ADT-based, AHT-based, and real-time models was then used to estimate safety conditions at the daily and hourly levels; the real-time model was also used at 5 min intervals. The results showed that the ADT- and AHT-based safety models performed similarly in predicting daily and hourly crash frequencies, and that the real-time safety model was able to provide hourly crash frequency as well. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. White River National Forest Hanging Lake visitor transportation survey : summary of results

    DOT National Transportation Integrated Search

    2017-01-01

    The USDOT Volpe Center conducted a visitor transportation survey at Hanging Lake recreation site in the White River National Forest from July 14 to July 18, 2016. This report outlines the summary of results from that survey effort. Key findings inclu...

  5. Dynamic testing and test anxiety amongst gifted and average-ability children.

    PubMed

    Vogelaar, Bart; Bakker, Merel; Elliott, Julian G; Resing, Wilma C M

    2017-03-01

    Dynamic testing has been proposed as a testing approach that is less disadvantageous for children who may be potentially subject to bias when undertaking conventional assessments. For example, those who encounter high levels of test anxiety, or who are unfamiliar with standardized test procedures, may fail to demonstrate their true potential or capabilities. While dynamic testing has proven particularly useful for special groups of children, it has rarely been used with gifted children. We investigated whether it would be useful to conduct a dynamic test to measure the cognitive abilities of intellectually gifted children. We also investigated whether test anxiety scores would be related to a progression in the children's test scores after dynamic training. Participants were 113 children aged between 7 and 8 years from several schools in the western part of the Netherlands. The children were categorized as either gifted or average-ability and split into an unguided practice or a dynamic testing condition. The study employed a pre-test-training-post-test design. Using linear mixed modelling analysis with a multilevel approach, we inspected the growth trajectories of children in the various conditions and examined the impact of ability and test anxiety on progression and training benefits. Dynamic testing proved to be successful in improving the scores of the children, although no differences in training benefits were found between gifted and average-ability children. Test anxiety was shown to influence the children's rate of change across all test sessions and their improvement in performance accuracy after dynamic training. © 2016 The British Psychological Society.

  6. Programmable noise bandwidth reduction by means of digital averaging

    NASA Technical Reports Server (NTRS)

    Poklemba, John J. (Inventor)

    1993-01-01

    Predetection noise bandwidth reduction is effected by a pre-averager capable of digitally averaging the samples of an input data signal over two or more symbols, the averaging interval being defined by the input sampling rate divided by the output sampling rate. As the averaged sample is clocked to a suitable detector at a much slower rate than the input signal sampling rate the noise bandwidth at the input to the detector is reduced, the input to the detector having an improved signal to noise ratio as a result of the averaging process, and the rate at which such subsequent processing must operate is correspondingly reduced. The pre-averager forms a data filter having an output sampling rate of one sample per symbol of received data. More specifically, selected ones of a plurality of samples accumulated over two or more symbol intervals are output in response to clock signals at a rate of one sample per symbol interval. The pre-averager includes circuitry for weighting digitized signal samples using stored finite impulse response (FIR) filter coefficients. A method according to the present invention is also disclosed.
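
    A minimal sketch of the pre-averaging idea: block-averaging an oversampled signal down to one sample per symbol narrows the noise bandwidth and raises the SNR at the detector by roughly the averaging factor. The rectangular (unweighted) average, the oversampling factor, and the BPSK test signal are simplifying assumptions; the invention described above weights the samples with stored FIR filter coefficients rather than applying a plain average.

```python
import numpy as np

def pre_average(samples: np.ndarray, samples_per_symbol: int) -> np.ndarray:
    """Block-average the oversampled input down to one sample per symbol.
    (A plain rectangular average; the pre-averager described above would weight
    each block with stored FIR filter coefficients before summing.)"""
    n_symbols = samples.size // samples_per_symbol
    blocks = samples[: n_symbols * samples_per_symbol]
    return blocks.reshape(n_symbols, samples_per_symbol).mean(axis=1)

def snr_db(noisy: np.ndarray, clean: np.ndarray) -> float:
    return 10 * np.log10(np.mean(clean ** 2) / np.mean((noisy - clean) ** 2))

# Illustration with assumed numbers: BPSK symbols oversampled 16x in white noise.
rng = np.random.default_rng(4)
symbols = rng.choice([-1.0, 1.0], 2000)
oversampled = np.repeat(symbols, 16) + rng.normal(0.0, 1.0, 2000 * 16)

print(f"SNR before averaging: {snr_db(oversampled, np.repeat(symbols, 16)):.1f} dB")
print(f"SNR after averaging:  {snr_db(pre_average(oversampled, 16), symbols):.1f} dB")
# The improvement is close to 10*log10(16), about 12 dB: the noise bandwidth at
# the detector input has been reduced by the averaging factor.
```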

  7. Comparison of region-of-interest-averaged and pixel-averaged analysis of DCE-MRI data based on simulations and pre-clinical experiments

    NASA Astrophysics Data System (ADS)

    He, Dianning; Zamora, Marta; Oto, Aytekin; Karczmar, Gregory S.; Fan, Xiaobing

    2017-09-01

    Differences between region-of-interest (ROI) and pixel-by-pixel analysis of dynamic contrast enhanced (DCE) MRI data were investigated in this study with computer simulations and pre-clinical experiments. ROIs were simulated with 10, 50, 100, 200, 400, and 800 different pixels. For each pixel, a contrast agent concentration as a function of time, C(t), was calculated using the Tofts DCE-MRI model with randomly generated physiological parameters (K trans and v e) and the Parker population arterial input function. The average C(t) for each ROI was calculated and then K trans and v e for the ROI was extracted. The simulations were run 100 times for each ROI with new K trans and v e generated. In addition, white Gaussian noise was added to C(t) with 3, 6, and 12 dB signal-to-noise ratios to each C(t). For pre-clinical experiments, Copenhagen rats (n  =  6) with implanted prostate tumors in the hind limb were used in this study. The DCE-MRI data were acquired with a temporal resolution of ~5 s in a 4.7 T animal scanner, before, during, and after a bolus injection (<5 s) of Gd-DTPA for a total imaging duration of ~10 min. K trans and v e were calculated in two ways: (i) by fitting C(t) for each pixel, and then averaging the pixel values over the entire ROI, and (ii) by averaging C(t) over the entire ROI, and then fitting averaged C(t) to extract K trans and v e. The simulation results showed that in heterogeneous ROIs, the pixel-by-pixel averaged K trans was ~25% to ~50% larger (p  <  0.01) than the ROI-averaged K trans. At higher noise levels, the pixel-averaged K trans was greater than the ‘true’ K trans, but the ROI-averaged K trans was lower than the ‘true’ K trans. The ROI-averaged K trans was closer to the true K trans than pixel-averaged K trans for high noise levels. In pre-clinical experiments, the pixel-by-pixel averaged K trans was ~15% larger than the ROI-averaged K trans. Overall, with the Tofts model, the extracted

  8. Correction for spatial averaging in laser speckle contrast analysis

    PubMed Central

    Thompson, Oliver; Andrews, Michael; Hirst, Evan

    2011-01-01

    Practical laser speckle contrast analysis systems face a problem of spatial averaging of speckles, due to the pixel size in the cameras used. Existing practice is to use a system factor in speckle contrast analysis to account for spatial averaging. The linearity of the system factor correction has not previously been confirmed. The problem of spatial averaging is illustrated using computer simulation of time-integrated dynamic speckle, and the linearity of the correction confirmed using both computer simulation and experimental results. The valid linear correction allows various useful compromises in the system design. PMID:21483623
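
    For reference, local speckle contrast is the ratio of the standard deviation to the mean of intensity within a small window, and spatial averaging by finite-sized pixels is often compensated by a multiplicative system factor applied to K squared. The sketch below computes windowed contrast with SciPy; the window size, the value of the factor beta, and the convention of dividing K squared by beta are assumptions, not the specific correction studied in the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(img: np.ndarray, win: int = 7) -> np.ndarray:
    """Local speckle contrast K = sigma / mean over a win x win neighbourhood."""
    mean = uniform_filter(img, size=win)
    mean_sq = uniform_filter(img ** 2, size=win)
    var = np.clip(mean_sq - mean ** 2, 0.0, None)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

def corrected_contrast(img: np.ndarray, win: int = 7, beta: float = 0.7) -> np.ndarray:
    """Compensate spatial averaging with an assumed linear system factor on K^2."""
    return np.sqrt(speckle_contrast(img, win) ** 2 / beta)

# Fully developed static speckle has unit contrast; the raw windowed estimate
# sits somewhat below 1 because of the finite window and pixel averaging.
img = np.random.default_rng(9).exponential(1.0, (256, 256))
print(speckle_contrast(img).mean(), corrected_contrast(img).mean())
```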

  9. Results of the 2013 National Resident Matching Program: family medicine.

    PubMed

    Biggs, Wendy S; Crosley, Philip W; Kozakowski, Stanley M

    2013-10-01

    The percentage of US seniors who chose primary care careers remains well below the nation's future workforce needs. Entrants into family medicine residency programs, along with their colleagues entering other primary care-designated residencies, will compose the primary care workforce of the future. Data in this article are collected from the 2013 National Resident Matching Program (NRMP) Main Residency Match and the 2013 American Academy of Family Physicians (AAFP) Medical Education Residency Census. The information provided includes the number of applicants to graduate medical education programs for the 2013--2014 academic year, specialty choice, and trends in specialty selection. Family medicine residency programs experienced a modest increase in both the overall fill rate as well as the number of positions filled with US seniors through the NRMP in 2013 in comparison to 2012. Other primary care fields, primary care internal medicine positions, pediatrics-primary care, and internal medicine-pediatrics programs also experienced modest increases in 2013. The 2013 NRMP results show a small increase in medical students choosing primary care careers for the fourth year in a row. Changes in the NRMP Match process in 2013 make a comparison to prior years' Match results difficult. Medical school admission changes, loan repayment, and improved primary care reimbursement may help increase the number of students pursuing family medicine.

  10. Adaptive Spontaneous Transitions between Two Mechanisms of Numerical Averaging.

    PubMed

    Brezis, Noam; Bronfman, Zohar Z; Usher, Marius

    2015-06-04

    We investigated the mechanism with which humans estimate numerical averages. Participants were presented with 4, 8 or 16 (two-digit) numbers, serially and rapidly (2 numerals/second) and were instructed to convey the sequence average. As predicted by a dual, but not a single-component account, we found a non-monotonic influence of set-size on accuracy. Moreover, we observed a marked decrease in RT as set-size increases and RT-accuracy tradeoff in the 4-, but not in the 16-number condition. These results indicate that in accordance with the normative directive, participants spontaneously employ analytic/sequential thinking in the 4-number condition and intuitive/holistic thinking in the 16-number condition. When the presentation rate is extreme (10 items/sec) we find that, while performance still remains high, the estimations are now based on intuitive processing. The results are accounted for by a computational model postulating population-coding underlying intuitive-averaging and working-memory-mediated symbolic procedures underlying analytical-averaging, with flexible allocation between the two.

  11. The Nation's Report Card[TM]: Mathematics, 2003. NCES 2005-451

    ERIC Educational Resources Information Center

    Braswell, James S.; Dion, Gloria S.; Daane, Mary C.; Jin, Ying

    2005-01-01

    This report presents results of the NAEP 2003 fourth- and eighth-grade mathematics assessments for the nation, for regions of the country, for participating states and other jurisdictions, and for participating urban districts. Assessment results are described in terms of students' average mathematics score on a 0-500 scale and in terms of the…

  12. Weighted south-wide average pulpwood prices

    Treesearch

    James E. Granskog; Kevin D. Growther

    1991-01-01

    Weighted average prices provide a more accurate representation of regional pulpwood price trends when production volumes vary widely by state. Unweighted South-wide average delivered prices for pulpwood, as reported by Timber Mart-South, were compared to average annual prices weighted by each state's pulpwood production from 1977 to 1986. Weighted average prices...
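
    The computation is a straightforward production-weighted mean. The sketch below contrasts it with the unweighted South-wide mean; the state prices and production volumes are made-up numbers for illustration, not Timber Mart-South data.

```python
import numpy as np

# Hypothetical state-level delivered pulpwood prices and production volumes
# (all values invented for illustration).
prices     = np.array([21.0, 18.5, 25.0, 19.0, 23.5])   # $/cord by state
production = np.array([4.0,  1.0,  9.0,  2.0,  6.0])    # million tons by state

unweighted = prices.mean()
weighted   = np.average(prices, weights=production)     # production-weighted South-wide price
print(f"unweighted: {unweighted:.2f}  production-weighted: {weighted:.2f}")
```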

  13. Results of time-domain electromagnetic soundings in Everglades National Park, Florida

    USGS Publications Warehouse

    Fitterman, D.V.; Deszcz-Pan, Maria; Stoddard, C.E.

    1999-01-01

    This report describes the collection, processing, and interpretation of time-domain electromagnetic soundings from Everglades National Park. The results are used to locate the extent of seawater intrusion in the Biscayne aquifer and to map the base of the Biscayne aquifer in regions where well coverage is sparse. The data show no evidence of fresh, ground-water flows at depth into Florida Bay.

  14. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  15. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  16. The Nation's Report Card Mathematics 2011 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  17. The Nation's Report Card Reading 2011 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…

  18. Scale-invariant Green-Kubo relation for time-averaged diffusivity

    NASA Astrophysics Data System (ADS)

    Meyer, Philipp; Barkai, Eli; Kantz, Holger

    2017-12-01

    In recent years it was shown both theoretically and experimentally that in certain systems exhibiting anomalous diffusion the time- and ensemble-averaged mean-squared displacement are remarkably different. The ensemble-averaged diffusivity is obtained from a scaling Green-Kubo relation, which connects the scale-invariant nonstationary velocity correlation function with the transport coefficient. Here we obtain the relation between time-averaged diffusivity, usually recorded in single-particle tracking experiments, and the underlying scale-invariant velocity correlation function. The time-averaged mean-squared displacement is given by ⟨δ²⟩ ∼ 2 D_ν t^(β) Δ^(ν−β), where t is the total measurement time and Δ is the lag time. Here ν is the anomalous diffusion exponent obtained from ensemble-averaged measurements, ⟨x²⟩ ∼ t^ν, while β ≥ −1 marks the growth or decline of the kinetic energy, ⟨v²⟩ ∼ t^β. Thus, we establish a connection between exponents that can be read off the asymptotic properties of the velocity correlation function, and similarly for the transport constant D_ν. We demonstrate our results with nonstationary scale-invariant stochastic and deterministic models, thereby highlighting that systems with equivalent behavior in the ensemble average can differ strongly in their time average. If the averaged kinetic energy is finite, β = 0, the time scalings of ⟨δ²⟩ and ⟨x²⟩ are identical; however, the time-averaged transport coefficient D_ν is not identical to the corresponding ensemble-averaged diffusion constant.
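
    For a single recorded trajectory, the time-averaged mean-squared displacement in the expression above is an average of squared displacements over the start time at fixed lag Δ. The sketch below computes it for ordinary Brownian motion (ν = 1, β = 0), where it grows linearly in Δ; the sampling interval and trajectory model are assumptions for illustration.

```python
import numpy as np

def time_averaged_msd(x: np.ndarray, dt: float, lags: np.ndarray) -> np.ndarray:
    """Time-averaged mean-squared displacement
       delta^2(Delta) = < [x(t' + Delta) - x(t')]^2 >_{t'}
    for a single trajectory sampled at interval dt."""
    out = np.empty(lags.size)
    for k, lag in enumerate(lags):
        n = int(round(lag / dt))
        disp = x[n:] - x[:-n]
        out[k] = np.mean(disp ** 2)
    return out

# Example on ordinary Brownian motion, where the time average grows linearly in the lag.
rng = np.random.default_rng(6)
dt = 0.01
x = np.cumsum(rng.normal(0, np.sqrt(dt), 100_000))
lags = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
print(time_averaged_msd(x, dt, lags))   # roughly proportional to the lag time
```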

  19. External quality-assurance results for the National Atmospheric Deposition Program/National Trends Network during 1991

    USGS Publications Warehouse

    Nilles, M.A.; Gordon, J.D.; Schroder, L.J.; Paulin, C.E.

    1995-01-01

    The U.S. Geological Survey used four programs in 1991 to provide external quality assurance for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN). An intersite-comparison program was used to evaluate onsite pH and specific-conductance determinations. The effects of routine sample handling, processing, and shipping of wet-deposition samples on analyte determinations and an estimated precision of analyte values and concentrations were evaluated in the blind-audit program. Differences between analytical results and an estimate of the analytical precision of four laboratories routinely measuring wet deposition were determined by an interlaboratory-comparison program. Overall precision estimates for the precipitation-monitoring system were determined for selected sites by a collocated-sampler program. Results of the intersite-comparison program indicated that 93 and 86 percent of the site operators met the NADP/NTN accuracy goal for pH determinations during the two intersite-comparison studies completed during 1991. The results also indicated that 96 and 97 percent of the site operators met the NADP/NTN accuracy goal for specific-conductance determinations during the two 1991 studies. The effects of routine sample handling, processing, and shipping, determined in the blind-audit program indicated significant positive bias (a=.O 1) for calcium, magnesium, sodium, potassium, chloride, nitrate, and sulfate. Significant negative bias (or=.01) was determined for hydrogen ion and specific conductance. Only ammonium determinations were not biased. A Kruskal-Wallis test indicated that there were no significant (*3t=.01) differences in analytical results from the four laboratories participating in the interlaboratory-comparison program. Results from the collocated-sampler program indicated the median relative error for cation concentration and deposition exceeded eight percent at most sites, whereas the median relative error for sample volume

  20. Analytic computation of average energy of neutrons inducing fission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Alexander Rich

    2016-08-12

    The objective of this report is to describe how I analytically computed the average energy of neutrons that induce fission in the bare BeRP ball. The motivation of this report is to resolve a discrepancy between the average energy computed via the FMULT and F4/FM cards in MCNP6 by comparison to the analytic results.

  1. Test Results From The Idaho National Laboratory Of The NASA Bi-Supported Cell Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoots, C.; O'Brien, J.; Cable, T.

    The Idaho National Laboratory has been researching the application of solid-oxide fuel cell technology for large-scale hydrogen production. As a result, the Idaho National Laboratory has been testing various cell designs to characterize electrolytic performance. NASA, in conjunction with the University of Toledo, has developed a new cell concept with the goals of reduced weight and high power density. This paper presents results of the INL's testing of this new solid oxide cell design as an electrolyzer. Gas composition, operating voltage, and other parameters were varied during testing. Results to date show the NASA cell to be a promising design for both high power-to-weight fuel cell and electrolyzer applications.

  2. Plan averaging for multicriteria navigation of sliding window IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, David, E-mail: dcraft@partners.org; Papp, Dávid; Unkelbach, Jan

    2014-02-15

    Purpose: To describe a method for combining sliding window plans [intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT)] for use in treatment plan averaging, which is needed for Pareto surface navigation based multicriteria treatment planning. Methods: The authors show that by taking an appropriately defined average of leaf trajectories of sliding window plans, the authors obtain a sliding window plan whose fluence map is the exact average of the fluence maps corresponding to the initial plans. In the case of static-beam IMRT, this also implies that the dose distribution of the averaged plan is the exact dosimetric average of the initial plans. In VMAT delivery, the dose distribution of the averaged plan is a close approximation of the dosimetric average of the initial plans. Results: The authors demonstrate the method on three Pareto optimal VMAT plans created for a demanding paraspinal case, where the tumor surrounds the spinal cord. The results show that the leaf averaged plans yield dose distributions that approximate the dosimetric averages of the precomputed Pareto optimal plans well. Conclusions: The proposed method enables the navigation of deliverable Pareto optimal plans directly, i.e., interactive multicriteria exploration of deliverable sliding window IMRT and VMAT plans, eliminating the need for a sequencing step after navigation and hence the dose degradation that is caused by such a sequencing step.
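
    The key step described above is averaging leaf positions at matched fractional monitor units, so that the fluence delivered by the averaged trajectory equals the average of the individual fluences. The sketch below does this for a single hypothetical leaf by resampling each trajectory onto a common MU grid; it is a toy illustration of the idea, not clinical sliding-window machinery.

```python
import numpy as np

def average_leaf_trajectories(trajectories, n_samples=101):
    """Average sliding-window leaf trajectories parameterized by fractional
    monitor units (MU).  Each input is a (mu_fraction, leaf_position) pair of
    arrays; positions are resampled onto a common MU grid and then averaged."""
    grid = np.linspace(0.0, 1.0, n_samples)
    resampled = [np.interp(grid, mu, pos) for mu, pos in trajectories]
    return grid, np.mean(resampled, axis=0)

# Two hypothetical plans for one leaf: different sweep speeds across the field.
plan_a = (np.array([0.0, 0.5, 1.0]), np.array([0.0, 3.0, 5.0]))
plan_b = (np.array([0.0, 0.5, 1.0]), np.array([0.0, 1.0, 5.0]))
grid, avg_pos = average_leaf_trajectories([plan_a, plan_b])
print(np.round(avg_pos[::25], 2))   # averaged leaf position at MU fractions 0, .25, .5, .75, 1
```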

  3. NATIONAL RESULTS FROM THE 2011 NATIONAL WETLAND CONDITION ASSESSMENT (NWCA) SOILS ANALYSIS

    EPA Science Inventory

    In 2011, US Environmental Protection Agency conducted the first National Wetland Condition Assessment (NWCA). Field crews conducted one-day surveys of over 1000 wetlands across the contiguous United States. For every wetland sampled, soils were collected by layer (i.e., horizon)...

  4. Assessing the average sodium content of prepacked foods with nutrition declarations: the importance of sales data.

    PubMed

    Korošec, Živa; Pravst, Igor

    2014-09-04

    Processed foods are recognized as a major contributor to high dietary sodium intake, associated with increased risk of cardiovascular disease. Different public health actions are being introduced to reduce sodium content in processed foods and sodium intake in general. A gradual reduction of sodium content in processed foods was proposed in Slovenia, but monitoring sodium content in the food supply is essential to evaluate the progress. Our primary objective was to test a new approach for assessing the sales-weighted average sodium content of prepacked foods on the market. We show that a combination of 12-month food sales data provided by food retailers covering the majority of the national market and a comprehensive food composition database compiled using food labelling data represent a robust and cost-effective approach to assessing the sales-weighted average sodium content of prepacked foods. Food categories with the highest sodium content were processed meats (particularly dry cured meat), ready meals (especially frozen pizza) and cheese. The reported results show that in most investigated food categories, market leaders in the Slovenian market have lower sodium contents than the category average. The proposed method represents an excellent tool for monitoring sodium content in the food supply.
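
    The core of the approach is a sales-weighted mean of labelled sodium contents within each food category. A minimal sketch with pandas follows; the category names, sodium values, and unit sales are invented, and the column names are assumptions rather than the structure of the Slovenian dataset.

```python
import numpy as np
import pandas as pd

# Hypothetical product-level table combining labelled sodium content with 12-month
# unit sales (all values are assumptions for illustration).
products = pd.DataFrame({
    "category": ["bread", "bread", "cheese", "cheese", "ready_meal"],
    "sodium_mg_per_100g": [450, 520, 700, 640, 830],
    "units_sold": [120_000, 30_000, 45_000, 90_000, 60_000],
})

def sales_weighted_sodium(group: pd.DataFrame) -> float:
    """Sales-weighted mean sodium content within one food category."""
    return np.average(group["sodium_mg_per_100g"], weights=group["units_sold"])

summary = pd.DataFrame({
    "sales_weighted": products.groupby("category").apply(sales_weighted_sodium),
    "unweighted": products.groupby("category")["sodium_mg_per_100g"].mean(),
})
print(summary)
```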

  5. Assessing the Average Sodium Content of Prepacked Foods with Nutrition Declarations: The Importance of Sales Data

    PubMed Central

    Korošec, Živa; Pravst, Igor

    2014-01-01

    Processed foods are recognized as a major contributor to high dietary sodium intake, associated with increased risk of cardiovascular disease. Different public health actions are being introduced to reduce sodium content in processed foods and sodium intake in general. A gradual reduction of sodium content in processed foods was proposed in Slovenia, but monitoring sodium content in the food supply is essential to evaluate the progress. Our primary objective was to test a new approach for assessing the sales-weighted average sodium content of prepacked foods on the market. We show that a combination of 12-month food sales data provided by food retailers covering the majority of the national market and a comprehensive food composition database compiled using food labelling data represent a robust and cost-effective approach to assessing the sales-weighted average sodium content of prepacked foods. Food categories with the highest sodium content were processed meats (particularly dry cured meat), ready meals (especially frozen pizza) and cheese. The reported results show that in most investigated food categories, market leaders in the Slovenian market have lower sodium contents than the category average. The proposed method represents an excellent tool for monitoring sodium content in the food supply. PMID:25192028

  6. Rigid shape matching by segmentation averaging.

    PubMed

    Wang, Hongzhi; Oliensis, John

    2010-04-01

    We use segmentations to match images by shape. The new matching technique does not require point-to-point edge correspondence and is robust to small shape variations and spatial shifts. To address the unreliability of segmentations computed bottom-up, we give a closed form approximation to an average over all segmentations. Our method has many extensions, yielding new algorithms for tracking, object detection, segmentation, and edge-preserving smoothing. For segmentation, instead of a maximum a posteriori approach, we compute the "central" segmentation minimizing the average distance to all segmentations of an image. For smoothing, instead of smoothing images based on local structures, we smooth based on the global optimal image structures. Our methods for segmentation, smoothing, and object detection perform competitively, and we also show promising results in shape-based tracking.

  7. Preliminary Results of National Amyotrophic Lateral Sclerosis (ALS) Registry Risk Factor Survey Data

    PubMed Central

    2016-01-01

    Background: The National ALS Registry is made up of two components to capture amyotrophic lateral sclerosis (ALS) cases: national administrative databases (Medicare, Medicaid, Veterans Health Administration and Veterans Benefits Administration) and self-identified cases captured by the Registry’s web portal. This study describes self-reported characteristics of U.S. adults with ALS using the data collected by the National ALS Registry web portal risk factor surveys only from October 19, 2010 through December 31, 2013. Objective: To describe findings from the National ALS Registry’s web portal risk factor surveys. Measurements: The prevalence of select risk factors among adults with ALS was determined by calculating the frequencies of select risk factors—smoking and alcohol (non, current and former) histories, military service and occupational history, and family history of neurodegenerative diseases such as ALS, Alzheimer’s and/or Parkinson’s. Results: Nearly half of survey respondents were ever smokers compared with nearly 41% of adults nationally. Most respondents were ever drinkers, which is comparable to national estimates. The majority were light drinkers. Nearly one-quarter of survey respondents were veterans compared with roughly 9% of US adults nationally. Most respondents were retired or disabled. The industries in which respondents were employed for the longest time were Professional and Scientific and Technical Services. When family history of neurodegenerative diseases in first degree relatives was evaluated against our comparison group, the rates of ALS were similar, but were higher for Parkinson’s disease, Alzheimer’s disease and any neurodegenerative diseases. Conclusions: The National ALS Registry web portal, to our knowledge, is the largest, most geographically diverse collection of risk factor data about adults living with ALS. Various characteristics were consistent with other published studies on ALS risk factors and will allow

  8. Geographic information systems for mapping the National Exam Result of Junior High School in 2014 at West Java Province

    NASA Astrophysics Data System (ADS)

    Setiawan Abdullah, Atje; Nurani Ruchjana, Budi; Rejito, Juli; Rosadi, Rudi; Candra Permana, Fahmi

    2017-10-01

    National examinations are administered by the Ministry of Education and Culture to support the development of education in Indonesia and are centrally evaluated by the National Education Standards Agency; their results can describe how successfully education is being implemented at the district, municipal, provincial, or national level. In this study, we evaluate, analyze, and explore the database of 2014 national exam results for Junior High Schools, with the Junior High School (SMP/MTs) as the smallest unit of analysis at the district level. The method is a data mining approach following the Knowledge Discovery in Databases (KDD) methodology, combining descriptive analysis with spatial mapping of the national exam results. Classification of the 2014 Junior High School exam data for 6,878 SMP/MTs in West Java showed that 81.01% were at a moderate level, while the spatial mapping indicated that 36.99% of SMP/MTs in West Java were at an unfavorable level. The results are visualized with ArcGIS to show the quality of education at the municipal, provincial, or national level. These results can be used by management to make decisions to improve educational services based on the national exam database in West Java. Keywords: KDD, spatial mapping, national exam.

  9. GI Joe or Average Joe? The impact of average-size and muscular male fashion models on men's and women's body image and advertisement effectiveness.

    PubMed

    Diedrichs, Phillippa C; Lee, Christina

    2010-06-01

    Increasing body size and shape diversity in media imagery may promote positive body image. While research has largely focused on female models and women's body image, men may also be affected by unrealistic images. We examined the impact of average-size and muscular male fashion models on men's and women's body image and perceived advertisement effectiveness. A sample of 330 men and 289 women viewed one of four advertisement conditions: no models, muscular, average-slim or average-large models. Men and women rated average-size models as equally effective in advertisements as muscular models. For men, exposure to average-size models was associated with more positive body image in comparison to viewing no models, but no difference was found in comparison to muscular models. Similar results were found for women. Internalisation of beauty ideals did not moderate these effects. These findings suggest that average-size male models can promote positive body image and appeal to consumers. 2010 Elsevier Ltd. All rights reserved.

  10. A Nation at Risk Revisited: Did "Wrong" Reasoning Result in "Right" Results? At What Cost?

    ERIC Educational Resources Information Center

    Guthrie, James W.; Springer, Matthew G.

    2004-01-01

    A Nation at Risk (NAR; National Commission on Excellence in Education, 1983) proclaimed in 1983 that U.S. K-12 educational achievement was on a downward trajectory and that American technological and economic preeminence was consequently imperiled. Both assertions were incorrect. American education achievement was not then declining and the…

  11. What Makes Professional Development Effective? Results from a National Sample of Teachers.

    ERIC Educational Resources Information Center

    Garet, Michael S.; Porter, Andrew C.; Desimone, Laura; Birman, Beatrice F.; Yoon, Kwang Suk

    2001-01-01

    Used a national probability sample of 1,027 mathematics and science teachers to provide a large-scale empirical comparison of effects of different characteristics of professional development on teachers' learning. Results identify three core features of professional development that have significant positive effects on teachers' self-reported…

  12. BOREAS AES Five-Day Averaged Surface Meteorological and Upper Air Data

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Strub, Richard; Newcomer, Jeffrey A.

    2000-01-01

    The Canadian Atmospheric Environment Service (AES) provided BOREAS with hourly and daily surface meteorological data from 23 of the AES meteorological stations located across Canada and upper air data from 1 station at The Pas, Manitoba. Due to copyright restrictions on the full resolution surface meteorological data, this data set contains 5-day average values for the surface parameters. The upper air data are provided in their full resolution form. The 5-day averaging was performed in order to create a data set that could be publicly distributed at no cost. Temporally, the surface meteorological data cover the period of January 1975 to December 1996 and the upper air data cover the period of January 1961 to November 1996. The data are provided in tabular ASCII files, and are classified as AFM-staff data. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
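
    Five-day averaging of station records, as described above, is a simple resampling operation. The sketch below shows one way such an average could be formed with pandas on a synthetic daily temperature series; the variable names, values, and the use of a 5-day resample are assumptions for illustration, not the AES processing code.

```python
import numpy as np
import pandas as pd

# Synthetic daily surface air temperature for one station (illustrative values only).
days = pd.date_range("1975-01-01", "1976-12-31", freq="D")
rng = np.random.default_rng(10)
temp = 5 + 15 * np.sin(2 * np.pi * (days.dayofyear / 365.25)) + rng.normal(0, 3, days.size)
daily = pd.Series(temp, index=days, name="t_air_degC")

five_day_mean = daily.resample("5D").mean()   # consecutive 5-day block averages
print(five_day_mean.head())
```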

  13. Reductions in Average Lengths of Stays for Surgical Procedures Between the 2008 and 2014 United States National Inpatient Samples Were Not Associated With Greater Incidences of Use of Postacute Care Facilities.

    PubMed

    Dexter, Franklin; Epstein, Richard H

    2018-03-01

    Diagnosis-related group (DRG) based reimbursement creates incentives for reduction in hospital length of stay (LOS). Such reductions might be accomplished by lesser incidences of discharges to home. However, we previously reported that, while controlling for DRG, each 1-day decrease in hospital median LOS was associated with lesser odds of transfer to a postacute care facility (P = .0008). The result, though, was limited to elective admissions, 15 common surgical DRGs, and the 2013 US National Readmission Database. We studied the same potential relationship between decreased LOS and postacute care using different methodology and over 2 different years. The observational study was performed using summary measures from the 2008 and 2014 US National Inpatient Sample, with 3 types of categories (strata): (1) Clinical Classifications Software's classes of procedures (CCS), (2) DRGs including a major operating room procedure during hospitalization, or (3) CCS limiting patients to those with US Medicare as the primary payer. Greater reductions in the mean LOS were associated with smaller percentages of patients with disposition to postacute care. Analyzed using 72 different CCSs, 174 DRGs, or 70 CCSs limited to Medicare patients, each pairwise reduction in the mean LOS by 1 day was associated with an estimated 2.6% ± 0.4%, 2.3% ± 0.3%, or 2.4% ± 0.3% (absolute) pairwise reduction in the mean incidence of use of postacute care, respectively. These 3 results obtained using bivariate weighted least squares linear regression were all P < .0001, as were the corresponding results obtained using unweighted linear regression or the Spearman rank correlation. In the United States, reductions in hospital LOS, averaged over many surgical procedures, are not accomplished through a greater incidence of use of postacute care.
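
    The bivariate weighted least-squares regression mentioned above relates per-category changes in mean LOS to changes in the use of postacute care, weighting each category by its size. A small sketch with statsmodels follows; the numbers are synthetic values chosen only to mimic the reported direction of the association and are not National Inpatient Sample estimates.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-category summary pairs: change in mean LOS (days) and change in the
# percentage discharged to postacute care, weighted by case counts (all values assumed).
d_los = np.array([-1.2, -0.4, 0.3, -2.0, -0.8, 0.1, -1.5])
d_pac = np.array([-3.0, -1.1, 0.6, -5.2, -2.4, 0.4, -3.5])   # percentage points
cases = np.array([1200, 800, 500, 300, 950, 400, 700])

# Weighted least squares: slope is the percentage-point change in postacute care
# use associated with a 1-day change in mean LOS.
wls = sm.WLS(d_pac, sm.add_constant(d_los), weights=cases).fit()
print(wls.params)
```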

  14. COMPARISON OF 24H AVERAGE VOC MONITORING RESULTS FOR RESIDENTIAL INDOOR AND OUTDOOR AIR USING CARBOPACK X-FILLED DIFFUSIVE SAMPLERS AND ACTIVE SAMPLING - A PILOT STUDY

    EPA Science Inventory

    Analytical results obtained by thermal desorption GC/MS for 24h diffusive sampling of 11 volatile organic compounds (VOCs) are compared with results of time-averaged active sampling at a known constant flow rate. Air samples were collected with co-located duplicate diffusive samp...

  15. An improved car-following model with two preceding cars' average speed

    NASA Astrophysics Data System (ADS)

    Yu, Shao-Wei; Shi, Zhong-Ke

    2015-01-01

    To better describe cooperative car-following behavior in intelligent transportation settings and to increase roadway traffic mobility, data on three successive following cars at a signalized intersection in Jinan, China, were obtained and used to explore the link between the two preceding cars' average speed and car-following behavior. The results indicate that the two preceding cars' average velocity has significant effects on the following car's motion. An improved car-following model that considers the two preceding cars' average velocity was therefore proposed and calibrated on the basis of the full velocity difference model, and numerical simulations were carried out to study how the two preceding cars' average speed affects the starting process and the evolution of traffic flow subject to an initial small disturbance. The results indicate that the improved model can qualitatively describe the impacts of the two preceding cars' average velocity on traffic flow, and that taking this average velocity into account when designing the control strategy of a cooperative adaptive cruise control system can improve the stability of traffic flow, suppress the emergence of traffic jams, and increase the capacity of signalized intersections.
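
    A hedged sketch of the kind of model described above: a full velocity difference (FVD) style car-following rule in which the velocity-difference stimulus uses a weighted average of the two nearest preceding cars' speeds. The optimal-velocity function, the parameter values, and the weighting are illustrative assumptions, not the calibrated model from the paper.

```python
import numpy as np

def optimal_velocity(headway: float, v_max: float = 30.0, hc: float = 25.0) -> float:
    """A commonly used optimal-velocity function (parameter values are assumptions)."""
    return v_max / 2.0 * (np.tanh(headway / hc - 1.0) + np.tanh(1.0))

def accel_two_leader_fvd(x, v, i, kappa=0.4, lam=0.3, p=0.5):
    """Acceleration of car i: relaxation toward the optimal velocity plus a
    velocity-difference term based on the two preceding cars' average speed."""
    headway = x[i + 1] - x[i]
    if i + 2 < len(v):
        v_lead = p * v[i + 1] + (1.0 - p) * v[i + 2]   # two preceding cars' average speed
    else:
        v_lead = v[i + 1]                              # only one car ahead
    return kappa * (optimal_velocity(headway) - v[i]) + lam * (v_lead - v[i])

# Euler integration of a small platoon (index increases toward the front; the
# front car keeps a constant speed).
n_cars, dt, steps = 5, 0.1, 600
x = np.arange(n_cars) * 25.0
v = np.full(n_cars, 15.0)
for _ in range(steps):
    a = np.array([accel_two_leader_fvd(x, v, i) for i in range(n_cars - 1)] + [0.0])
    v = np.maximum(v + a * dt, 0.0)
    x = x + v * dt
print(np.round(np.diff(x), 1))   # headways relax toward an equilibrium spacing
```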

  16. 40 CFR 91.1304 - Averaging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Averaging. (a) A manufacturer may use averaging across engine families to demonstrate a zero or positive... credits obtained through trading. (b) Beginning in model year 2004, credits used to demonstrate a zero or...

  17. 40 CFR 91.1304 - Averaging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Averaging. (a) A manufacturer may use averaging across engine families to demonstrate a zero or positive... credits obtained through trading. (b) Beginning in model year 2004, credits used to demonstrate a zero or...

  18. 40 CFR 91.1304 - Averaging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Averaging. (a) A manufacturer may use averaging across engine families to demonstrate a zero or positive... credits obtained through trading. (b) Beginning in model year 2004, credits used to demonstrate a zero or...

  19. 40 CFR 91.1304 - Averaging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Averaging. (a) A manufacturer may use averaging across engine families to demonstrate a zero or positive... credits obtained through trading. (b) Beginning in model year 2004, credits used to demonstrate a zero or...

  20. 40 CFR 91.1304 - Averaging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Averaging. (a) A manufacturer may use averaging across engine families to demonstrate a zero or positive... credits obtained through trading. (b) Beginning in model year 2004, credits used to demonstrate a zero or...

  1. When Is the Local Average Treatment Close to the Average? Evidence from Fertility and Labor Supply

    ERIC Educational Resources Information Center

    Ebenstein, Avraham

    2009-01-01

    The local average treatment effect (LATE) may differ from the average treatment effect (ATE) when those influenced by the instrument are not representative of the overall population. Heterogeneity in treatment effects may imply that parameter estimates from 2SLS are uninformative regarding the average treatment effect, motivating a search for…

  2. Lagrangian averaging with geodesic mean

    NASA Astrophysics Data System (ADS)

    Oliver, Marcel

    2017-11-01

    This paper revisits the derivation of the Lagrangian averaged Euler (LAE), or Euler-α equations in the light of an intrinsic definition of the averaged flow map as the geodesic mean on the volume-preserving diffeomorphism group. Under the additional assumption that first-order fluctuations are statistically isotropic and transported by the mean flow as a vector field, averaging of the kinetic energy Lagrangian of an ideal fluid yields the LAE Lagrangian. The derivation presented here assumes a Euclidean spatial domain without boundaries.

  3. Lagrangian averaging with geodesic mean.

    PubMed

    Oliver, Marcel

    2017-11-01

    This paper revisits the derivation of the Lagrangian averaged Euler (LAE), or Euler- α equations in the light of an intrinsic definition of the averaged flow map as the geodesic mean on the volume-preserving diffeomorphism group. Under the additional assumption that first-order fluctuations are statistically isotropic and transported by the mean flow as a vector field, averaging of the kinetic energy Lagrangian of an ideal fluid yields the LAE Lagrangian. The derivation presented here assumes a Euclidean spatial domain without boundaries.

  4. Draft environmental impact statement : corporate average fuel economy standards, passenger cars and light trucks, model years 2011-2015.

    DOT National Transportation Integrated Search

    2008-06-01

    The National Highway Traffic Safety Administration (NHTSA) has prepared this Draft Environmental Impact Statement (DEIS) to disclose and analyze the potential environmental impacts of the proposed new Corporate Average Fuel Economy (CAFE) standards a...

  5. Using the National Assessment of Educational Progress To Confirm State Test Results. A Report of the Ad Hoc Committee on Confirming Test Results.

    ERIC Educational Resources Information Center

    National Assessment Governing Board, Washington, DC.

    The National Assessment Governing Board has recognized the need to study associated policy and technical issues to ensure that the National Assessment of Educational Progress (NAEP) is ready to do the best job possible if called on to confirm the results of state achievement testing under the No Child Left Behind Act. This report describes the…

  6. A national report of nursing home information technology: year 1 results.

    PubMed

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Schaumberg, Melissa K; Holm, Allison E; Alexander, Rachel L; Wise, Keely K; Dougherty, Michelle L; Gugerty, Brian

    2017-01-01

    To provide a report on year 1 results of a national study investigating nursing home information technology (IT) adoption, called IT sophistication. A reliable and valid survey was used to measure IT sophistication. The target goal was 10% from each state in the United States, 1570 nursing homes. A random sample of homes from each state was recruited from Nursing Home Compare. The team reached 2627 nursing home administrators, among whom 1799 administrators agreed to participate and were sent a survey. A total of 815 surveys were completed (45.3% response rate), which was below the goal. Facilities in the participating sample have similar demographic characteristics (ownership, total population in a location, and bed size) to the remaining homes not participating. There are greater IT capabilities in resident care and administrative activities, less in clinical support. The extent of use of these capabilities appears to be highest in administrative activities and lowest in clinical support. IT in resident care appears to be the most integrated with internal and external stakeholders. IT capabilities appear to be greater than IT extent of use in all health domains, with the greatest difference in resident care. National evaluations of nursing home IT are rare. Measuring trends in IT adoption in a nationally representative sample provides meaningful analytics that could be more useful for policy makers and nursing home leaders in the future. Discovering national baseline assessments is a first step toward recognizing nursing home trends in IT adoption. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Viewpoint: observations on scaled average bioequivalence.

    PubMed

    Patterson, Scott D; Jones, Byron

    2012-01-01

    The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed as an alternative statistical analysis procedure for such products by multiple regulatory agencies. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we will consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.
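
    For readers unfamiliar with the two one-sided test procedure mentioned above, the sketch below illustrates the basic TOST decision on log-transformed test/reference ratios with the conventional 80-125% equivalence limits. It is a generic Python illustration, not the SABE decision rules or simulation code evaluated in the paper; the data and function names are hypothetical.

```python
# Minimal sketch of the two one-sided tests (TOST) idea for average bioequivalence,
# using the standard 80-125% limits on the geometric mean ratio (log scale).
# Illustrative only; not the SABE decision rules discussed in the paper.
import numpy as np
from scipy import stats

def tost_log_scale(log_ratio_diffs, alpha=0.05, lower=np.log(0.8), upper=np.log(1.25)):
    """TOST on within-subject log(test) - log(reference) differences."""
    n = len(log_ratio_diffs)
    mean = np.mean(log_ratio_diffs)
    se = np.std(log_ratio_diffs, ddof=1) / np.sqrt(n)
    # One-sided t statistics against each equivalence bound.
    t_lower = (mean - lower) / se   # H0: mean <= lower bound
    t_upper = (mean - upper) / se   # H0: mean >= upper bound
    p_lower = 1 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    # Bioequivalence is concluded only if both one-sided tests reject.
    return max(p_lower, p_upper) < alpha

rng = np.random.default_rng(0)
diffs = rng.normal(loc=np.log(1.02), scale=0.25, size=36)  # hypothetical crossover data
print("bioequivalent:", tost_log_scale(diffs))
```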

  8. Light propagation in the averaged universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagheri, Samae; Schwarz, Dominik J., E-mail: s_bagheri@physik.uni-bielefeld.de, E-mail: dschwarz@physik.uni-bielefeld.de

    Cosmic structures determine how light propagates through the Universe and consequently must be taken into account in the interpretation of observations. In the standard cosmological model at the largest scales, such structures are either ignored or treated as small perturbations to an isotropic and homogeneous Universe. This isotropic and homogeneous model is commonly assumed to emerge from some averaging process at the largest scales. We assume that there exists an averaging procedure that preserves the causal structure of space-time. Based on that assumption, we study the effects of averaging the geometry of space-time and derive an averaged version of the null geodesic equation of motion. For the averaged geometry we then assume a flat Friedmann-Lemaître (FL) model and find that light propagation in this averaged FL model is not given by null geodesics of that model, but rather by a modified light propagation equation that contains an effective Hubble expansion rate, which differs from the Hubble rate of the averaged space-time.

  9. Average years of life lost due to breast and cervical cancer and the association with the marginalization index in Mexico in 2000 and 2010.

    PubMed

    Cervantes, Claudio Alberto Dávila; Botero, Marcela Agudelo

    2014-05-01

    The objective of this study was to calculate average years of life lost due to breast and cervical cancer in Mexico in 2000 and 2010. Data on mortality in women aged between 20 and 84 years was obtained from the National Institute for Statistics and Geography. Age-specific mortality rates and average years of life lost, which is an estimate of the number of years that a person would have lived if he or she had not died prematurely, were estimated for both diseases. Data was disaggregated into five-year age groups and socioeconomic status based on the 2010 marginalization index obtained from the National Population Council. A decrease in average years of life lost due to cervical cancer (37.4%) and an increase in average years of life lost due to breast cancer (8.9%) were observed during the period studied. Average years of life lost due to cervical cancer was greater among women living in areas with a high marginalization index, while average years of life lost due to breast cancer was greater in women from areas with a low marginalization index.
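
    As a rough illustration of the "average years of life lost" metric described above, the sketch below computes it under the simplifying assumption of a fixed upper reference age rather than a full life table. The ages of death and the reference age of 84 (the upper bound of the study's age range) are hypothetical and do not reproduce the study's actual method or data.

```python
# Minimal sketch of an "average years of life lost" calculation, assuming a fixed
# upper reference age instead of remaining life expectancy from a life table.
import numpy as np

def average_years_of_life_lost(ages_at_death, reference_age=84.0):
    ages = np.asarray(ages_at_death, dtype=float)
    years_lost = np.clip(reference_age - ages, 0.0, None)  # count premature deaths only
    return years_lost.mean()

ages_at_death = [45, 52, 60, 67, 71, 78]  # hypothetical cancer deaths, ages 20-84
print(f"AYLL: {average_years_of_life_lost(ages_at_death):.1f} years")
```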

  10. Incoherent averaging of phase singularities in speckle-shearing interferometry.

    PubMed

    Mantel, Klaus; Nercissian, Vanusch; Lindlein, Norbert

    2014-08-01

    Interferometric speckle techniques are plagued by the omnipresence of phase singularities, impairing the phase unwrapping process. To reduce the number of phase singularities by physical means, an incoherent averaging of multiple speckle fields may be applied. It turns out, however, that the results may strongly deviate from the expected √N behavior. Using speckle-shearing interferometry as an example, we investigate the mechanism behind the reduction of phase singularities, both by calculations and by computer simulations. Key to an understanding of the reduction mechanism during incoherent averaging is the representation of the physical averaging process in terms of certain vector fields associated with each speckle field.

  11. Specialized Prisons and Services: Results From a National Survey

    PubMed Central

    Cropsey, Karen L.; Wexler, Harry K.; Melnick, Gerald; Taxman, Faye S.; Young, Douglas W.

    2008-01-01

    Findings from the National Criminal Justice Drug Abuse Treatment Studies (CJ-DATS) National Criminal Justice Treatment Practices survey are examined to describe types of services provided by three types of prisons: those that serve a cross-section of offenders, those that specialize in serving offenders with special psychosocial and medical needs, and those that specialize in serving legal status or gender specific populations. Information is presented on the prevalence and type of specialized prisons and services provided to offenders as reported by wardens and other facility directors drawn from a nationally representative sample of prisons. Additional analyses explore organizational factors that differentiate prisons that serve specialized populations including staffing, training, other resources, leadership, and climate for change and innovation. Implications for expanding and improving services for special populations in correctional settings and the values of specialized prisons are discussed. PMID:18443650

  12. Predicting Secchi disk depth from average beam attenuation in a deep, ultra-clear lake

    USGS Publications Warehouse

    Larson, G.L.; Hoffman, R.L.; Hargreaves, B.R.; Collier, R.W.

    2007-01-01

    We addressed potential sources of error in estimating the water clarity of mountain lakes by investigating the use of beam transmissometer measurements to estimate Secchi disk depth. The optical properties Secchi disk depth (SD) and beam transmissometer attenuation (BA) were measured in Crater Lake (Crater Lake National Park, Oregon, USA) at a designated sampling station near the maximum depth of the lake. A standard 20 cm black and white disk was used to measure SD. The transmissometer light source had a nearly monochromatic wavelength of 660 nm and a path length of 25 cm. We created a SD prediction model by regression of the inverse SD of 13 measurements recorded on days when environmental conditions were acceptable for disk deployment with BA averaged over the same depth range as the measured SD. The relationship between inverse SD and averaged BA was significant and the average 95% confidence interval for predicted SD relative to the measured SD was ±1.6 m (range = -4.6 to 5.5 m) or ±5.0%. Eleven additional sample dates tested the accuracy of the predictive model. The average 95% confidence interval for these sample dates was ±0.7 m (range = -3.5 to 3.8 m) or ±2.2%. The 1996-2000 time-series means for measured and predicted SD varied by 0.1 m, and the medians varied by 0.5 m. The time-series mean annual measured and predicted SDs also varied little, with intra-annual differences between measured and predicted mean annual SD ranging from -2.1 to 0.1 m. The results demonstrated that this prediction model reliably estimated Secchi disk depths and can be used to significantly expand optical observations in an environment where the conditions for standardized SD deployments are limited. © 2007 Springer Science+Business Media B.V.
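
    The prediction model described above amounts to a linear regression of inverse Secchi depth on beam attenuation averaged over the corresponding depth range, with the fit then inverted to predict SD. A minimal sketch of that calculation follows; the numbers are invented for illustration and are not the Crater Lake data.

```python
# Minimal sketch of the prediction model: regress inverse Secchi depth (1/SD) on
# beam attenuation averaged over the same depth range, then invert to predict SD.
import numpy as np

sd_m = np.array([28.0, 30.5, 25.0, 33.0, 27.5, 31.0])          # measured Secchi depths (m), hypothetical
ba_avg = np.array([0.115, 0.105, 0.128, 0.098, 0.118, 0.103])  # averaged beam attenuation (1/m), hypothetical

slope, intercept = np.polyfit(ba_avg, 1.0 / sd_m, deg=1)       # 1/SD = intercept + slope * BA

def predict_secchi(ba):
    return 1.0 / (intercept + slope * ba)

print("predicted SD at BA = 0.110 1/m:", round(predict_secchi(0.110), 1), "m")
```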

  13. Gambling participation in the U.S.--results from a national survey.

    PubMed

    Welte, John W; Barnes, Grace M; Wieczorek, William F; Tidwell, Marie-Cecile; Parker, John

    2002-01-01

    Demographic patterns of gambling participation in the U.S. were examined. A national telephone survey was conducted with 2,630 representative U.S. residents aged 18 or older. The sample as weighted for analysis was 48% male, 12% black, and 11% Hispanic. Respondents were questioned on 15 types of gambling: how often they played and how much they won or lost. Eighty-two percent gambled in the past year. Lottery was the most commonly played game, while casino gambling accounted for the largest extent of gambling involvement. Men and women were equally likely to gamble in the past year, but men gambled more frequently and had larger wins and losses, particularly on sports betting and games of skill. Blacks were less likely to have gambled in the past year, but blacks who gambled did so more heavily than other racial groups. Blacks and Hispanics were more likely than average to be pathological gamblers. The rate of past year gambling declined with age, but extent of gambling involvement among gamblers did not vary with age. Rates of participation in most forms of gambling increased with socioeconomic status, but higher socioeconomic status gamblers had lower rates of pathological gambling, and lower extent of gambling involvement, particularly for lottery. New Englanders gambled more heavily than other Americans. Comparison with past studies showed an increase in overall gambling participation in the U.S., and large increases in rates of participation in lottery and casino gambling.

  14. 40 CFR 80.205 - How is the annual refinery or importer average and corporate pool average sulfur level determined?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... average and corporate pool average sulfur level determined? 80.205 Section 80.205 Protection of... ADDITIVES Gasoline Sulfur Gasoline Sulfur Standards § 80.205 How is the annual refinery or importer average and corporate pool average sulfur level determined? (a) The annual refinery or importer average and...

  15. 40 CFR 80.205 - How is the annual refinery or importer average and corporate pool average sulfur level determined?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... average and corporate pool average sulfur level determined? 80.205 Section 80.205 Protection of... ADDITIVES Gasoline Sulfur Gasoline Sulfur Standards § 80.205 How is the annual refinery or importer average and corporate pool average sulfur level determined? (a) The annual refinery or importer average and...

  16. 40 CFR 80.205 - How is the annual refinery or importer average and corporate pool average sulfur level determined?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... average and corporate pool average sulfur level determined? 80.205 Section 80.205 Protection of... ADDITIVES Gasoline Sulfur Gasoline Sulfur Standards § 80.205 How is the annual refinery or importer average and corporate pool average sulfur level determined? (a) The annual refinery or importer average and...

  17. 40 CFR 80.205 - How is the annual refinery or importer average and corporate pool average sulfur level determined?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... average and corporate pool average sulfur level determined? 80.205 Section 80.205 Protection of... ADDITIVES Gasoline Sulfur Gasoline Sulfur Standards § 80.205 How is the annual refinery or importer average and corporate pool average sulfur level determined? (a) The annual refinery or importer average and...

  18. The influence of community well-being on mortality among Registered First Nations people.

    PubMed

    Oliver, Lisa N; Penney, Chris; Peters, Paul A

    2016-07-20

    Living in a community with lower socioeconomic status is associated with higher mortality. However, few studies have examined associations between community socioeconomic characteristics and mortality among the First Nations population. The 1991-to-2006 Census Mortality and Cancer Cohort follow-up, which tracked a 15% sample of Canadians aged 25 or older, included 57,300 respondents who self-identified as Registered First Nations people or Indian band members. The Community Well-Being Index (CWB), a measure of the social and economic well-being of communities, consists of income, education, labour force participation, and housing components. A dichotomous variable was used to indicate residence in a community with a CWB score above or below the average for First Nations communities. Age-standardized mortality rates (ASMRs) were calculated for First Nations cohort members in communities with CWB scores above and below the First Nations average. Cox proportional hazards models examined the impact of CWB when controlling for individual characteristics. The ASMR for First Nations cohort members in communities with a below-average CWB was 1,057 per 100,000 person-years at risk, compared with 912 for those in communities with an above-average CWB score. For men, living in a community with below-average income and labour force participation CWB scores was associated with an increased hazard of death, even when individual socioeconomic characteristics were taken into account. Women in communities with below-average income scores had an increased hazard of death. First Nations people in communities with below-average CWB scores tended to have higher mortality rates. For some components of the CWB, effects remained even when individual socioeconomic characteristics were taken into account.
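
    The age-standardized mortality rates (ASMRs) quoted above come from direct standardization: age-specific death rates are weighted by a standard population before being summed. A minimal sketch, with hypothetical age groups, deaths, person-years, and standard weights, is given below.

```python
# Minimal sketch of direct age standardization, the kind of calculation behind
# age-standardized mortality rates (ASMRs). All numbers are hypothetical.
import numpy as np

deaths       = np.array([20, 55, 140, 260])            # deaths per age group
person_years = np.array([40e3, 35e3, 25e3, 10e3])      # person-years at risk per age group
std_weights  = np.array([0.35, 0.30, 0.22, 0.13])      # standard population weights (sum to 1)

age_specific_rates = deaths / person_years              # deaths per person-year
asmr = np.sum(age_specific_rates * std_weights) * 1e5   # per 100,000 person-years
print(f"ASMR: {asmr:.0f} per 100,000 person-years")
```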

  19. Ensemble Averaged Probability Density Function (APDF) for Compressible Turbulent Reacting Flows

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing; Liu, Nan-Suey

    2012-01-01

    In this paper, we present a concept of the averaged probability density function (APDF) for studying compressible turbulent reacting flows. The APDF is defined as an ensemble average of the fine grained probability density function (FG-PDF) with a mass density weighting. It can be used to exactly deduce the mass density weighted, ensemble averaged turbulent mean variables. The transport equation for APDF can be derived in two ways. One is the traditional way that starts from the transport equation of FG-PDF, in which the compressible Navier-Stokes equations are embedded. The resulting transport equation of APDF is then in a traditional form that contains conditional means of all terms from the right hand side of the Navier-Stokes equations except for the chemical reaction term. These conditional means are new unknown quantities that need to be modeled. Another way of deriving the transport equation of APDF is to start directly from the ensemble averaged Navier-Stokes equations. The resulting transport equation of APDF derived from this approach appears in a closed form without any need for additional modeling. The methodology of ensemble averaging presented in this paper can be extended to other averaging procedures: for example, the Reynolds time averaging for statistically steady flow and the Reynolds spatial averaging for statistically homogeneous flow. It can also be extended to a time or spatial filtering procedure to construct the filtered density function (FDF) for the large eddy simulation (LES) of compressible turbulent reacting flows.
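
    The mass-density weighting that defines the APDF is the same device used in Favre averaging: the weighted mean of a variable is the ensemble mean of density times the variable, divided by the ensemble mean of density. The sketch below illustrates this on synthetic ensemble members; it is not the authors' formulation or code, and all values are made up.

```python
# Minimal sketch of mass-density-weighted (Favre-type) ensemble averaging:
# weighted mean of phi = <rho * phi> / <rho> over ensemble members.
import numpy as np

rng = np.random.default_rng(1)
n_members = 1000
rho = rng.lognormal(mean=0.0, sigma=0.2, size=n_members)   # density realizations (synthetic)
phi = rng.normal(loc=100.0, scale=5.0, size=n_members)     # some flow variable, e.g. temperature

ensemble_mean = phi.mean()                                  # plain ensemble average
favre_mean = np.mean(rho * phi) / np.mean(rho)              # density-weighted ensemble average
print(ensemble_mean, favre_mean)
```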

  20. National Media Laboratory media testing results

    NASA Technical Reports Server (NTRS)

    Mularie, William

    1993-01-01

    The government faces a crisis in data storage, analysis, archive, and communication. The sheer quantity of data being poured into government systems on a daily basis is overwhelming the systems' ability to capture, analyze, disseminate, and store critical information. Future systems requirements are even more formidable, with single government platforms having data rates of over 1 Gbit/sec, storage requirements greater than a terabyte per day, and expected data archive lifetimes of over 10 years. The charter of the National Media Laboratory (NML) is to focus the resources of industry, government, and academia on government needs in the evaluation, development, and field support of advanced recording systems.

  1. Deblurring of Class-Averaged Images in Single-Particle Electron Microscopy.

    PubMed

    Park, Wooram; Madden, Dean R; Rockmore, Daniel N; Chirikjian, Gregory S

    2010-03-01

    This paper proposes a method for deblurring of class-averaged images in single-particle electron microscopy (EM). Since EM images of biological samples are very noisy, the images which are nominally identical projection images are often grouped, aligned and averaged in order to cancel or reduce the background noise. However, the noise in the individual EM images generates errors in the alignment process, which creates an inherent limit on the accuracy of the resulting class averages. This inaccurate class average due to the alignment errors can be viewed as the result of a convolution of an underlying clear image with a blurring function. In this work, we develop a deconvolution method that gives an estimate for the underlying clear image from a blurred class-averaged image using precomputed statistics of misalignment. Since this convolution is over the group of rigid body motions of the plane, SE(2), we use the Fourier transform for SE(2) in order to convert the convolution into a matrix multiplication in the corresponding Fourier space. For practical implementation we use a Hermite-function-based image modeling technique, because Hermite expansions enable lossless Cartesian-polar coordinate conversion using the Laguerre-Fourier expansions, and Hermite expansion and Laguerre-Fourier expansion retain their structures under the Fourier transform. Based on these mathematical properties, we can obtain the deconvolution of the blurred class average using simple matrix multiplication. Tests of the proposed deconvolution method using synthetic and experimental EM images confirm the performance of our method.
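
    The essential structure described above, a class average modeled as the clear image convolved with a blur built from misalignment statistics and then deconvolved, can be illustrated with a much simpler translation-only analogue using the ordinary 2D FFT and a Wiener filter. This sketch ignores rotations (it does not operate on SE(2)) and the Hermite/Laguerre machinery of the paper; the kernel, noise level, and SNR parameter are all hypothetical.

```python
# Translation-only analogue of deblurring a class average: model the blur with a
# Gaussian kernel representing misalignment statistics, then deconvolve via a Wiener
# filter in the 2D Fourier domain. Illustrative only; not the SE(2)-based method.
import numpy as np

def gaussian_kernel(shape, sigma):
    """Periodic Gaussian blur kernel standing in for translational misalignment statistics."""
    ny, nx = shape
    y = np.minimum(np.arange(ny), ny - np.arange(ny))
    x = np.minimum(np.arange(nx), nx - np.arange(nx))
    yy, xx = np.meshgrid(y, x, indexing="ij")
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def wiener_deconvolve(blurred, kernel, snr=100.0):
    H = np.fft.fft2(kernel)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H)**2 + 1.0 / snr)   # Wiener filter
    return np.real(np.fft.ifft2(W * G))

rng = np.random.default_rng(2)
clear = np.zeros((64, 64)); clear[24:40, 24:40] = 1.0               # toy "particle" image
kernel = gaussian_kernel(clear.shape, sigma=2.0)                    # misalignment blur
blurred = np.real(np.fft.ifft2(np.fft.fft2(clear) * np.fft.fft2(kernel)))
restored = wiener_deconvolve(blurred + 0.01 * rng.normal(size=clear.shape), kernel)
```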

  2. Implementation of Biplot Analysis for Mapping Elementary and Junior High Schools in West Sumatra Based on National Examination Results 2016

    NASA Astrophysics Data System (ADS)

    Amalita, N.; Fitria, D.; Distian, V.

    2018-04-01

    The national examination is an assessment of learning outcomes that aims to measure the achievement of graduate competence nationally. Its results are used to map educational issues in order to shape national education policy, and they also serve as a reference for admitting students to the next level of education. National examination results in West Sumatra in 2016 declined from the previous year at both the elementary school (SD) and junior high school (SMP) levels. This paper characterizes the national examination results of each regency/city in West Sumatra at the elementary and junior high school levels using biplot analysis. The biplot analysis shows that results across the regencies/cities of West Sumatra Province are quite diverse. At the junior high school level, 9 regencies/cities have similar characteristics, and English shows the greatest variability among the subjects; the subject scores are positively correlated, most notably mathematics with English. At the elementary school level, 8 regencies/cities have similar characteristics, and the subject scores are likewise positively correlated, most notably science (IPA) with language.
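
    A biplot of the kind used in this study can be produced by principal component analysis of a regions-by-subjects score matrix, plotting regions as points and subjects as loading vectors. The sketch below uses an invented score matrix and region names, not the West Sumatra data.

```python
# Minimal sketch of a biplot: PCA (via SVD) of a (region x subject) matrix of average
# exam scores, with regions as points and subjects as arrows. Data are invented.
import numpy as np
import matplotlib.pyplot as plt

regions = ["Region A", "Region B", "Region C", "Region D", "Region E"]
subjects = ["Indonesian", "English", "Mathematics", "Science"]
scores = np.array([
    [72, 55, 60, 65],
    [68, 48, 52, 61],
    [75, 62, 66, 70],
    [70, 50, 58, 64],
    [66, 45, 49, 59],
], dtype=float)

X = scores - scores.mean(axis=0)                  # column-center the score matrix
U, S, Vt = np.linalg.svd(X, full_matrices=False)  # PCA via singular value decomposition
pc_scores = U[:, :2] * S[:2]                      # region coordinates on first two PCs
loadings = Vt[:2].T                               # subject directions (loadings)

fig, ax = plt.subplots()
ax.scatter(pc_scores[:, 0], pc_scores[:, 1])
for name, (x, y) in zip(regions, pc_scores):
    ax.annotate(name, (x, y))
for name, (x, y) in zip(subjects, loadings * S[:2]):
    ax.arrow(0, 0, x, y, head_width=0.2)
    ax.annotate(name, (x, y))
plt.show()
```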

  3. The Mapping Project: Preliminary Results from the National Survey of Faculty. Revised.

    ERIC Educational Resources Information Center

    Drago, Robert; Varner, Amy

    This document reports preliminary results from a national survey of college faculty performed as part of the Mapping Project. The project and the survey concern the ways faculty balance, or do not balance, commitments to work and family. The theoretical framework was based on the work of J. Williams (1991) and others who have argued that an…

  4. National Testing of Pupils in Europe: Objectives, Organisation and Use of Results

    ERIC Educational Resources Information Center

    Parveva, Teodora; De Coster, Isabelle; Noorani, Sogol

    2009-01-01

    This study produced by the Eurydice network gives a detailed picture of the context and organisation of national tests in 30 European countries and the use made of test results in informing education policy and practice and in guiding the school career of pupils. It presents the diverse choices made by European countries regarding the objectives,…

  5. The national fire and fire surrogate study: early results and future challenges

    Treesearch

    Thomas A. Waldrop; James McIver

    2006-01-01

    Fire-adapted ecosystems today have dense plant cover and heavy fuel loads as a result of fire exclusion and other changes in land use practices. Mechanical fuel treatments and prescribed fire are powerful tools for reducing wildfire potential, but the ecological consequences of their use are unknown. The National Fire and Fire Surrogate Study examines the effects of...

  6. 75 FR 25323 - Light-Duty Vehicle Greenhouse Gas Emission Standards and Corporate Average Fuel Economy Standards...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ...EPA and NHTSA are issuing this joint Final Rule to establish a National Program consisting of new standards for light-duty vehicles that will reduce greenhouse gas emissions and improve fuel economy. This joint Final Rule is consistent with the National Fuel Efficiency Policy announced by President Obama on May 19, 2009, responding to the country's critical need to address global climate change and to reduce oil consumption. EPA is finalizing greenhouse gas emissions standards under the Clean Air Act, and NHTSA is finalizing Corporate Average Fuel Economy standards under the Energy Policy and Conservation Act, as amended. These standards apply to passenger cars, light-duty trucks, and medium-duty passenger vehicles, covering model years 2012 through 2016, and represent a harmonized and consistent National Program. Under the National Program, automobile manufacturers will be able to build a single light-duty national fleet that satisfies all requirements under both programs while ensuring that consumers still have a full range of vehicle choices. NHTSA's final rule also constitutes the agency's Record of Decision for purposes of its National Environmental Policy Act (NEPA) analysis.

  7. 76 FR 74853 - 2017 and Later Model Year Light-Duty Vehicle Greenhouse Gas Emissions and Corporate Average Fuel...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ...EPA and NHTSA, on behalf of the Department of Transportation, are issuing this joint proposal to further reduce greenhouse gas emissions and improve fuel economy for light-duty vehicles for model years 2017-2025. This proposal extends the National Program beyond the greenhouse gas and corporate average fuel economy standards set for model years 2012-2016. On May 21, 2010, President Obama issued a Presidential Memorandum requesting that NHTSA and EPA develop through notice and comment rulemaking a coordinated National Program to reduce greenhouse gas emissions of light-duty vehicles for model years 2017-2025. This proposal, consistent with the President's request, responds to the country's critical need to address global climate change and to reduce oil consumption. NHTSA is proposing Corporate Average Fuel Economy standards under the Energy Policy and Conservation Act, as amended by the Energy Independence and Security Act, and EPA is proposing greenhouse gas emissions standards under the Clean Air Act. These standards apply to passenger cars, light-duty trucks, and medium-duty passenger vehicles, and represent a continued harmonized and consistent National Program. Under the National Program for model years 2017-2025, automobile manufacturers would be able to continue building a single light-duty national fleet that satisfies all requirements under both programs while ensuring that consumers still have a full range of vehicle choices. EPA is also proposing a minor change to the regulations applicable to MY 2012-2016, with respect to air conditioner performance and measurement of nitrous oxides.

  8. New approximate orientation averaging of the water molecule interacting with the thermal neutron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markovic, M.I.; Minic, D.M.; Rakic, A.D.

    1992-02-01

    In exactly describing thermal neutron collisions with water molecules, orientation averaging is performed here by an exact method (EOA_k) and four approximate methods (two well known and two less known). Expressions for the microscopic scattering kernel are developed. The two well-known approximate orientation averaging methods are Krieger-Nelkin (K-N) and Koppel-Young (K-Y). The results obtained by one of the two proposed approximate orientation averaging methods agree best with the corresponding results obtained by EOA_k. The largest discrepancies between the EOA_k results and those of the approximate methods are obtained using the well-known K-N approximate orientation averaging method.

  9. The Status of White Spruce Plantations on Lake States National Forests

    Treesearch

    Glen W. Erickson; H. Michael Rauscher

    1985-01-01

    Summarizes information about white spruce plantations as of 1982. Based on average site index, the Superior National Forest in Minnesota and the Hiawatha and Huron-Manistee in Michigan contain climate-soil-seed source complexes that are, on the average, less productive for white spruce than those on the other National Forests.

  10. Average thermal characteristics of solar wind electrons

    NASA Technical Reports Server (NTRS)

    Montgomery, M. D.

    1972-01-01

    Average solar wind electron properties based on a 1-year Vela 4 data sample, from May 1967 to May 1968, are presented. Frequency distributions of electron-to-ion temperature ratio, electron thermal anisotropy, and thermal energy flux are presented. The resulting evidence concerning heat transport in the solar wind is discussed.

  11. Changes in average length of stay and average charges generated following institution of PSRO review.

    PubMed Central

    Westphal, M; Frazier, E; Miller, M C

    1979-01-01

    A five-year review of accounting data at a university hospital shows that immediately following institution of concurrent PSRO admission and length of stay review of Medicare-Medicaid patients, there was a significant decrease in length of stay and a fall in average charges generated per patient against the inflationary trend. Similar changes did not occur for the non-Medicare-Medicaid patients who were not reviewed. The observed changes occurred even though the review procedure rarely resulted in the denial of services to patients, suggesting an indirect effect of review. PMID:393658

  12. Dustborne Alternaria alternata antigens in U.S. homes: Results from the National Survey of Lead and Allergens in Housing

    PubMed Central

    Salo, Päivi M.; Yin, Ming; Arbes, Samuel J.; Cohn, Richard D.; Sever, Michelle; Muilenberg, Michael; Burge, Harriet A.; London, Stephanie J.; Zeldin, Darryl C.

    2005-01-01

    Background: Alternaria alternata is one of the most common fungi associated with allergic disease. However, Alternaria exposure in indoor environments is not well characterized. Objective: The primary goals of this study were to examine the prevalence of Alternaria exposure and identify independent predictors of Alternaria antigen concentrations in U.S. homes. Methods: Data for this cross-sectional study were obtained from the National Survey of Lead and Allergens in Housing. A nationally representative sample of 831 housing units in 75 different locations throughout the U.S. completed the survey. Information on housing and household characteristics was obtained by questionnaire and environmental assessments. Concentrations of Alternaria antigens in dust collected from various indoor sites were assessed with a polyclonal anti-Alternaria antibody assay. Results: Alternaria antigens were detected in most (95-99%) of the dust samples. The geometric mean concentration, reflecting the average Alternaria concentration in homes, was 4.88 μg/g (SE=0.13 μg/g). In the multivariable linear regression analysis, the age of the housing unit, geographic region, urbanization, poverty, family race, observed mold and moisture problems, use of dehumidifier, and presence of cats and dogs were independent predictors of Alternaria antigen concentrations. Less frequent cleaning and smoking indoors also contributed to higher Alternaria antigen levels in homes. Conclusion: Exposure to Alternaria alternata antigens in U.S. homes is common. Antigen levels in homes are not only influenced by regional factors but also by residential characteristics. Preventing mold and moisture problems, avoiding smoking indoors, and regular household cleaning may help reduce exposure to Alternaria antigens indoors. PMID:16159634
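
    The geometric mean concentration reported above is the exponential of the mean of the log-transformed concentrations. A minimal sketch with made-up concentration values follows; the standard error shown uses a simple delta-method approximation and is only illustrative, since a national survey's own estimates would account for the complex sampling design.

```python
# Minimal sketch of a geometric-mean summary: exponentiate the mean of log-transformed
# concentrations. Values are hypothetical, not the survey data.
import numpy as np

conc = np.array([1.2, 3.5, 5.1, 8.4, 12.0, 4.4])    # Alternaria antigen, ug/g dust (made up)
log_c = np.log(conc)
geo_mean = np.exp(log_c.mean())
geo_se = geo_mean * log_c.std(ddof=1) / np.sqrt(len(conc))  # delta-method approximation
print(round(geo_mean, 2), round(geo_se, 2))
```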

  13. Cosmological ensemble and directional averages of observables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonvin, Camille; Clarkson, Chris; Durrer, Ruth

    We show that at second order, ensemble averages of observables and directional averages do not commute due to gravitational lensing: observing the same thing in many directions over the sky is not the same as taking an ensemble average. In principle this non-commutativity is significant for a variety of quantities that we often use as observables and can lead to a bias in parameter estimation. We derive the relation between the ensemble average and the directional average of an observable, at second order in perturbation theory. We discuss the relevance of these two types of averages for making predictions of cosmological observables, focusing on observables related to distances and magnitudes. In particular, we show that the ensemble average of the distance in a given observed direction is increased by gravitational lensing, whereas the directional average of the distance is decreased. For a generic observable, there exists a particular function of the observable that is not affected by second-order lensing perturbations. We also show that standard areas have an advantage over standard rulers, and we discuss the subtleties involved in averaging in the case of supernova observations.

  14. U.S. Army War College Guide to National Security Issues. Volume 2. National Security Policy and Strategy

    DTIC Science & Technology

    2012-06-01

    1998 National War College paper entitled “U.S. National Security Structure: A New Model for the 21st Century” defines the national security community ...fueled by revolutions in communications and information management, the emergence of a truly global market and world economy, the primacy of economic...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources

  15. Coffee, alcohol, smoking, physical activity and QT interval duration: results from the Third National Health and Nutrition Examination Survey.

    PubMed

    Zhang, Yiyi; Post, Wendy S; Dalal, Darshan; Blasco-Colmenares, Elena; Tomaselli, Gordon F; Guallar, Eliseo

    2011-02-28

    Abnormalities in the electrocardiographic QT interval duration have been associated with an increased risk of ventricular arrhythmias and sudden cardiac death. However, there is substantial uncertainty about the effect of modifiable factors such as coffee intake, cigarette smoking, alcohol consumption, and physical activity on QT interval duration. We studied 7795 men and women from the Third National Health and Nutrition Survey (NHANES III, 1988-1994). Baseline QT interval was measured from the standard 12-lead electrocardiogram. Coffee and tea intake, alcohol consumption, leisure-time physical activities over the past month, and lifetime smoking habits were determined using validated questionnaires during the home interview. In the fully adjusted model, the average differences in QT interval comparing participants drinking ≥6 cups/day to those who did not drink any were -1.2 ms (95% CI -4.4 to 2.0) for coffee, and -2.0 ms (-11.2 to 7.3) for tea, respectively. The average differences in QT interval duration comparing current to never smokers was 1.2 ms (-0.6 to 2.9) while the average difference in QT interval duration comparing participants drinking ≥7 drinks/week to non-drinkers was 1.8 ms (-0.5 to 4.0). The age, race/ethnicity, and RR-interval adjusted differences in average QT interval duration comparing men with binge drinking episodes to non-drinkers or drinkers without binge drinking were 2.8 ms (0.4 to 5.3) and 4.0 ms (1.6 to 6.4), respectively. The corresponding differences in women were 1.1 (-2.9 to 5.2) and 1.7 ms (-2.3 to 5.7). Finally, the average differences in QT interval comparing the highest vs. the lowest categories of total physical activity was -0.8 ms (-3.0 to 1.4). Binge drinking was associated with longer QT interval in men but not in women. QT interval duration was not associated with other modifiable factors including coffee and tea intake, smoking, and physical activity.

  16. Slow progress in changing the school food environment: nationally representative results from public and private elementary schools.

    PubMed

    Turner, Lindsey; Chaloupka, Frank J

    2012-09-01

    Children spend much of their day in school, and authorities have called for improvements in the school food environment. However, it is not known whether changes have occurred since the federal wellness policy mandate took effect in 2006-2007. We examined whether the school food environment in public and private elementary schools changed over time and examined variations by school type and geographic division. Survey data were gathered from respondents at nationally representative samples of elementary schools during the 2006-2007 and 2009-2010 school years (respectively, 578 and 680 public schools, and 259 and 313 private schools). Topics assessed included competitive foods, school meals, and other food-related practices (eg, school gardens and nutrition education). A 16-item food environment summary score was computed, with possible scores ranging from 0 (least healthy) to 100 (healthiest). Multivariate regression models were used to examine changes over time in the total school food environment score and component items, and variations by US census division. Many practices improved, such as participation in school gardens or farm-to-school programs, and availability of whole grains and only lower-fat milks in lunches. Although the school food environment score increased significantly, the magnitude of change was small; as of 2009-2010 the average score was 53.5 for public schools (vs 50.1 in 2006-2007) and 42.2 for private schools (vs 37.2 in 2006-2007). Scores were higher in public schools than in private schools (P<0.001), but did not differ by race/ethnicity or school size. For public schools, scores were higher in the Pacific and West South Central divisions compared with the national average. Changes in the school food environment have been minimal, with much room remaining for improvement. Additional policy changes may be needed to speed the pace of improvement. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights

  17. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Provisions § 89.204 Averaging. (a) Requirements for Tier 1 engines rated at or above 37 kW. A manufacturer... credits obtained through trading. (b) Requirements for Tier 2 and later engines rated at or above 37 kW and Tier 1 and later engines rated under 37 kW. A manufacturer may use averaging to offset an emission...

  18. Beverages consumption in Brazil: results from the first National Dietary Survey

    PubMed Central

    Pereira, Rosangela A; Souza, Amanda M; Duffey, Kiyah J; Sichieri, Rosely; Popkin, Barry M

    2014-01-01

    Objective To provide an overview of beverage consumption patterns using the first nationally representative survey of dietary intake in Brazil. Design Beverage consumption data were obtained by 1-day food records in an individual dietary survey. Setting nationwide cross-sectional survey, 2008–09. Subjects nationally representative sample of individuals ≥10 years (n=34,003). Results Beverages contributed to 17.1% of total energy consumption. Caloric coffee beverages provided the greatest level of energy overall (464 kJ or 111 kcal/d). Individuals from 10 to 18 years old (243 kJ or 58 kcal/d) and from 19 to 39 years old (230 kJ or 55 kcal/d) consumed a higher proportion of energy from sugar sweetened soft drinks than individuals over this age (142 kJ or 34 kcal/d for those 40–59 and 79 kJ or 19 kcal/d for those >60 years old). Conclusions Overall, the contribution of beverages, particularly sugary beverages, to total energy consumption in Brazil represents an important public health challenge and is comparable with that in other countries.

  19. An average/deprivation/inequality (ADI) analysis of chronic disease outcomes and risk factors in Argentina

    PubMed Central

    De Maio, Fernando G; Linetzky, Bruno; Virgolini, Mario

    2009-01-01

    Background Recognition of the global economic and epidemiological burden of chronic non-communicable diseases has increased in recent years. However, much of the research on this issue remains focused on individual-level risk factors and neglects the underlying social patterning of risk factors and disease outcomes. Methods Secondary analysis of Argentina's 2005 Encuesta Nacional de Factores de Riesgo (National Risk Factor Survey, N = 41,392) using a novel analytical strategy first proposed by the United Nations Development Programme (UNDP), which we here refer to as the Average/Deprivation/Inequality (ADI) framework. The analysis focuses on two risk factors (unhealthy diet and obesity) and one related disease outcome (diabetes), a notable health concern in Latin America. Logistic regression is used to examine the interplay between socioeconomic and demographic factors. The ADI analysis then uses the results from the logistic regression to identify the most deprived, the best-off, and the difference between the two ideal types. Results Overall, 19.9% of the sample reported being in poor/fair health, 35.3% reported not eating any fruits or vegetables in five days of the week preceding the interview, 14.7% had a BMI of 30 or greater, and 8.5% indicated that a health professional had told them that they have diabetes or high blood pressure. However, significant variation is hidden by these summary measures. Educational attainment displayed the strongest explanatory power throughout the models, followed by household income, with both factors highlighting the social patterning of risk factors and disease outcomes. As educational attainment and household income increase, the probability of poor health, unhealthy diet, obesity, and diabetes decrease. The analyses also point toward important provincial effects and reinforce the notion that both compositional factors (i.e., characteristics of individuals) and contextual factors (i.e., characteristics of places) are

  20. Achievements and Challenges upon the Implementation of a Program for National Control of Congenital Chagas in Bolivia: Results 2004–2009

    PubMed Central

    Alonso-Vega, Cristina; Billot, Claire; Torrico, Faustino

    2013-01-01

    Bolivia is one of the most endemic countries for Chagas disease. Data from 2005 show an incidence of around 1.09‰ inhabitants, and seroprevalence in children under 15 ranged from 10% in urban areas to 40% in rural areas. In this article, we report results obtained during the implementation of the congenital Chagas program, one of the largest case series in congenital Chagas disease, led by the National Program of Chagas and the Belgian cooperation from 2004 to 2009. The program strategy was based on serological results during pregnancy and on the follow-up of children born to positive mothers until one year old; if positive, treatment was given with Benznidazole, 10 mg/kg/day for 30 days, with one post-treatment control 6 months later. Throughout the length of the program, a total of 318,479 pregnant women were screened and 23.31% were detected positive. 42,538 children born to positive mothers were analyzed at birth by micromethod, of which 1.43% read positive. 10,120 children returned for their second micromethod control, of which 2.29% read positive; 7,650 children returned for the serological control, of which 3.32% turned out positive. Of the 1,093 positive children, 70% completed the 30-day treatment and 122 returned for post-treatment control, with 96% showing a negative result. It has been seen that maternal-fetal transmission rates vary between 2% and 4%, with an average of 2.6% (about half of previously reported studies that reached 5%). In this work, we show that it is possible to implement, with limited resources, a National Congenital Chagas Program and to integrate it into the Bolivian health system. Keys to success are population awareness, health personnel motivation, and political commitment at all levels.

  1. Our nation's highways 2000

    DOT National Transportation Integrated Search

    2002-04-29

    The information in this publication provides a condensed overview of facts and figures about the Nation's Highways. This publication is designed to be of interest to the average citizen. The Federal Highway Administration (FHWA) is the source of the ...

  2. State survey of silviculture nonpoint source programs: a comparison of the 2000 northeastern and national results

    Treesearch

    Pamela J. Edwards; Gordon W. Stuart

    2002-01-01

    The National Association of State Foresters conducts surveys of silviculture nonpoint source (NPS) pollution control programs to measure progress and identify needs. The 2000 survey results are summarized here for the nation and for the 20-state northeastern region. Current emphasis of NPS pollution programs is on education, training, and monitoring. Educational...

  3. Common Core Math in the K-8 Classroom: Results from a National Teacher Survey

    ERIC Educational Resources Information Center

    Bay-Williams, Jennifer

    2016-01-01

    Successful implementation of the Common Core State Standards for Mathematics (CCSS-M) should result in noticeable differences in primary and middle school math classrooms across the United States. "Common Core Math in the K-8 Classroom: Results from a National Teacher Survey" takes a close look at how educators are implementing the…

  4. Disk-averaged synthetic spectra of Mars

    NASA Technical Reports Server (NTRS)

    Tinetti, Giovanna; Meadows, Victoria S.; Crisp, David; Fong, William; Velusamy, Thangasamy; Snively, Heather

    2005-01-01

    The principal goal of the NASA Terrestrial Planet Finder (TPF) and European Space Agency's Darwin mission concepts is to directly detect and characterize extrasolar terrestrial (Earth-sized) planets. This first generation of instruments is expected to provide disk-averaged spectra with modest spectral resolution and signal-to-noise. Here we use a spatially and spectrally resolved model of a Mars-like planet to study the detectability of a planet's surface and atmospheric properties from disk-averaged spectra. We explore the detectability as a function of spectral resolution and wavelength range, for both the proposed visible coronagraph (TPF-C) and mid-infrared interferometer (TPF-I/Darwin) architectures. At the core of our model is a spectrum-resolving (line-by-line) atmospheric/surface radiative transfer model. This model uses observational data as input to generate a database of spatially resolved synthetic spectra for a range of illumination conditions and viewing geometries. The model was validated against spectra recorded by the Mars Global Surveyor-Thermal Emission Spectrometer and the Mariner 9-Infrared Interferometer Spectrometer. Results presented here include disk-averaged synthetic spectra, light curves, and the spectral variability at visible and mid-infrared wavelengths for Mars as a function of viewing angle, illumination, and season. We also considered the differences in the spectral appearance of an increasingly ice-covered Mars, as a function of spectral resolution, signal-to-noise and integration time for both TPF-C and TPF-I/Darwin.

  5. Disk-averaged synthetic spectra of Mars.

    PubMed

    Tinetti, Giovanna; Meadows, Victoria S; Crisp, David; Fong, William; Velusamy, Thangasamy; Snively, Heather

    2005-08-01

    The principal goal of the NASA Terrestrial Planet Finder (TPF) and European Space Agency's Darwin mission concepts is to directly detect and characterize extrasolar terrestrial (Earth-sized) planets. This first generation of instruments is expected to provide disk-averaged spectra with modest spectral resolution and signal-to-noise. Here we use a spatially and spectrally resolved model of a Mars-like planet to study the detectability of a planet's surface and atmospheric properties from disk-averaged spectra. We explore the detectability as a function of spectral resolution and wavelength range, for both the proposed visible coronagraph (TPF-C) and mid-infrared interferometer (TPF-I/Darwin) architectures. At the core of our model is a spectrum-resolving (line-by-line) atmospheric/surface radiative transfer model. This model uses observational data as input to generate a database of spatially resolved synthetic spectra for a range of illumination conditions and viewing geometries. The model was validated against spectra recorded by the Mars Global Surveyor-Thermal Emission Spectrometer and the Mariner 9-Infrared Interferometer Spectrometer. Results presented here include disk-averaged synthetic spectra, light curves, and the spectral variability at visible and mid-infrared wavelengths for Mars as a function of viewing angle, illumination, and season. We also considered the differences in the spectral appearance of an increasingly ice-covered Mars, as a function of spectral resolution, signal-to-noise and integration time for both TPF-C and TPF-I/Darwin.

  6. Calculating ensemble averaged descriptions of protein rigidity without sampling.

    PubMed

    González, Luis C; Wang, Hui; Livesay, Dennis R; Jacobs, Donald J

    2012-01-01

    Previous works have demonstrated that protein rigidity is related to thermodynamic stability, especially under conditions that favor formation of native structure. Mechanical network rigidity properties of a single conformation are efficiently calculated using the integer body-bar Pebble Game (PG) algorithm. However, thermodynamic properties require averaging over many samples from the ensemble of accessible conformations to accurately account for fluctuations in network topology. We have developed a mean field Virtual Pebble Game (VPG) that represents the ensemble of networks by a single effective network. That is, all possible number of distance constraints (or bars) that can form between a pair of rigid bodies is replaced by the average number. The resulting effective network is viewed as having weighted edges, where the weight of an edge quantifies its capacity to absorb degrees of freedom. The VPG is interpreted as a flow problem on this effective network, which eliminates the need to sample. Across a nonredundant dataset of 272 protein structures, we apply the VPG to proteins for the first time. Our results show numerically and visually that the rigidity characterizations of the VPG accurately reflect the ensemble averaged [Formula: see text] properties. This result positions the VPG as an efficient alternative to understand the mechanical role that chemical interactions play in maintaining protein stability.

  7. Instrument to average 100 data sets

    NASA Technical Reports Server (NTRS)

    Tuma, G. B.; Birchenough, A. G.; Rice, W. J.

    1977-01-01

    An instrumentation system is currently under development which will measure many of the important parameters associated with the operation of an internal combustion engine. Some of these parameters include mass-fraction burn rate, ignition energy, and the indicated mean effective pressure. One of the characteristics of an internal combustion engine is the cycle-to-cycle variation of these parameters. A curve-averaging instrument has been produced which will generate the average curve, over 100 cycles, of any engine parameter. The average curve is described by 2048 discrete points which are displayed on an oscilloscope screen to facilitate recording and is available in real time. Input can be any parameter which is expressed as a ±10-volt signal. Operation of the curve-averaging instrument is defined between 100 and 6000 rpm. Provisions have also been made for averaging as many as four parameters simultaneously, with a subsequent decrease in resolution. This provides the means to correlate and perhaps interrelate the phenomena occurring in an internal combustion engine. This instrument has been used successfully on a 1975 Chevrolet V8 engine, and on a Continental 6-cylinder aircraft engine. While this instrument was designed for use on an internal combustion engine, with some modification it can be used to average any cyclically varying waveform.
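
    In software terms, the averaging performed by the instrument corresponds to resampling each cycle onto a fixed grid of 2048 points and averaging point-by-point across 100 cycles. The sketch below does this for a synthetic cyclic signal; it illustrates the operation only and is not a model of the hardware described above.

```python
# Minimal sketch of cycle averaging: 100 cycles, each sampled at 2048 points,
# averaged point-by-point to form a single average curve. Signal is synthetic.
import numpy as np

n_cycles, n_points = 100, 2048
rng = np.random.default_rng(3)
theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)

# Simulate cycle-to-cycle variation of a pressure-like trace over one engine cycle.
cycles = np.array([
    (1.0 + 0.1 * rng.normal()) * np.exp(-((theta - np.pi) / 0.8) ** 2)
    for _ in range(n_cycles)
])

average_curve = cycles.mean(axis=0)   # 2048-point average curve over 100 cycles
print(average_curve.shape)
```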

  8. Average composition of the tonalite-trondhjemite-granodiorite association: Possibilities of application

    NASA Astrophysics Data System (ADS)

    Chekulaev, V. P.; Glebovitsky, V. A.

    2017-01-01

    The possibilities of using the average compositions of rocks of the tonalite-trondhjemite-granodiorite (TTG) association, which make up a significant part of the Archaean continental crust, have been examined. Average TTG compositions obtained by other researchers are presented, together with the authors' data on average TTG compositions from the Baltic and Ukrainian shields and from the Archaean crust as a whole. It is shown that the average compositions of Archaean TTG from large crustal fragments (cratons or provinces) carry practically no information on their sources or the conditions of their formation. It is demonstrated, however, that such information can be obtained by analyzing the average compositions of TTG composing smaller fragments of the crust, as exemplified by rocks of the Karelian subprovinces of the Baltic Shield.

  9. A virtual pebble game to ensemble average graph rigidity.

    PubMed

    González, Luis C; Wang, Hui; Livesay, Dennis R; Jacobs, Donald J

    2015-01-01

    The body-bar Pebble Game (PG) algorithm is commonly used to calculate network rigidity properties in proteins and polymeric materials. To account for fluctuating interactions such as hydrogen bonds, an ensemble of constraint topologies are sampled, and average network properties are obtained by averaging PG characterizations. At a simpler level of sophistication, Maxwell constraint counting (MCC) provides a rigorous lower bound for the number of internal degrees of freedom (DOF) within a body-bar network, and it is commonly employed to test if a molecular structure is globally under-constrained or over-constrained. MCC is a mean field approximation (MFA) that ignores spatial fluctuations of distance constraints by replacing the actual molecular structure by an effective medium that has distance constraints globally distributed with perfect uniform density. The Virtual Pebble Game (VPG) algorithm is a MFA that retains spatial inhomogeneity in the density of constraints on all length scales. Network fluctuations due to distance constraints that may be present or absent based on binary random dynamic variables are suppressed by replacing all possible constraint topology realizations with the probabilities that distance constraints are present. The VPG algorithm is isomorphic to the PG algorithm, where integers for counting "pebbles" placed on vertices or edges in the PG map to real numbers representing the probability to find a pebble. In the VPG, edges are assigned pebble capacities, and pebble movements become a continuous flow of probability within the network. Comparisons between the VPG and average PG results over a test set of proteins and disordered lattices demonstrate the VPG quantitatively estimates the ensemble average PG results well. The VPG performs about 20% faster than one PG, and it provides a pragmatic alternative to averaging PG rigidity characteristics over an ensemble of constraint topologies. The utility of the VPG falls in between the most
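
    Maxwell constraint counting, the mean-field baseline that the VPG refines, reduces to simple arithmetic in the body-bar picture: six degrees of freedom per rigid body, at most one removed per bar, minus the six trivial rigid-body motions. A minimal sketch with hypothetical counts follows.

```python
# Minimal sketch of Maxwell constraint counting (MCC) for a body-bar network:
# 6 DOF per rigid body, each bar removes at most one DOF, minus 6 trivial motions.
# The body and bar counts below are hypothetical.
def maxwell_internal_dof(n_bodies: int, n_bars: int) -> int:
    """Lower bound on the internal degrees of freedom of a body-bar network."""
    return max(0, 6 * n_bodies - n_bars - 6)

dof = maxwell_internal_dof(50, 270)   # e.g. 50 bodies, 270 bars
print("under-constrained" if dof > 0 else "iso/over-constrained", dof)
```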

  10. Social Work Roles and Activities Regarding Psychiatric Medication: Results of a National Survey

    ERIC Educational Resources Information Center

    Bentley, Kia J.; Walsh, Joseph; Farmer, Rosemary L.

    2005-01-01

    This article reports the findings of a 2001 national survey of social workers regarding their everyday practice roles and activities regarding psychiatric medication. The results of this quantitative study indicate variability in the types of roles carried out by social workers with regard to psychiatric medication, but that perceptions of…

  11. Thermodynamically Constrained Averaging Theory (TCAT) Two-Phase Flow Model: Derivation, Closure, and Simulation Results

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Miller, C. T.; Dye, A. L.; Gray, W. G.; McClure, J. E.; Rybak, I.

    2015-12-01

    The thermodynamically constrained averaging theory (TCAT) has been used to formulate general classes of porous medium models, including new models for two-fluid-phase flow. The TCAT approach provides advantages that include a firm connection between the microscale, or pore scale, and the macroscale; a thermodynamically consistent basis; explicit inclusion of factors such as interfacial areas, contact angles, interfacial tension, and curvatures; and dynamics of interface movement and relaxation to an equilibrium state. In order to render the TCAT model solvable, certain closure relations are needed to relate fluid pressure, interfacial areas, curvatures, and relaxation rates. In this work, we formulate and solve a TCAT-based two-fluid-phase flow model. We detail the formulation of the model, which is a specific instance from a hierarchy of two-fluid-phase flow models that emerge from the theory. We show the closure problem that must be solved. Using recent results from high-resolution microscale simulations, we advance a set of closure relations that produce a closed model. Lastly, we solve the model using a locally conservative numerical scheme and compare the TCAT model to the traditional model.

  12. Time averaging, ageing and delay analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
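
    The time averaged MSD referred to above is, for a single trajectory, the average of squared increments at a fixed lag taken along the series. The sketch below computes it for a geometric Brownian motion surrogate rather than actual Dow Jones data; the drift, volatility, and lags are illustrative only.

```python
# Minimal sketch of the time-averaged mean squared displacement (MSD) for one trajectory,
# here a geometric Brownian motion surrogate for a price series (not real market data).
import numpy as np

def time_averaged_msd(x, lag):
    """delta^2(lag) = mean over t of (x[t+lag] - x[t])^2 along a single trajectory."""
    diffs = x[lag:] - x[:-lag]
    return np.mean(diffs ** 2)

rng = np.random.default_rng(4)
n, dt, mu, sigma = 5000, 1.0, 1e-4, 0.01
increments = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
price = 100.0 * np.exp(np.cumsum(increments))        # geometric Brownian motion path

lags = [1, 10, 100, 1000]
print([round(time_averaged_msd(np.log(price), lag), 6) for lag in lags])
```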

  13. Water Triple-Point Comparisons: Plateau Averaging or Peak Value?

    NASA Astrophysics Data System (ADS)

    Steur, P. P. M.; Dematteis, R.

    2014-04-01

    With a certain regularity, national metrology institutes conduct comparisons of water triple-point (WTP) cells. The WTP is the most important fixed point for the International Temperature Scale of 1990 (ITS-90). In such comparisons, it is common practice to simply average all the single measured temperature points obtained on a single ice mantle. This practice is quite reasonable whenever the measurements show no time dependence in the results. Ever since the first Supplementary Information for the International Temperature Scale of 1990, published by the Bureau International des Poids et Mesures in Sèvres, it was strongly suggested to wait at least 1 day before taking measurements (now up to 10 days), in order for a newly created ice mantle to stabilize. This stabilization is accompanied by a change in temperature with time. A recent improvement in the sensitivity of resistance measurement enabled the Istituto Nazionale di Ricerca Metrologica to detect more clearly the (possible) change in temperature with time of the WTP on a single ice mantle, as for old borosilicate cells. A limited investigation was performed where the temperature of two cells was monitored day-by-day, from the moment of mantle creation, where it was found that with (old) borosilicate cells it may be counterproductive to wait the usual week before starting measurements. The results are presented and discussed, and it is suggested to adapt the standard procedure for comparisons of WTP cells allowing for a different data treatment with (old) borosilicate cells, because taking the temperature dependence into account will surely reduce the reported differences between cells.
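
    The contrast at issue, simple plateau averaging versus accounting for the day-by-day temperature dependence, can be illustrated with a few lines of Python; the readings below are invented for illustration and are not INRiM data.

```python
import numpy as np

# Hypothetical day-by-day WTP cell readings (mK offsets from a reference),
# showing a slow drift as the ice mantle relaxes.  Values are made up.
days = np.arange(1, 11)
temp_mk = np.array([0.060, 0.052, 0.047, 0.043, 0.041,
                    0.040, 0.039, 0.039, 0.038, 0.038])

plain_average = temp_mk.mean()                 # "plateau averaging"
slope, intercept = np.polyfit(days, temp_mk, 1)  # linear drift with time

print(f"plain average over the mantle: {plain_average:.4f} mK")
print(f"fitted drift: {slope:+.4f} mK/day")
```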

  14. Health behaviors among people with epilepsy—Results from the 2010 National Health Interview Survey☆

    PubMed Central

    Cui, Wanjun; Zack, Matthew M.; Kobau, Rosemarie; Helmers, Sandra L.

    2015-01-01

    Objectives This study aimed to estimate and compare the prevalence of selected health behavior—alcohol use, cigarette smoking, physical activity, and sufficient sleep—between people with and without a history of epilepsy in a large, nationally representative sample in the United States. Methods We used data from the 2010 cross-sectional National Health Interview Survey (NHIS) to compare the prevalence of each health behavior for people with and without epilepsy while adjusting for sex, age, race/ethnicity, and family income. We also further categorized those with epilepsy into active epilepsy and inactive epilepsy and calculated their corresponding prevalences. Results The percentages of adults with a history of epilepsy (50.1%, 95% CI = 45.1%–55.2%) and with active epilepsy (44.4%, 95% CI = 37.6%–51.5%) who were current alcohol drinkers were significantly lower than that of those without epilepsy (65.1%, 95% CI = 64.2%–66.0%). About 21.8% (95% CI = 18.1%–25.9%) of adults with epilepsy and 19.3% (95% CI = 18.7%–19.9%) of adults without epilepsy were current smokers. Adults with active epilepsy were significantly less likely than adults without epilepsy to report following recommended physical activity guidelines for Americans (35.2%, 95% CI = 28.8%–42.1% vs. 46.3%, 95% CI = 45.4%–47.2%) and to report walking for at least ten minutes during the seven days prior to being surveyed (39.6%, 95% CI = 32.3%–47.4% vs. 50.8%, 95% CI = 49.9%–51.7%). The percentage of individuals with active epilepsy (49.8%, 95% CI = 42.0%–57.7%) who reported sleeping an average of 7 or 8 h a day was significantly lower than that of those without epilepsy (61.9%, 95% CI = 61.2%–62.7%). Conclusions Because adults with epilepsy are significantly less likely than adults without epilepsy to engage in recommended levels of physical activity and to get the encouraged amount of sleep for optimal health and well-being, promoting more safe physical activity and improved sleep

  15. Citations of Brazilian physical therapy journals in national publications

    PubMed Central

    Teixeira, Renan K. C.; Yamaki, Vitor N.; Botelho, Nara M.; Teixeira, Renato C.

    2014-01-01

    Background Citations in Brazilian journals are mainly obtained from national articles (articles from Brazilian journals); thus, it is essential to determine how frequently these articles reference Brazilian journals. Objective This study sought to verify how frequently national papers are cited in the references of three Brazilian physical therapy journals. Method All references for articles published in Fisioterapia em Movimento, Fisioterapia e Pesquisa and Revista Brasileira de Fisioterapia between 2010 and 2012 were evaluated. In particular, the numbers of national articles and international articles (articles from international journals) cited in these references were determined. Results A total of 13,009 references cited by 456 articles were analyzed, and 2,924 (22.47%) of the cited works were national articles. There were no significant differences among the three examined years. A total of 36 (7.89%) articles did not cite national articles, whereas 65 (13.25%) articles cited more national articles than international articles. Conclusion On average, 22.47% of the works cited by the evaluated articles were national articles. No significant differences were detected among the three analyzed years. PMID:24675917

  16. Systematic review of colorectal cancer screening guidelines for average-risk adults: Summarizing the current global recommendations.

    PubMed

    Bénard, Florence; Barkun, Alan N; Martel, Myriam; von Renteln, Daniel

    2018-01-07

    To summarize and compare worldwide colorectal cancer (CRC) screening recommendations in order to identify similarities and disparities. A systematic literature search was performed using MEDLINE, EMBASE, Scopus, CENTRAL and ISI Web of Knowledge identifying all average-risk CRC screening guideline publications within the last ten years and/or position statements published in the last 2 years. In addition, a hand-search of the webpages of National Gastroenterology Society websites, the National Guideline Clearinghouse, the BMJ Clinical Evidence website, Google and Google Scholar was performed. Fifteen guidelines were identified. Six guidelines were published in North America, four in Europe, four in Asia and one from the World Gastroenterology Organization. The majority of guidelines recommend screening average-risk individuals between ages 50 and 75 using colonoscopy (every 10 years), or flexible sigmoidoscopy (FS, every 5 years) or fecal occult blood test (FOBT, mainly the Fecal Immunochemical Test, annually or biennially). Disparities throughout the different guidelines are found relating to the use of colonoscopy, rank order between tests, screening intervals and optimal age ranges for screening. Average-risk individuals between 50 and 75 years should undergo CRC screening. Recommendations for optimal surveillance intervals, preferred tests/test cascade as well as the optimal timing when to start and stop screening differ regionally and should be considered for clinical decision making. Furthermore, local resource availability and patient preferences are important to increase CRC screening uptake, as any screening is better than none.

  17. 40 CFR 76.11 - Emissions averaging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.11 Emissions averaging. (a) General... averaging plan is in compliance with the Acid Rain emission limitation for NOX under the plan only if the...

  18. 40 CFR 76.11 - Emissions averaging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.11 Emissions averaging. (a) General... averaging plan is in compliance with the Acid Rain emission limitation for NOX under the plan only if the...

  19. Results from the 2006 National Survey on Drug Use and Health: National Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2007

    2007-01-01

    This updated report from Substance Abuse and Mental Health Services Administration's (SAMHSA's) Office of Applied Studies presents the first information from the 2006 National Survey on Drug Use and Health (NSDUH) and is the primary source of information on the prevalence, patterns, and consequences of alcohol, tobacco, and illegal drug use and…

  20. Results From the 2014 National Wilderness Manager Survey

    Treesearch

    Ramesh Ghimire; Ken Cordell; Alan Watson; Chad Dawson; Gary T. Green

    2015-01-01

    A national survey of managers was developed to support interagency wilderness strategic planning. The focus was on major challenges, perceived needs for science and training, and accomplishments of 1995 Strategic Plan objectives. The survey was administered to managers at the four federal agencies with wilderness management responsibilities: the Bureau of Land...

  1. Education Resources, Results Vary Widely among 20 Nations.

    ERIC Educational Resources Information Center

    Brett, Patricia

    1992-01-01

    A recent compilation of educational indicators from 20 industrialized countries in North America, the Pacific, Europe, and Scandinavia shows rates and trends in college enrollment and graduation, education-related earning power for men and women, and public spending on education. Educational attainment rates are charted for the 20 nations. (MSE)

  2. High average power pockels cell

    DOEpatents

    Daly, Thomas P.

    1991-01-01

    A high average power Pockels cell is disclosed which reduces the effect of thermally induced strains in high average power laser technology. The Pockels cell includes an elongated, substantially rectangular crystalline structure formed from a KDP-type material to eliminate shear strains. The X- and Y-axes are oriented substantially perpendicular to the edges of the crystal cross-section and to the C-axis direction of propagation to eliminate shear strains.

  3. Teacher Salary and National Achievement: A Cross-National Analysis of 30 Countries

    ERIC Educational Resources Information Center

    Akiba, Motoko; Chiu, Yu-Lun; Shimizu, Kazuhiko; Liang, Guodong

    2012-01-01

    Using national teacher salary data from the Organisation for Economic Co-operation and Development (OECD) and student achievement data from the Programme for International Student Assessment (PISA), this study compared secondary school teacher salary in 30 countries and examined the relationship between average teacher salary and national…

  4. Examining national trends in worker health with the National Health Interview Survey.

    PubMed

    Luckhaupt, Sara E; Sestito, John P

    2013-12-01

    To describe data from the National Health Interview Survey (NHIS), both the annual core survey and periodic occupational health supplements (OHSs), available for examining national trends in worker health. The NHIS is an annual in-person household survey with a cross-sectional multistage clustered sample design to produce nationally representative health data. The 2010 NHIS included an OHS. Prevalence rates of various health conditions and health behaviors among workers based on multiple years of NHIS core data are available. In addition, the 2010 NHIS-OHS data provide prevalence rates of selected health conditions, work organization factors, and occupational exposures among US workers by industry and occupation. The publicly available NHIS data can be used to identify areas of concern for various industries and for benchmarking data from specific worker groups against national averages.

  5. Your Average Nigga

    ERIC Educational Resources Information Center

    Young, Vershawn Ashanti

    2004-01-01

    "Your Average Nigga" contends that just as exaggerating the differences between black and white language leaves some black speakers, especially those from the ghetto, at an impasse, so exaggerating and reifying the differences between the races leaves blacks in the impossible position of either having to try to be white or forever struggling to…

  6. Average Weighted Receiving Time of Weighted Tetrahedron Koch Networks

    NASA Astrophysics Data System (ADS)

    Dai, Meifeng; Zhang, Danping; Ye, Dandan; Zhang, Cheng; Li, Lei

    2015-07-01

    We introduce weighted tetrahedron Koch networks with infinite weight factors, which are a generalization of the finite ones. The notion of weighted time is first defined in this work. The mean weighted first-passage time (MWFPT) and the average weighted receiving time (AWRT) are then defined in terms of weighted time. We study the AWRT for a weight-dependent walk. Results show that the AWRT for a nontrivial weight factor sequence grows sublinearly with the network order. To investigate the reason for this sublinearity, the average receiving time (ART) is discussed for four cases.

  7. 152 W average power Tm-doped fiber CPA system.

    PubMed

    Stutzki, Fabian; Gaida, Christian; Gebhardt, Martin; Jansen, Florian; Wienke, Andreas; Zeitner, Uwe; Fuchs, Frank; Jauregui, Cesar; Wandt, Dieter; Kracht, Dietmar; Limpert, Jens; Tünnermann, Andreas

    2014-08-15

    A high-power thulium (Tm)-doped fiber chirped-pulse amplification system emitting a record compressed average output power of 152 W and 4 MW peak power is demonstrated. This result is enabled by utilizing Tm-doped photonic crystal fibers with mode-field diameters of 35 μm, which mitigate detrimental nonlinearities, exhibit slope efficiencies of more than 50%, and allow for reaching a pump-power-limited average output power of 241 W. The high-compression efficiency has been achieved by using multilayer dielectric gratings with diffraction efficiencies higher than 98%.

  8. Fraternities, Sororities and Binge Drinking: Results from a National Study of American Colleges

    ERIC Educational Resources Information Center

    Wechsler, Henry; Kuh, George; Davenport, Andrea E.

    2009-01-01

    The purpose of this study is to compare the drinking and associated behavior of fraternity and sorority members to that of nonmembers. The project is national in scope, and its results can help determine whether public perceptions of alcohol use by students affiliated with Greek social organizations are warranted. This study will also lend some…

  9. Averages of B-Hadron, C-Hadron, and tau-lepton properties as of early 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amhis, Y.; et al.

    2012-07-01

    This article reports world averages of measurements of b-hadron, c-hadron, and tau-lepton properties obtained by the Heavy Flavor Averaging Group (HFAG) using results available through the end of 2011. In some cases results available in the early part of 2012 are included. For the averaging, common input parameters used in the various analyses are adjusted (rescaled) to common values, and known correlations are taken into account. The averages include branching fractions, lifetimes, neutral meson mixing parameters, CP violation parameters, parameters of semileptonic decays and CKM matrix elements.
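
    A standard way to combine correlated measurements of a single quantity, of the kind alluded to here, is a best-linear-unbiased (inverse-covariance weighted) average. The sketch below shows that generic combination only; the values and covariance matrix are invented and are not HFAG inputs.

```python
import numpy as np

def combine_measurements(values: np.ndarray, cov: np.ndarray):
    """Best linear unbiased combination of correlated measurements of one
    quantity: weights w = C^-1 1 / (1^T C^-1 1)."""
    ones = np.ones_like(values)
    cinv = np.linalg.inv(cov)
    weights = cinv @ ones / (ones @ cinv @ ones)
    average = weights @ values
    variance = 1.0 / (ones @ cinv @ ones)
    return average, np.sqrt(variance)

# Three hypothetical branching-fraction measurements (in %) with a common
# systematic that correlates the last two.
vals = np.array([10.2, 10.6, 10.4])
cov = np.array([[0.09, 0.00, 0.00],
                [0.00, 0.16, 0.06],
                [0.00, 0.06, 0.12]])
avg, err = combine_measurements(vals, cov)
print(f"{avg:.3f} +/- {err:.3f}")
```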

  10. The random coding bound is tight for the average code.

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.

    1973-01-01

    The random coding bound of information theory provides a well-known upper bound to the probability of decoding error for the best code of a given rate and block length. The bound is constructed by upperbounding the average error probability over an ensemble of codes. The bound is known to give the correct exponential dependence of error probability on block length for transmission rates above the critical rate, but it gives an incorrect exponential dependence at rates below a second lower critical rate. Here we derive an asymptotic expression for the average error probability over the ensemble of codes used in the random coding bound. The result shows that the weakness of the random coding bound at rates below the second critical rate is due not to upperbounding the ensemble average, but rather to the fact that the best codes are much better than the average at low rates.

  11. Applying the concept of Independent Applicability to results from the National Aquatic Resource Surveys

    EPA Science Inventory

    The assessments resulting from the National Aquatic Resource Surveys have taken the tack of basing estimates of resource condition on the biological indicators of quality. The physical habitat, chemical, hydrological, and watershed indicators are used to evaluate the relative ra...

  12. Turbine Engine Flowpath Averaging Techniques

    DTIC Science & Technology

    1980-10-01

    Turbine Engine Flowpath Averaging Techniques. T. W. Skiles, ARO, Inc., October 1980. Final report for the period January 1980 to October 1980. Techniques for determining an average flow property for gas turbine engines were investigated; the investigation consisted of a literature review and a review of current turbine engine flowpath…

  13. Estimating 1970-99 average annual groundwater recharge in Wisconsin using streamflow data

    USGS Publications Warehouse

    Gebert, Warren A.; Walker, John F.; Kennedy, James L.

    2011-01-01

    Average annual recharge in Wisconsin for the period 1970-99 was estimated using streamflow data from U.S. Geological Survey continuous-record streamflow-gaging stations and partial-record sites. Partial-record sites have discharge measurements collected during low-flow conditions. The average annual base flow of a stream divided by the drainage area is a good approximation of the recharge rate; therefore, once average annual base flow is determined recharge can be calculated. Estimates of recharge for nearly 72 percent of the surface area of the State are provided. The results illustrate substantial spatial variability of recharge across the State, ranging from less than 1 inch to more than 12 inches per year. The average basin size for partial-record sites (50 square miles) was less than the average basin size for the gaging stations (305 square miles). Including results for smaller basins reveals a spatial variability that otherwise would be smoothed out using only estimates for larger basins. An error analysis indicates that the techniques used provide base flow estimates with standard errors ranging from 5.4 to 14 percent.
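
    The approximation described, recharge estimated as average annual base flow divided by drainage area, reduces to a unit conversion. The sketch below uses illustrative numbers, not values from the Wisconsin study.

```python
# Recharge approximated as average annual base flow divided by drainage area,
# converted to inches per year.  Numbers below are illustrative only.

CFS_TO_CUBIC_FT_PER_YEAR = 60 * 60 * 24 * 365.25   # 1 ft3/s sustained for a year
SQ_MILE_TO_SQ_FT = 5280.0 ** 2
FT_TO_INCHES = 12.0

def recharge_inches_per_year(avg_base_flow_cfs: float, drainage_area_sq_mi: float) -> float:
    volume_ft3 = avg_base_flow_cfs * CFS_TO_CUBIC_FT_PER_YEAR
    depth_ft = volume_ft3 / (drainage_area_sq_mi * SQ_MILE_TO_SQ_FT)
    return depth_ft * FT_TO_INCHES

# A 50 mi^2 basin with an average annual base flow of 30 ft3/s:
print(round(recharge_inches_per_year(30.0, 50.0), 1), "in/yr")   # about 8 in/yr
```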

  14. Stroke units: research and reality. Results from the National Sentinel Audit of Stroke

    PubMed Central

    Rudd, A; Hoffman, A; Irwin, P; Pearson, M; Lowe, D; on behalf of…

    2005-01-01

    Objectives: To use data from the 2001–2 National Stroke Audit to describe the organisation of stroke units in England, Wales and Northern Ireland, and to see if key characteristics deemed effective from the research literature were present. Design: Data were collected as part of the National Sentinel Audit of Stroke in 2001, both on the organisation and structure of inpatient stroke care and the process of care to hospitals managing stroke patients. Setting: 240 hospitals from England, Wales and Northern Ireland took part in the 2001–2 National Stroke Audit, a response rate of over 95%. These sites audited a total of 8200 patients. Audit tool: Royal College of Physicians Intercollegiate Working Party Stroke Audit Tool. Results: 73% of hospitals participating in the audit had a stroke unit but only 36% of stroke admissions spent any time on one. Only 46% of all units describing themselves as stroke units had all five organisational characteristics that previous research literature had identified as being key features, while 26% had four and 28% had three or less. Better organisation was associated with better process of care for patients, with patients managed on stroke units receiving better care than those managed in other settings. Conclusion: The National Service Framework for Older People set a target for all hospitals treating stroke patients to have a stroke unit by April 2004. This study suggests that in many hospitals this is being achieved without adequate resource and expertise. PMID:15691997

  15. Applicability of Time-Averaged Holography for Micro-Electro-Mechanical System Performing Non-Linear Oscillations

    PubMed Central

    Palevicius, Paulius; Ragulskis, Minvydas; Palevicius, Arvydas; Ostasevicius, Vytautas

    2014-01-01

    Optical investigation of movable microsystem components using time-averaged holography is presented in this paper. It is shown that even a harmonic excitation of a non-linear microsystem may result in unpredictable chaotic motion. Analytical relationships between the parameters of the chaotic oscillations and the formation of time-averaged fringes provide deeper insight into the computational and experimental interpretation of time-averaged MEMS holograms. PMID:24451467

  16. Comparative Analysis of the Mathematics Problems Given at International Tests and at the Romanian National Tests

    ERIC Educational Resources Information Center

    Marchis, Iuliana

    2009-01-01

    The results of the Romanian pupils on international tests PISA and TIMSS in Mathematics are below the average. These poor results have many explications. In this article we compare the Mathematics problems given on these international tests with those given on national tests in Romania.

  17. Delayed and Unreported Drug-Susceptibility Testing Results in the US National Tuberculosis Surveillance System, 1993-2014.

    PubMed

    Jones, Jefferson Michael; Armstrong, Lori R

    Drug-susceptibility testing (DST) of Mycobacterium tuberculosis is necessary for identifying drug-resistant tuberculosis, administering effective treatment regimens, and preventing the spread of drug-resistant tuberculosis. DST is recommended for all culture-confirmed cases of tuberculosis. We examined trends in delayed and unreported DST results in the Centers for Disease Control and Prevention's National Tuberculosis Surveillance System. We analyzed culture-confirmed tuberculosis cases reported to the National Tuberculosis Surveillance System during 1993-2014 for annual trends in initial DST reporting for first-line antituberculosis drugs and trends in on-time, delayed, and unreported results. We defined on-time reporting as DST results received during the same calendar year in which the patient's case was reported or ≤4 months after the calendar year ended and delayed reporting as DST results received after the calendar year. We compared cases with on-time, delayed, and unreported DST results by patient and tuberculosis program characteristics. The proportion of cases with reported results for all first-line antituberculosis drugs increased during 1993-2011. Reporting of pyrazinamide results was lower than reporting of other drugs. However, during 2000-2012, of 134 787 tuberculosis cases reported to the National Tuberculosis Surveillance System, reporting was on time for 125 855 (93.4%) cases, delayed for 5332 (4.0%) cases, and unreported for 3600 (2.7%) cases. Despite increases in the proportion of cases with on-time DST results, delayed and unreported results persisted. Carefully assessing causes for delayed and unreported DST results should lead to more timely reporting of drug-resistant tuberculosis.
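
    The reporting-timeliness definitions in this abstract can be encoded directly; the following sketch is only an illustration of those definitions, with hypothetical field names, and is not the surveillance system's logic.

```python
from datetime import date
from typing import Optional

def classify_dst_report(case_year: int, dst_received: Optional[date]) -> str:
    """Classify drug-susceptibility testing (DST) reporting timeliness using
    the definitions in the abstract: on-time results arrive in the case's
    calendar year or within 4 months after it ends (i.e., by April 30 of the
    following year); anything later is delayed; missing results are unreported."""
    if dst_received is None:
        return "unreported"
    cutoff = date(case_year + 1, 4, 30)
    return "on-time" if dst_received <= cutoff else "delayed"

print(classify_dst_report(2010, date(2010, 11, 3)))   # on-time
print(classify_dst_report(2010, date(2011, 3, 15)))   # on-time
print(classify_dst_report(2010, date(2011, 9, 1)))    # delayed
print(classify_dst_report(2010, None))                # unreported
```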

  18. 40 CFR 60.2943 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... averages into the appropriate averaging times and units? 60.2943 Section 60.2943 Protection of Environment... SOURCES Operator Training and Qualification Monitoring § 60.2943 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.2975 to calculate...

  19. 40 CFR 60.2943 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... averages into the appropriate averaging times and units? 60.2943 Section 60.2943 Protection of Environment... SOURCES Operator Training and Qualification Monitoring § 60.2943 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.2975 to calculate...

  20. 40 CFR 60.2943 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... averages into the appropriate averaging times and units? 60.2943 Section 60.2943 Protection of Environment... SOURCES Operator Training and Qualification Monitoring § 60.2943 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.2975 to calculate...

  1. Making the Transition: Interim Results of the National Guard Youth ChalleNGe Evaluation

    ERIC Educational Resources Information Center

    Millenky, Megan; Bloom, Dan; Dillon, Colleen

    2010-01-01

    Young people who drop out of high school face long odds of success in a labor market that increasingly values education and skills. This report presents interim results from a rigorous, ongoing evaluation of the National Guard Youth ChalleNGe Program, which aims to "reclaim the lives of at-risk youth" who have dropped out of high school.…

  2. 40 CFR 600.510-12 - Calculation of average fuel economy and average carbon-related exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of average fuel economy and average carbon-related exhaust emissions. 600.510-12 Section 600.510-12 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulation...

  3. Contemporary and long-term erosion in the Kruger National Park, Lowveld Savanna, South Africa. First results.

    NASA Astrophysics Data System (ADS)

    Baade, Jussi; Rheinwarth, Bastian; Glotzbach, Christoph

    2017-04-01

    Human-induced soil erosion as a consequence of the transformation of landscapes to pasture or arable land is a function of natural conditions (relief and soil properties), natural drivers (climate) as well as land use and management. It is common understanding that humans have accelerated erosion of landscapes by modifying land surface characteristics, like vegetation cover and soil properties, among others. But the magnitude of the acceleration is not yet well established. Partly, the uncertainty about the magnitude of the problem is due to the fact that baseline values, i.e., data on rates of natural erosion from uncultivated land under current climate conditions, are difficult to find. Against this background, we conducted an assessment of contemporary and long-term erosion in the Kruger National Park (KNP), South Africa. The KNP has been set aside for the recovery of wildlife in the early 20th century and was spared from agricultural practices even before that. Concerning soil properties and vegetation cover the KNP can thus be considered to represent a rather pristine savanna environment. In order to secure water provision to wildlife a number of reservoirs was established in the 1930s to 1970s with catchment areas entirely within the KNP boundaries. The size of the catchments varies from 4 to 100 km2. Volumetric mapping and dry bulk density measurements of reservoir deposits provided average minimum sediment yield rates for observation periods of 30 to 80 years. Hydrological modelling was used to assess the trap efficiency of the reservoirs and to estimate the most likely sediment yield rates. At the same time this exercise provided evidence for the stochastic nature of runoff and erosion events in this semi-arid environment and the need to evaluate contemporary erosion based on long observation periods. Measuring cosmogenic 10Be in quartz sand samples collected at the inlet of the reservoirs provided the corresponding average long-term erosion rates for

  4. National data centre preparedness exercise 2015 (NPE2015): MY-NDC progress result and experience

    NASA Astrophysics Data System (ADS)

    Rashid, Faisal Izwan Abdul; Zolkaffly, Muhammed Zulfakar

    2017-01-01

    Malaysia established the National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked with managing Comprehensive Nuclear-Test-Ban Treaty (CTBT) data and providing relevant information on Treaty-related events to the Malaysian Nuclear Agency (Nuclear Malaysia) as the CTBT National Authority. In late 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015), which aims to assess the level of readiness at MY-NDC. This paper presents the progress results of NPE 2015 and highlights MY-NDC's experience in NPE 2015 compared with its previous participation in NPE 2013. MY-NDC utilised the resources available to it for NPE 2015 and performed five types of analyses, compared with only two analyses in NPE 2013. Participation in NPE 2015 has enabled MY-NDC to assess its capability and identify areas for improvement.

  5. 40 CFR 60.2943 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Use Equation 2 in § 60.2975 to calculate the 12-hour rolling averages for concentrations of carbon... averages into the appropriate averaging times and units? 60.2943 Section 60.2943 Protection of Environment... 16, 2006 Monitoring § 60.2943 How do I convert my 1-hour arithmetic averages into the appropriate...
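
    The regulatory formula itself is given as Equation 2 in § 60.2975 and is not reproduced here; the sketch below shows only a generic 12-hour rolling average of 1-hour arithmetic averages, with made-up concentration values, as an illustration of the conversion being described.

```python
import numpy as np

def rolling_12h_averages(hourly: np.ndarray) -> np.ndarray:
    """Generic 12-hour rolling averages of 1-hour average concentrations.
    Each output value averages the current hour and the preceding 11 hours."""
    window = 12
    kernel = np.ones(window) / window
    return np.convolve(hourly, kernel, mode="valid")

# Hypothetical 1-hour average CO concentrations (ppm):
hourly_co_ppm = np.array([3.1, 2.9, 3.4, 3.0, 2.8, 3.2, 3.3, 2.7,
                          3.0, 3.1, 2.9, 3.2, 3.5, 3.0])
print(rolling_12h_averages(hourly_co_ppm))
```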

  6. 40 CFR 60.2943 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Use Equation 2 in § 60.2975 to calculate the 12-hour rolling averages for concentrations of carbon... averages into the appropriate averaging times and units? 60.2943 Section 60.2943 Protection of Environment... 16, 2006 Monitoring § 60.2943 How do I convert my 1-hour arithmetic averages into the appropriate...

  7. On the contribution of G20 and G30 in the Time-Averaged Paleomagnetic Field: First results from a new Giant Gaussian Process inverse modeling approach

    NASA Astrophysics Data System (ADS)

    Khokhlov, A.; Hulot, G.; Johnson, C. L.

    2013-12-01

    It is well known that the geometry of the recent time-averaged paleomagnetic field (TAF) is very close to that of a geocentric axial dipole (GAD). However, many TAF models recovered from averaging lava flow paleomagnetic directional data (the most numerous and reliable of all data) suggest that significant additional terms, in particular quadrupolar (G20) and octupolar (G30) zonal terms, likely contribute. The traditional way in which most such TAF models are recovered uses an empirical estimate for paleosecular variation (PSV) that is subject to limitations imposed by the limited age information available for such data. In this presentation, we will report on a new way to recover the TAF, using an inverse modeling approach based on the so-called Giant Gaussian Process (GGP) description of the TAF and PSV, and various statistical tools we recently made available (see Khokhlov and Hulot, Geophysical Journal International, 2013, doi: 10.1093/gji/ggs118). First results based on high quality data published from the Time-Averaged Field Investigations project (see Johnson et al., G-cubed, 2008, doi:10.1029/2007GC001696) clearly show that both the G20 and G30 terms are very well constrained, and that optimum values fully consistent with the data can be found. These promising results lay the groundwork for use of the method with more extensive data sets, to search for possible additional non-zonal departures of the TAF from the GAD.

  8. Development of the Los Alamos continuous high average-power microsecond pulser ion accelerator

    NASA Astrophysics Data System (ADS)

    Bitteker, L. J.; Wood, B. P.; Davis, H. A.; Waganaar, W. J.; Boyd, I. D.; Lovberg, R. H.

    2000-10-01

    The continuous high average-power microsecond pulser (CHAMP) ion accelerator is being constructed at Los Alamos National Laboratory. Progress on the testing of the CHAMP diode is discussed. A direct simulation Monte Carlo computer code is used to investigate the puffed gas fill of the CHAMP anode. High plenum pressures and low plenum volumes are found to be desirable for effective gas puffs. The typical gas fill time is 150-180 μs from initiation of valve operation to end of fill. Results of anode plasma production at three stages of development are discussed. Plasma properties are monitored with electric and magnetic field probes. From these data, the near-coil plasma density under nominal conditions is found to be on the order of 1×10¹⁶ cm⁻³. Large error is associated with this calculation due to inconsistencies between tests and the limitations of the instrumentation used. The diode insulating magnetic field is observed to result in lower density plasma with a more diffuse structure than for the cases when the insulating field is not applied. The importance of these differences in plasma quality for beam production is yet to be determined.

  9. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    NASA Astrophysics Data System (ADS)

    Soltanzadeh, I.; Azadi, M.; Vakili, G. A.

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
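
    In BMA the calibrated forecast is a weighted mixture of kernels centred on bias-corrected ensemble members. The sketch below illustrates that mixture for a hypothetical seven-member ensemble; the weights, biases, and spread would normally be fitted on the training sample (e.g., by EM), which is omitted here.

```python
import numpy as np

def bma_predictive_pdf(x, member_forecasts, weights, biases, sigma):
    """BMA predictive density for 2-m temperature: a weighted mixture of
    normal kernels centered on bias-corrected member forecasts."""
    pdf = 0.0
    for f, w, b in zip(member_forecasts, weights, biases):
        mean = f - b
        pdf += w * np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return pdf

# Seven hypothetical ensemble members (degrees C), with weights and additive
# biases that would normally come from the training period.
members = np.array([12.1, 13.4, 11.8, 12.9, 13.0, 12.5, 14.2])
weights = np.array([0.10, 0.20, 0.05, 0.15, 0.20, 0.10, 0.20])
biases  = np.array([0.3, -0.2, 0.5, 0.0, -0.1, 0.2, -0.4])
sigma = 1.6

grid = np.linspace(8, 18, 5)
print([round(float(bma_predictive_pdf(x, members, weights, biases, sigma)), 4) for x in grid])
# Deterministic-style BMA forecast = weighted mean of bias-corrected members:
print(float(np.sum(weights * (members - biases))))
```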

  10. Averaging and Adding in Children's Worth Judgements

    ERIC Educational Resources Information Center

    Schlottmann, Anne; Harman, Rachel M.; Paine, Julie

    2012-01-01

    Under the normative Expected Value (EV) model, multiple outcomes are additive, but in everyday worth judgement intuitive averaging prevails. Young children also use averaging in EV judgements, leading to a disordinal, crossover violation of utility when children average the part worths of simple gambles involving independent events (Schlottmann,…

  11. Model averaging and muddled multimodel inferences.

    PubMed

    Cade, Brian S

    2015-09-01

    Three flawed practices associated with model averaging coefficients for predictor variables in regression models commonly occur when making multimodel inferences in analyses of ecological data. Model-averaged regression coefficients based on Akaike information criterion (AIC) weights have been recommended for addressing model uncertainty but they are not valid, interpretable estimates of partial effects for individual predictors when there is multicollinearity among the predictor variables. Multicollinearity implies that the scaling of units in the denominators of the regression coefficients may change across models such that neither the parameters nor their estimates have common scales, therefore averaging them makes no sense. The associated sums of AIC model weights recommended to assess relative importance of individual predictors are really a measure of relative importance of models, with little information about contributions by individual predictors compared to other measures of relative importance based on effects size or variance reduction. Sometimes the model-averaged regression coefficients for predictor variables are incorrectly used to make model-averaged predictions of the response variable when the models are not linear in the parameters. I demonstrate the issues with the first two practices using the college grade point average example extensively analyzed by Burnham and Anderson. I show how partial standard deviations of the predictor variables can be used to detect changing scales of their estimates with multicollinearity. Standardizing estimates based on partial standard deviations for their variables can be used to make the scaling of the estimates commensurate across models, a necessary but not sufficient condition for model averaging of the estimates to be sensible. A unimodal distribution of estimates and valid interpretation of individual parameters are additional requisite conditions. The standardized estimates or equivalently the t
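
    For reference, the Akaike weights discussed above are formed from AIC differences as sketched below; this shows only the weight calculation, not the paper's argument about multicollinearity and standardization, and the AIC values are invented.

```python
import numpy as np

def akaike_weights(aic_values):
    """Akaike weights: w_i = exp(-0.5 * delta_i) / sum_j exp(-0.5 * delta_j),
    where delta_i = AIC_i - min(AIC)."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    raw = np.exp(-0.5 * delta)
    return raw / raw.sum()

# Four hypothetical candidate regression models:
aics = [102.3, 103.1, 105.8, 110.4]
print(np.round(akaike_weights(aics), 3))   # approximately [0.54 0.36 0.09 0.01]
```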

  12. Model averaging and muddled multimodel inferences

    USGS Publications Warehouse

    Cade, Brian S.

    2015-01-01

    Three flawed practices associated with model averaging coefficients for predictor variables in regression models commonly occur when making multimodel inferences in analyses of ecological data. Model-averaged regression coefficients based on Akaike information criterion (AIC) weights have been recommended for addressing model uncertainty but they are not valid, interpretable estimates of partial effects for individual predictors when there is multicollinearity among the predictor variables. Multicollinearity implies that the scaling of units in the denominators of the regression coefficients may change across models such that neither the parameters nor their estimates have common scales, therefore averaging them makes no sense. The associated sums of AIC model weights recommended to assess relative importance of individual predictors are really a measure of relative importance of models, with little information about contributions by individual predictors compared to other measures of relative importance based on effects size or variance reduction. Sometimes the model-averaged regression coefficients for predictor variables are incorrectly used to make model-averaged predictions of the response variable when the models are not linear in the parameters. I demonstrate the issues with the first two practices using the college grade point average example extensively analyzed by Burnham and Anderson. I show how partial standard deviations of the predictor variables can be used to detect changing scales of their estimates with multicollinearity. Standardizing estimates based on partial standard deviations for their variables can be used to make the scaling of the estimates commensurate across models, a necessary but not sufficient condition for model averaging of the estimates to be sensible. A unimodal distribution of estimates and valid interpretation of individual parameters are additional requisite conditions. The standardized estimates or equivalently the

  13. Impact of Bias-Correction Type and Conditional Training on Bayesian Model Averaging over the Northeast United States

    Treesearch

    Michael J. Erickson; Brian A. Colle; Joseph J. Charney

    2012-01-01

    The performance of a multimodel ensemble over the northeast United States is evaluated before and after applying bias correction and Bayesian model averaging (BMA). The 13-member Stony Brook University (SBU) ensemble at 0000 UTC is combined with the 21-member National Centers for Environmental Prediction (NCEP) Short-Range Ensemble Forecast (SREF) system at 2100 UTC....

  14. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  15. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  16. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  17. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  18. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  19. Status Variations in Alcohol Use among Young Adults: Results from the 1984 National Longitudinal Surveys of Youth.

    ERIC Educational Resources Information Center

    Crowley, Joan E.

    This document gives descriptive results on alcohol use patterns among young adults from the 1984 National Longitudinal Survey of Labor Market of Youth, a survey of a large, nationally representative sample supplemented by samples of blacks, Hispanics, and economically disadvantaged non-black, non-Hispanic youth and covering the entire range of…

  20. Hybrid Reynolds-Averaged/Large-Eddy Simulations of a Co-Axial Supersonic Free-Jet Experiment

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Edwards, J. R.

    2009-01-01

    Reynolds-averaged and hybrid Reynolds-averaged/large-eddy simulations have been applied to a supersonic coaxial jet flow experiment. The experiment utilized either helium or argon as the inner jet nozzle fluid, and the outer jet nozzle fluid consisted of laboratory air. The inner and outer nozzles were designed and operated to produce nearly pressure-matched Mach 1.8 flow conditions at the jet exit. The purpose of the computational effort was to assess the state-of-the-art for each modeling approach, and to use the hybrid Reynolds-averaged/large-eddy simulations to gather insight into the deficiencies of the Reynolds-averaged closure models. The Reynolds-averaged simulations displayed a strong sensitivity to choice of turbulent Schmidt number. The baseline value chosen for this parameter resulted in an over-prediction of the mixing layer spreading rate for the helium case, but the opposite trend was noted when argon was used as the injectant. A larger turbulent Schmidt number greatly improved the comparison of the results with measurements for the helium simulations, but variations in the Schmidt number did not improve the argon comparisons. The hybrid simulation results showed the same trends as the baseline Reynolds-averaged predictions. The primary reason conjectured for the discrepancy between the hybrid simulation results and the measurements centered around issues related to the transition from a Reynolds-averaged state to one with resolved turbulent content. Improvements to the inflow conditions are suggested as a remedy to this dilemma. Comparisons between resolved second-order turbulence statistics and their modeled Reynolds-averaged counterparts were also performed.

  1. 7 CFR 51.2561 - Average moisture content.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Average moisture content. 51.2561 Section 51.2561... STANDARDS) United States Standards for Grades of Shelled Pistachio Nuts § 51.2561 Average moisture content. (a) Determining average moisture content of the lot is not a requirement of the grades, except when...

  2. Retrieval of Areal-averaged Spectral Surface Albedo from Transmission Data Alone: Computationally Simple and Fast Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kassianov, Evgueni I.; Barnard, James C.; Flynn, Connor J.

    We introduce and evaluate a simple retrieval of areal-averaged surface albedo using ground-based measurements of atmospheric transmission alone at five wavelengths (415, 500, 615, 673 and 870 nm), under fully overcast conditions. Our retrieval is based on a one-line semi-analytical equation and widely accepted assumptions regarding the weak spectral dependence of cloud optical properties, such as cloud optical depth and asymmetry parameter, in the visible and near-infrared spectral range. To illustrate the performance of our retrieval, we use as input measurements of spectral atmospheric transmission from Multi-Filter Rotating Shadowband Radiometer (MFRSR). These MFRSR data are collected at two well-established continental sites in the United States supported by the U.S. Department of Energy’s (DOE’s) Atmospheric Radiation Measurement (ARM) Program and National Oceanic and Atmospheric Administration (NOAA). The areal-averaged albedos obtained from the MFRSR are compared with collocated and coincident Moderate Resolution Imaging Spectroradiometer (MODIS) white-sky albedo. In particular, these comparisons are made at four MFRSR wavelengths (500, 615, 673 and 870 nm) and for four seasons (winter, spring, summer and fall) at the ARM site using multi-year (2008-2013) MFRSR and MODIS data. Good agreement, on average, for these wavelengths results in small values (≤0.01) of the corresponding root mean square errors (RMSEs) for these two sites. The obtained RMSEs are comparable with those obtained previously for the shortwave albedos (MODIS-derived versus tower-measured) for these sites during growing seasons. We also demonstrate good agreement between tower-based daily-averaged surface albedos measured for "nearby" overcast and non-overcast days. Thus, our retrieval originally developed for overcast conditions likely can be extended for non-overcast days by interpolating between overcast retrievals.

  3. Estimating Watershed-Averaged Precipitation and Evapotranspiration Fluxes using Streamflow Measurements in a Semi-Arid, High Altitude Montane Catchment

    NASA Astrophysics Data System (ADS)

    Herrington, C.; Gonzalez-Pinzon, R.

    2014-12-01

    Streamflow through the Middle Rio Grande Valley is largely driven by snowmelt pulses and monsoonal precipitation events originating in the mountain highlands of New Mexico (NM) and Colorado. Water managers rely on results from storage/runoff models to distribute this resource statewide and to allocate compact deliveries to Texas under the Rio Grande Compact agreement. Prevalent drought conditions and the added uncertainty of climate change effects in the American southwest have led to a greater call for accuracy in storage model parameter inputs. While precipitation and evapotranspiration measurements are subject to scaling and representativeness errors, streamflow readings remain relatively dependable and allow watershed-average water budget estimates. Our study seeks to show that by "Doing Hydrology Backwards" we can effectively estimate watershed-average precipitation and evapotranspiration fluxes in semi-arid landscapes of NM using fluctuations in streamflow data alone. We tested this method in the Valles Caldera National Preserve (VCNP) in the Jemez Mountains of central NM. This method will be further verified by using existing weather stations and eddy-covariance towers within the VCNP to obtain measured values to compare against our model results. This study contributes to further validate this technique as being successful in humid and semi-arid catchments as the method has already been verified as effective in the former setting.

  4. Average variograms to guide soil sampling

    NASA Astrophysics Data System (ADS)

    Kerry, R.; Oliver, M. A.

    2004-10-01

    Managing land in a site-specific way for agriculture requires detailed maps of the variation in the soil properties of interest. To predict accurately for mapping, the interval at which the soil is sampled should relate to the scale of spatial variation. A variogram can be used to guide sampling in two ways. A sampling interval of less than half the range of spatial dependence can be used, or the variogram can be used with the kriging equations to determine an optimal sampling interval to achieve a given tolerable error. A variogram might not be available for the site, but if the variograms of several soil properties were available on a similar parent material and/or in particular topographic positions, an average variogram could be calculated from these. Averages of the variogram ranges and standardized average variograms from four different parent materials in southern England were used to suggest suitable sampling intervals for future surveys in similar pedological settings based on half the variogram range. The standardized average variograms were also used to determine optimal sampling intervals using the kriging equations. Similar sampling intervals were suggested by each method, and the maps of predictions based on data at different grid spacings were evaluated for the different parent materials. Variograms of loss on ignition (LOI) taken from the literature for other sites in southern England with similar parent materials had ranges close to the average for a given parent material, showing the possible wider application of such averages to guide sampling.
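
    The "half the range" rule of thumb can be applied to an average of variogram ranges as in the following sketch; the range values are invented for illustration.

```python
import statistics

# Hypothetical variogram ranges (in metres) of several soil properties
# measured on the same parent material.
ranges_m = [180.0, 220.0, 150.0, 260.0, 200.0]

average_range = statistics.mean(ranges_m)
suggested_interval = average_range / 2.0   # sample at less than half the range

print(f"average variogram range: {average_range:.0f} m")
print(f"suggested sampling interval: <= {suggested_interval:.0f} m")
```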

  5. The origin of consistent protein structure refinement from structural averaging.

    PubMed

    Park, Hahnbeom; DiMaio, Frank; Baker, David

    2015-06-02

    Recent studies have shown that explicit solvent molecular dynamics (MD) simulation followed by structural averaging can consistently improve protein structure models. We find that improvement upon averaging is not limited to explicit water MD simulation, as consistent improvements are also observed for more efficient implicit solvent MD or Monte Carlo minimization simulations. To determine the origin of these improvements, we examine the changes in model accuracy brought about by averaging at the individual residue level. We find that the improvement in model quality from averaging results from the superposition of two effects: a dampening of deviations from the correct structure in the least well modeled regions, and a reinforcement of consistent movements towards the correct structure in better modeled regions. These observations are consistent with an energy landscape model in which the magnitude of the energy gradient toward the native structure decreases with increasing distance from the native state. Copyright © 2015 Elsevier Ltd. All rights reserved.
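
    The structural-averaging step examined here amounts to a coordinate-wise mean over an ensemble of models that have already been superposed on a common frame. The sketch below shows that step on a tiny synthetic ensemble; it is not the authors' refinement protocol.

```python
import numpy as np

def average_structure(models: np.ndarray) -> np.ndarray:
    """Average an ensemble of already-superposed structure models.
    `models` has shape (n_models, n_atoms, 3); the result is the
    coordinate-wise mean with shape (n_atoms, 3)."""
    return models.mean(axis=0)

# Tiny synthetic ensemble: 5 snapshots of a 3-atom fragment jittered
# around a reference geometry (coordinates in angstroms, made up).
rng = np.random.default_rng(1)
reference = np.array([[0.0, 0.0, 0.0],
                      [1.5, 0.0, 0.0],
                      [1.5, 1.5, 0.0]])
ensemble = reference + 0.2 * rng.standard_normal((5, 3, 3))

print(average_structure(ensemble))
```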

  6. Human perceptions of colour rendition vary with average fidelity, average gamut, and gamut shape

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royer, MP; Wilkerson, A.; Wei, M.

    An experiment was conducted to evaluate how subjective impressions of color quality vary with changes in average fidelity, average gamut, and gamut shape (which considers the specific hues that are saturated or desaturated). Twenty-eight participants each evaluated 26 lighting conditions—created using four, seven-channel, tunable LED luminaires—in a 3.1 m by 3.7 m room filled with objects selected to cover a range of hue, saturation, and lightness. IES TM-30 fidelity index (Rf) values ranged from 64 to 93, IES TM-30 gamut index (Rg) values from 79 to 117, and IES TM-30 Rcs,h1 values (a proxy for gamut shape) from -19% to 26%. All lighting conditions delivered the same nominal illuminance and chromaticity. Participants were asked to rate each condition on eight point semantic differential scales for saturated-dull, normal-shifted, and like-dislike. They were also asked one multiple choice question, classifying the condition as saturated, dull, normal, or shifted. The findings suggest that gamut shape is more important than average gamut for human preference, where reds play a more important role than other hues. Additionally, average fidelity alone is a poor predictor of human perceptions, although Rf was somewhat better than CIE Ra. The most preferred source had a CIE Ra value of 68, and 9 of the top 12 rated products had a CIE Ra value of 73 or less, which indicates that the commonly used criteria of CIE Ra ≥ 80 may be excluding a majority of preferred light sources.

  7. Determining average path length and average trapping time on generalized dual dendrimer

    NASA Astrophysics Data System (ADS)

    Li, Ling; Guan, Jihong

    2015-03-01

    Dendrimers have a wide range of important applications in various fields. In some cases, during a transport or diffusion process, a dendrimer transforms into its dual structure, named the Husimi cactus. In this paper, we study the structural properties and the trapping problem on a family of generalized dual dendrimers with arbitrary coordination numbers. We first calculate exactly the average path length (APL) of the networks. The APL increases logarithmically with the network size, indicating that the networks exhibit a small-world effect. Then we determine the average trapping time (ATT) of the trapping process in two cases, i.e., with the trap placed on a central node and with the trap uniformly distributed over all the nodes of the network. In both cases, we obtain explicit solutions for the ATT and show how they vary with the network size. In addition, we discuss the influence of the coordination number on the trapping efficiency.
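
    The average path length referred to here is the mean shortest-path distance over all node pairs. The sketch below computes it by breadth-first search for a small toy graph standing in for the networks studied in the paper.

```python
from collections import deque

def average_path_length(adj: dict) -> float:
    """Average shortest-path length over all unordered node pairs of a
    connected, unweighted graph given as an adjacency dict."""
    nodes = list(adj)
    total, pairs = 0, 0
    for src in nodes:
        # Breadth-first search from src gives shortest distances to all nodes.
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for dst in nodes:
            if dst > src:               # count each pair once
                total += dist[dst]
                pairs += 1
    return total / pairs

# Toy graph: two triangles sharing node 2 (a small cactus-like cluster).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3, 4], 3: [2, 4], 4: [2, 3]}
print(average_path_length(graph))   # 1.4
```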

  8. Averaging, passage through resonances, and capture into resonance in two-frequency systems

    NASA Astrophysics Data System (ADS)

    Neishtadt, A. I.

    2014-10-01

    Applying small perturbations to an integrable system leads to its slow evolution. For an approximate description of this evolution the classical averaging method prescribes averaging the rate of evolution over all the phases of the unperturbed motion. This simple recipe does not always produce correct results, because of resonances arising in the process of evolution. The phenomenon of capture into resonance consists in the system starting to evolve in such a way as to preserve the resonance property once it has arisen. This paper is concerned with application of the averaging method to a description of evolution in two-frequency systems. It is assumed that the trajectories of the averaged system intersect transversally the level surfaces of the frequency ratio and that certain other conditions of general position are satisfied. The rate of evolution is characterized by a small parameter ε. The main content of the paper is a proof of the following result: outside a set of initial data with measure of order √ε the averaging method describes the evolution to within O(√ε |ln ε|) for periods of time of order 1/ε. This estimate is sharp. The exceptional set of measure √ε contains the initial data for phase points captured into resonance. A description of the motion of such phase points is given, along with a survey of related results on averaging. Examples of capture into resonance are presented for some problems in the dynamics of charged particles. Several open problems are stated. Bibliography: 65 titles.

  9. Propulsion and Energetics Panel Working Group 14 on Suitable Averaging Techniques in Non-Uniform Internal Flows

    DTIC Science & Technology

    1983-06-01

    PANEL WORKING GROUP 14 on SUITABLE AVERAGING TECHNIQUES IN NON-UNIFORM INTERNAL FLOWS Edited by M. Pianko Office National d’Etudes et de...d’Etudes et de Recherches Aerospatiales Pratt and Whitney Government Products Division Rocketdyne Division of Rockwell International, Inc. Teledyne CAE...actions exerted by individual components on the gas flow must be known. These specific component effects are distributed internally within the

  10. Average is Over

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2018-02-01

    The popular perception of statistical distributions is depicted by the iconic bell curve which comprises a massive bulk of 'middle-class' values, and two thin tails - one of small left-wing values, and one of large right-wing values. The shape of the bell curve is unimodal, and its peak represents both the mode and the mean. Thomas Friedman, the famous New York Times columnist, recently asserted that we have entered a human era in which "Average is Over". In this paper we present mathematical models for the phenomenon that Friedman highlighted. While the models are derived via different modeling approaches, they share a common foundation. Inherent tipping points cause the models to phase-shift from a 'normal' bell-shape statistical behavior to an 'anomalous' statistical behavior: the unimodal shape changes to an unbounded monotone shape, the mode vanishes, and the mean diverges. Hence: (i) there is an explosion of small values; (ii) large values become super-large; (iii) 'middle-class' values are wiped out, leaving an infinite rift between the small and the super-large values; and (iv) "Average is Over" indeed.

  11. Averaging Theory for Description of Environmental Problems: What Have We Learned?

    PubMed Central

    Miller, Cass T.; Schrefler, Bernhard A.

    2012-01-01

    Advances in Water Resources has been a prime archival source for implementation of averaging theories in changing the scale at which processes of importance in environmental modeling are described. Thus in celebration of the 35th year of this journal, it seems appropriate to assess what has been learned about these theories and about their utility in describing systems of interest. We review advances in understanding and use of averaging theories to describe porous medium flow and transport at the macroscale, an averaged scale that models spatial variability, and at the megascale, an integral scale that only considers time variation of system properties. We detail physical insights gained from the development and application of averaging theory for flow through porous medium systems and for the behavior of solids at the macroscale. We show the relationship between standard models that are typically applied and more rigorous models that are derived using modern averaging theory. We discuss how the results derived from averaging theory that are available can be built upon and applied broadly within the community. We highlight opportunities and needs that exist for collaborations among theorists, numerical analysts, and experimentalists to advance the new classes of models that have been derived. Lastly, we comment on averaging developments for rivers, estuaries, and watersheds. PMID:23393409

  12. 20 CFR 226.62 - Computing average monthly compensation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Computing average monthly compensation. 226... RETIREMENT ACT COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Years of Service and Average Monthly Compensation § 226.62 Computing average monthly compensation. The employee's average monthly compensation is...

  13. Average M shell fluorescence yields for elements with 70≤Z≤92

    NASA Astrophysics Data System (ADS)

    Kahoul, A.; Deghfel, B.; Aylikci, V.; Aylikci, N. K.; Nekkab, M.

    2015-03-01

    The theoretical, experimental and analytical methods for the calculation of the average M-shell fluorescence yield ($\bar{\omega}_M$) of different elements are very important because of the large number of their applications in various areas of physical chemistry and medical research. In this paper, the bulk of the average M-shell fluorescence yield measurements reported in the literature, covering the period 1955 to 2005, are interpolated by using an analytical function to deduce the empirical average M-shell fluorescence yield in the atomic number range 70≤Z≤92. The results were compared with the theoretical and fitted values reported by other authors. Reasonable agreement was typically obtained between our results and other works.

  14. Self-averaging in complex brain neuron signals

    NASA Astrophysics Data System (ADS)

    Bershadskii, A.; Dremencov, E.; Fukayama, D.; Yadid, G.

    2002-12-01

    Nonlinear statistical properties of the Ventral Tegmental Area (VTA) of the limbic brain are studied in vivo. The VTA plays a key role in the generation of pleasure and in the development of psychological drug addiction. It is shown that spiking time-series of the VTA dopaminergic neurons exhibit long-range correlations with self-averaging behavior. This specific VTA phenomenon has no relation to the VTA rewarding function. This last result reveals the complex role of the VTA in the limbic brain.

  15. National assessment of geologic carbon dioxide storage resources: results

    USGS Publications Warehouse

    ,

    2013-01-01

    In 2012, the U.S. Geological Survey (USGS) completed an assessment of the technically accessible storage resources (TASR) for carbon dioxide (CO2) in geologic formations underlying the onshore and State waters area of the United States. The formations assessed are at least 3,000 feet (914 meters) below the ground surface. The TASR is an estimate of the CO2 storage resource that may be available for CO2 injection and storage that is based on present-day geologic and hydrologic knowledge of the subsurface and current engineering practices. Individual storage assessment units (SAUs) for 36 basins were defined on the basis of geologic and hydrologic characteristics outlined in the assessment methodology of Brennan and others (2010, USGS Open-File Report 2010–1127) and the subsequent methodology modification and implementation documentation of Blondes, Brennan, and others (2013, USGS Open-File Report 2013–1055). The mean national TASR is approximately 3,000 metric gigatons (Gt). The estimate of the TASR includes buoyant trapping storage resources (BSR), where CO2 can be trapped in structural or stratigraphic closures, and residual trapping storage resources, where CO2 can be held in place by capillary pore pressures in areas outside of buoyant traps. The mean total national BSR is 44 Gt. The residual storage resource consists of three injectivity classes based on reservoir permeability: residual trapping class 1 storage resource (R1SR) represents storage in rocks with permeability greater than 1 darcy (D); residual trapping class 2 storage resource (R2SR) represents storage in rocks with moderate permeability, defined as permeability between 1 millidarcy (mD) and 1 D; and residual trapping class 3 storage resource (R3SR) represents storage in rocks with low permeability, defined as permeability less than 1 mD. The mean national storage resources for rocks in residual trapping classes 1, 2, and 3 are 140 Gt, 2,700 Gt, and 130 Gt, respectively. The known recovery
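
    As a quick arithmetic check using only the mean values quoted in this abstract, the buoyant and residual trapping classes sum to approximately the stated national total: 44 + 140 + 2,700 + 130 ≈ 3,014 Gt, consistent with the ~3,000 Gt mean TASR after rounding. The sketch below simply restates that sum.

```python
# Consistency check using only the mean values quoted in the abstract.
components_gt = {
    "buoyant (BSR)": 44,
    "residual class 1 (R1SR)": 140,
    "residual class 2 (R2SR)": 2700,
    "residual class 3 (R3SR)": 130,
}
total = sum(components_gt.values())
print(f"sum of mean components: {total} Gt (abstract quotes a ~3,000 Gt mean TASR)")
```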

  16. A cost-benefit analysis of The National Map

    USGS Publications Warehouse

    Halsing, David L.; Theissen, Kevin; Bernknopf, Richard

    2003-01-01

    The Geography Discipline of the U.S. Geological Survey (USGS) has conducted this cost-benefit analysis (CBA) of The National Map. This analysis is an evaluation of the proposed Geography Discipline initiative to provide the Nation with a mechanism to access current and consistent digital geospatial data. This CBA is a supporting document to accompany the Exhibit 300 Capital Asset Plan and Business Case of The National Map Reengineering Program. The framework for estimating the benefits is based on expected improvements in processing information to perform any of the possible applications of spatial data. This analysis does not attempt to determine the benefits and costs of performing geospatial-data applications. Rather, it estimates the change in the differences between those benefits and costs with The National Map and the current situation without it. The estimates of total costs and benefits of The National Map were based on the projected implementation time, development and maintenance costs, rates of data inclusion and integration, expected usage levels over time, and a benefits estimation model. The National Map provides data that are current, integrated, consistent, complete, and more accessible in order to decrease the cost of implementing spatial-data applications and (or) improve the outcome of those applications. The efficiency gains in per-application improvements are greater than the cost to develop and maintain The National Map, meaning that the program would bring a positive net benefit to the Nation. The average improvement in the net benefit of performing a spatial data application was multiplied by a simulated number of application implementations across the country. The numbers of users, existing applications, and rates of application implementation increase over time as The National Map is developed and accessed by spatial data users around the country. Results from the 'most likely' estimates of model parameters and data inputs indicate that

  17. Artificial Intelligence Can Predict Daily Trauma Volume and Average Acuity.

    PubMed

    Stonko, David P; Dennis, Bradley M; Betzold, Richard D; Peetz, Allan B; Gunter, Oliver L; Guillamondegui, Oscar D

    2018-04-19

    The goal of this study was to integrate temporal and weather data in order to create an artificial neural network (ANN) to predict trauma volume, the number of emergent operative cases, and average daily acuity at a level 1 trauma center. Trauma admission data from TRACS and weather data from the National Oceanic and Atmospheric Administration (NOAA) were collected for all adult trauma patients from July 2013-June 2016. The ANN was constructed using temporal factors (time, day of week) and weather factors (daily high, active precipitation) to predict four points of daily trauma activity: number of traumas, number of penetrating traumas, average ISS, and number of immediate OR cases per day. We trained a two-layer feed-forward network with 10 sigmoid hidden neurons via the Levenberg-Marquardt backpropagation algorithm, and performed k-fold cross validation and accuracy calculations on 100 randomly generated partitions. 10,612 patients over 1,096 days were identified. The ANN accurately predicted the daily trauma distribution in terms of number of traumas, number of penetrating traumas, number of OR cases, and average daily ISS (combined training correlation coefficient r = 0.9018+/-0.002; validation r = 0.8899+/-0.005; testing r = 0.8940+/-0.006). We were able to successfully predict trauma and emergent operative volume, and acuity using an ANN by integrating local weather and trauma admission data from a level 1 center. As an example, for June 30, 2016, it predicted 9.93 traumas (actual: 10), and a mean ISS score of 15.99 (actual: 13.12); see figure 3. This may prove useful for predicting trauma needs across the system and hospital administration when allocating limited resources. Level III STUDY TYPE: Prognostic/Epidemiological.
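
    A rough, self-contained analogue of the described model is sketched below. scikit-learn does not offer Levenberg-Marquardt training, so the sketch uses MLPRegressor with a single 10-unit logistic hidden layer and its default solver; the feature names and the randomly generated data are placeholders, not the TRACS/NOAA data used in the study.

```python
# Sketch of a small feed-forward regressor for daily trauma counts from
# temporal + weather features, fitted on placeholder data (assumed feature set).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_days = 1096
# Placeholder features: [hour of peak activity, day of week, daily high (F), precipitation flag]
X = np.column_stack([
    rng.integers(0, 24, n_days),
    rng.integers(0, 7, n_days),
    rng.normal(70, 15, n_days),
    rng.integers(0, 2, n_days),
])
y = rng.poisson(10, n_days).astype(float)   # placeholder daily trauma counts

model = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                     max_iter=2000, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"10-fold CV R^2 on placeholder data: {scores.mean():.3f}")
```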

  18. Public attitudes about underage drinking policies: results from a national survey.

    PubMed

    Richter, Linda; Vaughan, Roger D; Foster, Susan E

    2004-01-01

    We conducted a national telephone survey of 900 adults in the United States to examine the attitudes of the adult public regarding underage drinking and a series of alcohol control policies aimed at reducing it. Three versions of the survey instrument were administered, each to one-third of the sample, with the versions varying in the stipulations of the policy options. Results showed high levels of public support for most of the alcohol control policies, with relatively lower support for those that would result in restrictions on adults' access to alcohol. Respondents' support of the policy options was significantly related to their sociodemographic and attitudinal characteristics, such as sex, age, drinking frequency, and level of concern about underage drinking. The findings provide important guidelines to policymakers interested in garnering support for policies aimed at curtailing underage drinking.

  19. A Microgenetic Analysis of Strategic Variability in Gifted and Average-Ability Children

    ERIC Educational Resources Information Center

    Steiner, Hillary Hettinger

    2006-01-01

    Many researchers have described cognitive differences between gifted and average-performing children. Regarding strategy use, the gifted advantage is often associated with differences such as greater knowledge of strategies, quicker problem solving, and the ability to use strategies more appropriately. The current study used microgenetic methods…

  20. A diameter distribution approach to estimating average stand dominant height in Appalachian hardwoods

    Treesearch

    John R. Brooks

    2007-01-01

    A technique for estimating stand average dominant height based solely on field inventory data is investigated. Using only 45.0919 percent of the largest trees per acre in the diameter distribution resulted in estimates of average dominant height that were within 4.3 feet of the actual value, when averaged over stands of very different structure and history. Cubic foot...

  1. BI-NATIONAL LOWER FOOD WEB ASSESSMENT: 2005 BENTHOS RESULTS

    EPA Science Inventory

    Findings have been generated as part of a bi-national coordinated partnership for lakewide sampling to support needs expressed by the Great Lakes Fisheries Committee, the Lake Superior Technical Committee, and the Lake Superior LaMP.

  2. Conclusions from the Mexican National Nutrition Survey 1999: translating results into nutrition policy.

    PubMed

    Rivera, Juan A; Sepúlveda Amor, Jaime

    2003-01-01

    This article presents an overview of the main results and conclusions from the Mexican National Nutrition Survey 1999 (NNS-1999) and the principal nutrition policy implications of the findings. The NNS-1999 was conducted on a national probabilistic sample of almost 18,000 households, representative at the national and regional levels as well as of urban and rural areas in Mexico. Subjects included were children < 12 years and women 12-49 years. Anthropometry, blood specimens, diet, and socioeconomic information on the family were collected. The principal public nutrition problems are stunting in children < 5 years of age; anemia, iron and zinc deficiency, and low serum vitamin C concentrations at all ages; and vitamin A deficiency in children. Undernutrition (stunting and micronutrient deficiencies) was generally more prevalent in the lower socioeconomic groups, in rural areas, in the south, and in the Indigenous population. Overweight and obesity are serious public health problems in women and are already a concern in school-age children. A number of programs aimed at preventing undernutrition are currently in progress; several of them were designed or modified as a result of the NNS-1999 findings. Most of them have an evaluation component that will inform adjustments or modifications of their design and implementation. However, little is being done for the prevention and control of overweight and obesity, and there is limited experience with effective interventions. The design and evaluation of prevention strategies for controlling obesity in the population, based on existing evidence, is urgently needed, and success stories should be brought to scale quickly to maximize impact. The English version of this paper is also available at: http://www.insp.mx/salud/index.html.

  3. A Martian PFS average spectrum: Comparison with ISO SWS

    NASA Astrophysics Data System (ADS)

    Formisano, V.; Encrenaz, T.; Fonti, S.; Giuranna, M.; Grassi, D.; Hirsh, H.; Khatuntsev, I.; Ignatiev, N.; Lellouch, E.; Maturilli, A.; Moroz, V.; Orleanski, P.; Piccioni, G.; Rataj, M.; Saggin, B.; Zasova, L.

    2005-08-01

    The evaluation of the Planetary Fourier Spectrometer (PFS) performance at Mars is presented by comparing an average spectrum with the ISO spectrum published by Lellouch et al. [2000. Planet. Space Sci. 48, 1393.]. First, the average conditions of the Mars atmosphere are compared, then the mixing ratios of the major gases are evaluated. Major and minor bands of CO2 are compared from the point of view of feature characteristics and band depths. The spectral resolution is also compared using several solar lines. The result indicates that the PFS radiance is valid to better than 1% in the wavenumber range 1800-4200 cm⁻¹ for the average spectrum considered (1680 measurements). The PFS monochromatic transfer function generates an overshooting on the left-hand side of strong narrow lines (solar or atmospheric). The spectral resolution of PFS is of the order of 1.3 cm⁻¹ or better. A large number of narrow features, yet to be identified, were discovered.

  4. Medical Pluralism among American Women: Results of a National Survey

    PubMed Central

    Chao, Maria; Kronenberg, Fredi; Cushman, Linda; Kalmuss, Debra

    2008-01-01

    Background: Medical pluralism can be defined as the employment of more than one medical system or the use of both conventional and complementary and alternative medicine (CAM) for health and illness. American women use a variety of health services and practices for women's health conditions, yet no national study has specifically characterized women's medical pluralism. Our objective was to describe medical pluralism among American women. Methods: A nationally representative telephone survey of 808 women ≥18 years of age was conducted in 2001. Cross-sectional observations of the use of 11 CAM domains and the use of an additional domain—spirituality, religion, or prayer for health—during the past year are reported. Women's health conditions, treatments used, reasons for use, and disclosure to conventional physicians are described, along with predictors of CAM use. Results: Over half (53%) of respondents used CAM for health conditions, especially for those involving chronic pain. The majority of women disclosed such practices at clinical encounters with conventional providers. Biologically based CAM therapies, such as nutritional supplements and herbs, were commonly used with prescription and over-the-counter (OTC) pharmaceuticals for health conditions. Conclusions: Medical pluralism is common among women and should be accepted as a cultural norm. Although disclosure rates of CAM use to conventional providers were higher than in previous population-based studies, disclosure should be increased, especially for women who are pregnant and those with heart disease and cancer. The health risks and benefits of polypharmacy should be addressed at multiple levels of the public health system. PMID:18537484

  5. Hybrid Reynolds-Averaged/Large-Eddy Simulations of a Coaxial Supersonic Free-Jet Experiment

    NASA Technical Reports Server (NTRS)

    Baurle, Robert A.; Edwards, Jack R.

    2010-01-01

    Reynolds-averaged and hybrid Reynolds-averaged/large-eddy simulations have been applied to a supersonic coaxial jet flow experiment. The experiment was designed to study compressible mixing flow phenomenon under conditions that are representative of those encountered in scramjet combustors. The experiment utilized either helium or argon as the inner jet nozzle fluid, and the outer jet nozzle fluid consisted of laboratory air. The inner and outer nozzles were designed and operated to produce nearly pressure-matched Mach 1.8 flow conditions at the jet exit. The purpose of the computational effort was to assess the state-of-the-art for each modeling approach, and to use the hybrid Reynolds-averaged/large-eddy simulations to gather insight into the deficiencies of the Reynolds-averaged closure models. The Reynolds-averaged simulations displayed a strong sensitivity to choice of turbulent Schmidt number. The initial value chosen for this parameter resulted in an over-prediction of the mixing layer spreading rate for the helium case, but the opposite trend was observed when argon was used as the injectant. A larger turbulent Schmidt number greatly improved the comparison of the results with measurements for the helium simulations, but variations in the Schmidt number did not improve the argon comparisons. The hybrid Reynolds-averaged/large-eddy simulations also over-predicted the mixing layer spreading rate for the helium case, while under-predicting the rate of mixing when argon was used as the injectant. The primary reason conjectured for the discrepancy between the hybrid simulation results and the measurements centered around issues related to the transition from a Reynolds-averaged state to one with resolved turbulent content. Improvements to the inflow conditions were suggested as a remedy to this dilemma. Second-order turbulence statistics were also compared to their modeled Reynolds-averaged counterparts to evaluate the effectiveness of common turbulence closure

  6. 40 CFR 86.449 - Averaging provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Later New Motorcycles, General Provisions § 86.449 Averaging provisions. (a) This section describes how... credits may not be banked for use in later model years, except as specified in paragraph (j) of this... average emission levels are at or below the applicable standards in § 86.410-2006. (2) Compliance with the...

  7. 40 CFR 63.846 - Emission averaging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... averaging. (a) General. The owner or operator of an existing potline or anode bake furnace in a State that... by total aluminum production. (c) Anode bake furnaces. The owner or operator may average TF emissions from anode bake furnaces and demonstrate compliance with the limits in Table 3 of this subpart using...

  8. 40 CFR 63.846 - Emission averaging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... averaging. (a) General. The owner or operator of an existing potline or anode bake furnace in a State that... by total aluminum production. (c) Anode bake furnaces. The owner or operator may average TF emissions from anode bake furnaces and demonstrate compliance with the limits in Table 3 of this subpart using...

  9. 40 CFR 63.846 - Emission averaging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... averaging. (a) General. The owner or operator of an existing potline or anode bake furnace in a State that... by total aluminum production. (c) Anode bake furnaces. The owner or operator may average TF emissions from anode bake furnaces and demonstrate compliance with the limits in Table 3 of this subpart using...

  10. 40 CFR 63.846 - Emission averaging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... averaging. (a) General. The owner or operator of an existing potline or anode bake furnace in a State that... by total aluminum production. (c) Anode bake furnaces. The owner or operator may average TF emissions from anode bake furnaces and demonstrate compliance with the limits in Table 3 of this subpart using...

  11. 40 CFR 63.846 - Emission averaging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... averaging. (a) General. The owner or operator of an existing potline or anode bake furnace in a State that... by total aluminum production. (c) Anode bake furnaces. The owner or operator may average TF emissions from anode bake furnaces and demonstrate compliance with the limits in Table 3 of this subpart using...

  12. Average cross-responses in correlated financial markets

    NASA Astrophysics Data System (ADS)

    Wang, Shanshan; Schäfer, Rudi; Guhr, Thomas

    2016-09-01

    There are non-vanishing price responses across different stocks in correlated financial markets, reflecting non-Markovian features. We further study this issue by performing different averages, which identify active and passive cross-responses. The two average cross-responses show different characteristic dependences on the time lag. The passive cross-response exhibits a shorter response period with sizeable volatilities, while the corresponding period for the active cross-response is longer. The average cross-responses for a given stock are evaluated either with respect to the whole market or to different sectors. Using the response strength, the influences of individual stocks are identified and discussed. Moreover, the various cross-responses as well as the average cross-responses are compared with the self-responses. In contrast to the short-memory trade sign cross-correlations for each pair of stocks, the sign cross-correlations averaged over different pairs of stocks show long memory.

  13. Mining the National Career Assessment Examination Result Using Clustering Algorithm

    NASA Astrophysics Data System (ADS)

    Pagudpud, M. V.; Palaoag, T. T.; Padirayon, L. M.

    2018-03-01

    Education is an essential process today which elicits authorities to discover and establish innovative strategies for educational improvement. This study applied data mining using clustering technique for knowledge extraction from the National Career Assessment Examination (NCAE) result in the Division of Quirino. The NCAE is an examination given to all grade 9 students in the Philippines to assess their aptitudes in the different domains. Clustering the students is helpful in identifying students’ learning considerations. With the use of the RapidMiner tool, clustering algorithms such as Density-Based Spatial Clustering of Applications with Noise (DBSCAN), k-means, k-medoid, expectation maximization clustering, and support vector clustering algorithms were analyzed. The silhouette indexes of the said clustering algorithms were compared, and the result showed that the k-means algorithm with k = 3 and silhouette index equal to 0.196 is the most appropriate clustering algorithm to group the students. Three groups were formed having 477 students in the determined group (cluster 0), 310 proficient students (cluster 1) and 396 developing students (cluster 2). The data mining technique used in this study is essential in extracting useful information from the NCAE result to better understand the abilities of students which in turn is a good basis for adopting teaching strategies.
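
    The model-selection step described above (comparing cluster counts by silhouette index and settling on k-means) can be mirrored with scikit-learn as in the sketch below; the synthetic scores are stand-ins, not the NCAE data, and the RapidMiner workflow itself is not reproduced.

```python
# Compare k-means solutions by silhouette index on synthetic stand-in scores.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)
centers = np.array([40.0, 60.0, 80.0])          # hypothetical aptitude-score clusters
scores = np.concatenate([rng.normal(c, 8, 400) for c in centers]).reshape(-1, 1)

for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(scores)
    print(f"k={k}: silhouette = {silhouette_score(scores, labels):.3f}")
```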

  14. IQ and the values of nations.

    PubMed

    Kanazawa, Satoshi

    2009-07-01

    The origin of values and preferences is an unresolved theoretical question in behavioural and social sciences. The Savanna-IQ Interaction Hypothesis, derived from the Savanna Principle and a theory of the evolution of general intelligence, suggests that more intelligent individuals may be more likely to acquire and espouse evolutionarily novel values and preferences (such as liberalism and atheism and, for men, sexual exclusivity) than less intelligent individuals, but that general intelligence may have no effect on the acquisition and espousal of evolutionarily familiar values. Macro-level analyses show that nations with higher average intelligence are more liberal (have a greater highest marginal individual tax rate and, as a result, lower income inequality), less religious (a smaller proportion of the population believes in God or considers themselves religious) and more monogamous. The average intelligence of a population appears to be the strongest predictor of its level of liberalism, atheism and monogamy.

  15. Average M shell fluorescence yields for elements with 70≤Z≤92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahoul, A., E-mail: ka-abdelhalim@yahoo.fr; LPMRN laboratory, Department of Materials Science, Faculty of Sciences and Technology, Mohamed El Bachir El Ibrahimi University, Bordj-Bou-Arreridj 34030; Deghfel, B.

    2015-03-30

    The theoretical, experimental and analytical methods for the calculation of the average M-shell fluorescence yield ($\bar{\omega}_M$) of different elements are very important because of the large number of their applications in various areas of physical chemistry and medical research. In this paper, the bulk of the average M-shell fluorescence yield measurements reported in the literature, covering the period 1955 to 2005, are interpolated by using an analytical function to deduce the empirical average M-shell fluorescence yield in the atomic number range 70≤Z≤92. The results were compared with the theoretical and fitted values reported by other authors. Reasonable agreement was typically obtained between our results and other works.

  16. Dynamic Testing of Gifted and Average-Ability Children's Analogy Problem Solving: Does Executive Functioning Play a Role?

    ERIC Educational Resources Information Center

    Vogelaar, Bart; Bakker, Merel; Hoogeveen, Lianne; Resing, Wilma C. M.

    2017-01-01

    In this study, dynamic testing principles were applied to examine progression of analogy problem solving, the roles that cognitive flexibility and metacognition play in children's progression as well as training benefits, and instructional needs of 7- to 8-year-old gifted and average-ability children. Utilizing a pretest training posttest control…

  17. Structuring Collaboration in Mixed-Ability Groups to Promote Verbal Interaction, Learning, and Motivation of Average-Ability Students

    ERIC Educational Resources Information Center

    Saleh, Mohammad; Lazonder, Ard W.; Jong, Ton de

    2007-01-01

    Average-ability students often do not take full advantage of learning in mixed-ability groups because they hardly engage in the group interaction. This study examined whether structuring collaboration by group roles and ground rules for helping behavior might help overcome this participatory inequality. In a plant biology course, heterogeneously…

  18. Development of a high average current polarized electron source with long cathode operational lifetime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. K. Sinclair; P. A. Adderley; B. M. Dunham

    Substantially more than half of the electromagnetic nuclear physics experiments conducted at the Continuous Electron Beam Accelerator Facility of the Thomas Jefferson National Accelerator Facility (Jefferson Laboratory) require highly polarized electron beams, often at high average current. Spin-polarized electrons are produced by photoemission from various GaAs-based semiconductor photocathodes, using circularly polarized laser light with photon energy slightly larger than the semiconductor band gap. The photocathodes are prepared by activation of the clean semiconductor surface to negative electron affinity using cesium and oxidation. Historically, in many laboratories worldwide, these photocathodes have had short operational lifetimes at high average current, and have often deteriorated fairly quickly in ultrahigh vacuum even without electron beam delivery. At Jefferson Lab, we have developed a polarized electron source in which the photocathodes degrade exceptionally slowly without electron emission, and in which ion back bombardment is the predominant mechanism limiting the operational lifetime of the cathodes during electron emission. We have reproducibly obtained cathode 1/e dark lifetimes over two years, and 1/e charge density and charge lifetimes during electron beam delivery of over 2×10⁵ C/cm² and 200 C, respectively. This source is able to support uninterrupted high average current polarized beam delivery to three experimental halls simultaneously for many months at a time. Many of the techniques we report here are directly applicable to the development of GaAs photoemission electron guns to deliver high average current, high brightness unpolarized beams.

  19. HiLASE: development of fully diode pumped disk lasers with high average power

    NASA Astrophysics Data System (ADS)

    Divoky, M.; Smrz, M.; Chyla, M.; Sikocinski, P.; Severova, P.; Novák, O.; Huynh, J.; Nagisetty, S. S.; Miura, T.; Liberatore, C.; Pilař, J.; Slezak, O.; Sawicka, M.; Jambunathan, V.; Gemini, L.; Vanda, J.; Svabek, R.; Endo, A.; Lucianetti, A.; Rostohar, D.; Mason, P. D.; Phillips, P. J.; Ertel, K.; Banerjee, S.; Hernandez-Gomez, C.; Collier, J. L.; Mocek, T.

    2015-02-01

    An overview of the Czech national R&D project HiLASE (High average power pulsed LASEr) is presented. The HiLASE project aims at the development of pulsed DPSSLs for hi-tech industrial applications. HiLASE will be a user-oriented facility with several laser systems, with output parameters ranging from a few picosecond pulses with energy of 5 mJ to 0.5 J and repetition rates of 1-100 kHz (based on thin-disk technology) to systems with 100 J output energy in nanosecond pulses with a repetition rate of 10 Hz (based on multi-slab technology).

  20. National Assessment of Educational Progress. 1969-1970 Writing: Group Results for Sex, Region, and Size of Community (Preliminary Report).

    ERIC Educational Resources Information Center

    Norris, Eleanor L.; And Others

    National results for writing, one of the subjects assessed by National Assessment during 1969-70, were reported in November 1970. In the present report, results are presented by age (9, 13, 17 and young adults between 26 and 35) and by three classifications: (1) sex, (2) region: southeast, central, west, and (3) size of community: big cities,…

  1. Experimental Investigation of the Differences Between Reynolds-Averaged and Favre-Averaged Velocity in Supersonic Jets

    NASA Technical Reports Server (NTRS)

    Panda, J.; Seasholtz, R. G.

    2005-01-01

    Recent advancement in the molecular Rayleigh scattering based technique allowed for simultaneous measurement of velocity and density fluctuations with high sampling rates. The technique was used to investigate unheated high subsonic and supersonic fully expanded free jets in the Mach number range of 0.8 to 1.8. The difference between the Favre averaged and Reynolds averaged axial velocity and axial component of the turbulent kinetic energy is found to be small. Estimates based on the Morkovin's "Strong Reynolds Analogy" were found to provide lower values of turbulent density fluctuations than the measured data.

  2. Determinants of College Grade Point Averages

    ERIC Educational Resources Information Center

    Bailey, Paul Dean

    2012-01-01

    Chapter 2: The Role of Class Difficulty in College Grade Point Averages. Grade Point Averages (GPAs) are widely used as a measure of college students' ability. Low GPAs can remove a student from eligibility for scholarships, and even from continued enrollment at a university. However, GPAs are determined not only by student ability but also by the…

  3. The national survey of natural radioactivity in concrete produced in Israel.

    PubMed

    Kovler, Konstantin

    2017-03-01

    The main goal of the current survey was to collect the results of the natural radiation tests of concrete produced in the country, to analyze the results statistically, and to make recommendations for further regulation on the national scale. In total, 109 concrete mixes produced commercially during the years 2012-2014 by concrete plants in Israel were analyzed. The average concentrations of NORM did not exceed the values recognized in the EU and were close to the values obtained from Mediterranean countries such as Greece, Spain and Italy. It was also found that although the average value of the radon emanation coefficient of concrete containing coal fly ash (FA) was lower than that of concrete mixes without FA, there was no significant difference between the two sub-populations of concrete mixes (with and without FA) in either the total radiation index (addressing gamma radiation and radon together) or the gamma-radiation-only index. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. The average receiver operating characteristic curve in multireader multicase imaging studies

    PubMed Central

    Samuelson, F W

    2014-01-01

    Objective: In multireader, multicase (MRMC) receiver operating characteristic (ROC) studies for evaluating medical imaging systems, the area under the ROC curve (AUC) is often used as a summary metric. Owing to the limitations of AUC, plotting the average ROC curve to accompany the rigorous statistical inference on AUC is recommended. The objective of this article is to investigate methods for generating the average ROC curve from ROC curves of individual readers. Methods: We present both a non-parametric method and a parametric method for averaging ROC curves that produce a ROC curve, the area under which is equal to the average AUC of individual readers (a property we call area preserving). We use hypothetical examples, simulated data and a real-world imaging data set to illustrate these methods and their properties. Results: We show that our proposed methods are area preserving. We also show that the method of averaging the ROC parameters, either the conventional bi-normal parameters (a, b) or the proper bi-normal parameters (c, da), is generally not area preserving and may produce a ROC curve that is intuitively not an average of multiple curves. Conclusion: Our proposed methods are useful for making plots of average ROC curves in MRMC studies as a companion to the rigorous statistical inference on the AUC end point. The software implementing these methods is freely available from the authors. Advances in knowledge: Methods for generating the average ROC curve in MRMC ROC studies are formally investigated. The area-preserving criterion we defined is useful to evaluate such methods. PMID:24884728
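
    The paper's specific non-parametric and parametric constructions are not reproduced here, but the simplest area-preserving option is easy to illustrate: average the readers' ROC curves vertically (pointwise in false-positive fraction) on a common grid. Because integration is linear, the area under the averaged curve equals the mean of the individual AUCs, up to the small interpolation error of the grid. The sketch below demonstrates this on simulated reader data.

```python
# Vertical (pointwise-in-FPR) averaging of per-reader ROC curves: the AUC of
# the averaged curve matches the mean of the individual AUCs (area preserving),
# apart from fine-grid interpolation error.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
fpr_grid = np.linspace(0.0, 1.0, 201)
tprs, aucs = [], []

for reader in range(5):                      # five simulated readers
    y_true = rng.integers(0, 2, 300)
    y_score = y_true + rng.normal(0, 1.0 + 0.2 * reader, 300)  # reader-specific noise
    fpr, tpr, _ = roc_curve(y_true, y_score)
    tprs.append(np.interp(fpr_grid, fpr, tpr))
    aucs.append(auc(fpr, tpr))

mean_tpr = np.mean(tprs, axis=0)
print(f"mean of individual AUCs: {np.mean(aucs):.4f}")
print(f"AUC of averaged curve  : {auc(fpr_grid, mean_tpr):.4f}")
```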

  5. Systematic review of the relationship of Helicobacter pylori infection with geographical latitude, average annual temperature and average daily sunshine.

    PubMed

    Lu, Chao; Yu, Ye; Li, Lan; Yu, Chaohui; Xu, Ping

    2018-04-17

    Helicobacter pylori (H. pylori) infection is a worldwide threat to human health with high prevalence. In this study, we analyzed the relationship between latitude, average annual temperature, average daily sunshine time and H. pylori infection. The PubMed, ClinicalTrials.gov, EBSCO and Web of Science databases were searched to identify studies reporting H. pylori infection. Latitude 30° was the cut-off level between low- and mid-latitude areas. We obtained information on latitude, average annual temperature, average daily sunshine, and the Human Development Index (HDI) from reports of studies of the relationships with H. pylori infection. Of the 51 studies included, there was a significant difference in H. pylori infection between the low- and mid-latitude areas (P = 0.05). There was no significant difference in the prevalence of H. pylori infection among the 15°-latitude zones analyzed (P = 0.061). Subgroup analysis revealed the highest and lowest H. pylori infection rates in the subgroup of developing regions at >30° latitude and the subgroup of developed regions at <30° latitude, respectively (P < 0.001). Multivariate analysis showed that average annual temperature, average daily sunshine time and HDI were significantly correlated with H. pylori infection (P = 0.009, P < 0.001, P < 0.001), while there was no correlation between H. pylori infection and latitude. Our analysis showed that higher average annual temperature was associated with lower H. pylori infection rates, while average daily sunshine time correlated positively with H. pylori infection. HDI was also found to be a significant factor, with higher HDI associated with lower infection rates. These findings provide evidence that can be used to devise strategies for the prevention and control of H. pylori.

  6. 20 CFR 404.220 - Average-monthly-wage method.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... average-monthly-wage method if it is to your advantage. Being eligible for either the average-monthly-wage method or the modified average-monthly-wage method does not preclude your eligibility under the old-start...

  7. Yoga in Australia: Results of a national survey

    PubMed Central

    Penman, Stephen; Cohen, Marc; Stevens, Philip; Jackson, Sue

    2012-01-01

    Introduction: The therapeutic benefits of yoga and meditation are well documented, yet little is known about the practice of yoga in Australia or elsewhere, whether as a physical activity, a form of therapy, a spiritual path or a lifestyle. Materials and Methods: To investigate the practice of yoga in Australia, a national survey of yoga practitioners was conducted utilizing a comprehensive web-based questionnaire. Respondents were self-selecting to participate. A total of 3,892 respondents completed the survey. Sixty overseas respondents and 1265 yoga teachers (to be reported separately) were excluded, leaving 2,567 yoga practitioner respondents. Results: The typical yoga survey respondent was a 41-year-old, tertiary educated, employed, health-conscious female (85% women). Asana (postures) and vinyasa (sequences of postures) represented 61% of the time spent practicing, with the other 39% devoted to the gentler practices of relaxation, pranayama (breathing techniques), meditation and instruction. Respondents commonly started practicing yoga for health and fitness but often continued practicing for stress management. One in five respondents practiced yoga for a specific health or medical reason which was seen to be improved by yoga practice. Of these, more people used yoga for stress management and anxiety than back, neck or shoulder problems, suggesting that mental health may be the primary health-related motivation for practicing yoga. Healthy lifestyle choices were seen to be more prevalent in respondents with more years of practice. Yoga-related injuries occurring under supervision in the previous 12 months were low at 2.4% of respondents. Conclusions: Yoga practice was seen to assist in the management of specific health issues and medical conditions. Regular yoga practice may also exert a healthy lifestyle effect including vegetarianism, non-smoking, reduced alcohol consumption, increased exercise and reduced stress with resulting cost benefits to the

  8. 2007 national roadside survey of alcohol and drug use by drivers : drug results.

    DOT National Transportation Integrated Search

    2009-12-01

    This report presents the first national prevalence estimates for drug-involved driving derived from the recently : completed 2007 National Roadside Survey (NRS). The NRS is a national field survey of alcohol- and drug-involved : driving conducted pri...

  9. Average Annual Rainfall Over the Globe

    NASA Astrophysics Data System (ADS)

    Agrawal, D. C.

    2013-12-01

    The atmospheric recycling of water is a very important phenomenon on the globe because it not only refreshes the water but it also redistributes it over land and oceans/rivers/lakes throughout the globe. This is made possible by the solar energy intercepted by the Earth. The half of the globe facing the Sun, on the average, intercepts 1.74×10¹⁷ J of solar radiation per second and it is divided over various channels as given in Table 1. It keeps our planet warm and maintains its average temperature of 288 K with the help of the atmosphere in such a way that life can survive. It also recycles the water in the oceans/rivers/lakes by initial evaporation and subsequent precipitation; the average annual rainfall over the globe is around one meter. According to M. King Hubbert the amount of solar power going into the evaporation and precipitation channel is 4.0×10¹⁶ W. Students can verify the value of average annual rainfall over the globe by utilizing this part of solar energy. This activity is described in the next section.
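
    The verification the abstract points to is a short back-of-the-envelope calculation. Under textbook values assumed here (latent heat of vaporization ≈ 2.45×10⁶ J/kg near ambient temperature, mean Earth radius 6.371×10⁶ m, water density 1000 kg/m³), Hubbert's 4.0×10¹⁶ W evaporates roughly 5×10¹⁷ kg of water per year, which, spread over the whole surface of the globe, is about one meter of rainfall:

```python
# Depth of average annual rainfall implied by the evaporation/precipitation
# share of intercepted solar power quoted above (Hubbert's 4.0e16 W).
# Constants below are standard textbook values assumed for this estimate.
import math

P_evap = 4.0e16              # W, solar power driving evaporation/precipitation
L_vap = 2.45e6               # J/kg, latent heat of vaporization near ambient temperature
rho_water = 1000.0           # kg/m^3
R_earth = 6.371e6            # m, mean Earth radius
seconds_per_year = 3.156e7

mass_evaporated = P_evap * seconds_per_year / L_vap    # kg of water per year
volume = mass_evaporated / rho_water                   # m^3 per year
area = 4.0 * math.pi * R_earth**2                      # m^2, whole globe
print(f"implied average annual rainfall: {volume / area:.2f} m")   # ~1 m
```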

  10. Results from Sandia National Laboratories/Lockheed Martin Electromagnetic Missile Launcher (EMML).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockner, Thomas Ramsbeck; Skurdal, Ben; Gaigler, Randy

    2005-05-01

    Sandia National Laboratories (SNL) and Lockheed Martin MS2 are designing an electromagnetic missile launcher (EMML) for naval applications. The EMML uses an induction coilgun topology with the requirement of launching a 3600 lb missile up to a velocity of 40 m/s. To demonstrate the feasibility of the electromagnetic propulsion design, a demonstrator launcher was built that consists of approximately 10% of the propulsion coils needed for a tactical design. The demonstrator verified the design by launching a 1430 lb weighted sled to a height of 24 ft in mid-December 2004 (Figure 1). This paper provides the general launcher design, specific pulsed power system component details, system operation, and demonstration results.

  11. Role of spatial averaging in multicellular gradient sensing.

    PubMed

    Smith, Tyler; Fancher, Sean; Levchenko, Andre; Nemenman, Ilya; Mugler, Andrew

    2016-05-20

    Gradient sensing underlies important biological processes including morphogenesis, polarization, and cell migration. The precision of gradient sensing increases with the length of a detector (a cell or group of cells) in the gradient direction, since a longer detector spans a larger range of concentration values. Intuition from studies of concentration sensing suggests that precision should also increase with detector length in the direction transverse to the gradient, since then spatial averaging should reduce the noise. However, here we show that, unlike for concentration sensing, the precision of gradient sensing decreases with transverse length for the simplest gradient sensing model, local excitation-global inhibition. The reason is that gradient sensing ultimately relies on a subtraction of measured concentration values. While spatial averaging indeed reduces the noise in these measurements, which increases precision, it also reduces the covariance between the measurements, which results in the net decrease in precision. We demonstrate how a recently introduced gradient sensing mechanism, regional excitation-global inhibition (REGI), overcomes this effect and recovers the benefit of transverse averaging. Using a REGI-based model, we compute the optimal two- and three-dimensional detector shapes, and argue that they are consistent with the shapes of naturally occurring gradient-sensing cell populations.
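
    The trade-off described above follows from an elementary identity (a generic statement, not a formula quoted from the paper). Writing c₁ and c₂ for the two locally averaged concentration readouts whose difference estimates the gradient,

```latex
\operatorname{Var}(c_1 - c_2) \;=\; \operatorname{Var}(c_1) + \operatorname{Var}(c_2) - 2\,\operatorname{Cov}(c_1, c_2),
```

    so transverse averaging that lowers the individual variances but lowers their covariance even more can make the subtraction noisier overall, which is the effect the authors report for local excitation-global inhibition and which REGI is described as overcoming.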

  12. Role of spatial averaging in multicellular gradient sensing

    NASA Astrophysics Data System (ADS)

    Smith, Tyler; Fancher, Sean; Levchenko, Andre; Nemenman, Ilya; Mugler, Andrew

    2016-06-01

    Gradient sensing underlies important biological processes including morphogenesis, polarization, and cell migration. The precision of gradient sensing increases with the length of a detector (a cell or group of cells) in the gradient direction, since a longer detector spans a larger range of concentration values. Intuition from studies of concentration sensing suggests that precision should also increase with detector length in the direction transverse to the gradient, since then spatial averaging should reduce the noise. However, here we show that, unlike for concentration sensing, the precision of gradient sensing decreases with transverse length for the simplest gradient sensing model, local excitation-global inhibition. The reason is that gradient sensing ultimately relies on a subtraction of measured concentration values. While spatial averaging indeed reduces the noise in these measurements, which increases precision, it also reduces the covariance between the measurements, which results in the net decrease in precision. We demonstrate how a recently introduced gradient sensing mechanism, regional excitation-global inhibition (REGI), overcomes this effect and recovers the benefit of transverse averaging. Using a REGI-based model, we compute the optimal two- and three-dimensional detector shapes, and argue that they are consistent with the shapes of naturally occurring gradient-sensing cell populations.

  13. Quantized Average Consensus on Gossip Digraphs with Reduced Computation

    NASA Astrophysics Data System (ADS)

    Cai, Kai; Ishii, Hideaki

    The authors have recently proposed a class of randomized gossip algorithms which solve the distributed averaging problem on directed graphs, with the constraint that each node has an integer-valued state. The essence of this algorithm is to maintain local records, called “surplus”, of individual state updates, thereby achieving quantized average consensus even though the state sum of all nodes is not preserved. In this paper we study a modified version of this algorithm, whose feature is primarily in reducing both computation and communication effort. Concretely, each node needs to update fewer local variables, and can transmit surplus by requiring only one bit. Under this modified algorithm we prove that reaching the average is ensured for arbitrary strongly connected graphs. The condition of arbitrary strong connection is less restrictive than those known in the literature for either real-valued or quantized states; in particular, it does not require the special structure on the network called balanced. Finally, we provide numerical examples to illustrate the convergence result, with emphasis on convergence time analysis.
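
    The surplus-based, quantized algorithm on digraphs summarized above is not reproduced here, but the underlying distributed-averaging problem is easy to illustrate with the baseline real-valued pairwise gossip scheme on an undirected graph (which, unlike the surplus algorithm, preserves the state sum at every step):

```python
# Baseline randomized pairwise gossip on an undirected ring: at each step a
# random edge is chosen and its two endpoints replace their states with the
# pairwise mean, so all states converge to the global average.
import random

random.seed(0)
n = 20
state = [float(i) for i in range(n)]           # initial states 0..19, average 9.5
edges = [(i, (i + 1) % n) for i in range(n)]   # ring topology

for _ in range(5000):
    i, j = random.choice(edges)
    state[i] = state[j] = (state[i] + state[j]) / 2.0

target = sum(range(n)) / n
print(f"target average: {target}")
print(f"max deviation after gossip: {max(abs(x - target) for x in state):.4f}")
```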

  14. Calculations of High-Temperature Jet Flow Using Hybrid Reynolds-Average Navier-Stokes Formulations

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Elmiligui, Alaa; Giriamaji, Sharath S.

    2008-01-01

    Two multiscale-type turbulence models are implemented in the PAB3D solver. The models are based on modifying the Reynolds-averaged Navier-Stokes equations. The first scheme is a hybrid Reynolds-averaged Navier-Stokes/large-eddy-simulation model using the two-equation k-ε model with a Reynolds-averaged Navier-Stokes/large-eddy-simulation transition function dependent on grid spacing and the computed turbulence length scale. The second scheme is a modified version of the partially averaged Navier-Stokes model in which the unresolved kinetic energy parameter f_k is allowed to vary as a function of grid spacing and the turbulence length scale. This parameter is estimated based on a novel two-stage procedure to efficiently estimate the level of scale resolution possible for a given flow on a given grid for partially averaged Navier-Stokes. It has been found that the prescribed scale resolution can play a major role in obtaining accurate flow solutions. The parameter f_k varies between zero and one and is equal to one in the viscous sublayer and when the Reynolds-averaged Navier-Stokes turbulent viscosity becomes smaller than the large-eddy-simulation viscosity. The formulation, usage methodology, and validation examples are presented to demonstrate the enhancement of PAB3D's time-accurate turbulence modeling capabilities. The accurate simulations of flow and turbulent quantities will provide a valuable tool for accurate jet noise predictions. Solutions from these models are compared with Reynolds-averaged Navier-Stokes results and experimental data for high-temperature jet flows. The current results show promise for the capability of hybrid Reynolds-averaged Navier-Stokes/large-eddy simulation and partially averaged Navier-Stokes in simulating such flow phenomena.

  15. Ranking and averaging independent component analysis by reproducibility (RAICAR).

    PubMed

    Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping

    2008-06-01

    Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data. Copyright 2007 Wiley-Liss, Inc.
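
    The core idea — repeat ICA with different initializations, match components across realizations by spatial correlation, and average the matched components — can be illustrated compactly as below. This is only a toy version on synthetic signals with a greedy correlation match; it is not the RAICAR algorithm as published and is not applied to fMRI data.

```python
# Toy reproducibility-based alignment of repeated FastICA runs: match each
# realization's components to a reference run by absolute correlation, flip
# signs, average the aligned components, and report per-component
# reproducibility (mean absolute correlation with the reference).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
sources = np.column_stack([np.sin(3 * t),
                           np.sign(np.cos(7 * t)),
                           rng.normal(size=t.size)])
X = sources @ rng.normal(size=(3, 3))            # mixed observations

runs = [FastICA(n_components=3, random_state=seed, max_iter=1000).fit_transform(X)
        for seed in range(5)]                    # five ICA realizations

ref = runs[0]
aligned = [ref]
for S in runs[1:]:
    corr = np.corrcoef(ref.T, S.T)[:3, 3:]       # reference x realization correlations
    order = np.argmax(np.abs(corr), axis=1)      # greedy match to the reference
    signs = np.sign(corr[np.arange(3), order])
    aligned.append(S[:, order] * signs)

avg_components = np.mean(aligned, axis=0)
reproducibility = [np.mean([abs(np.corrcoef(ref[:, k], A[:, k])[0, 1])
                            for A in aligned[1:]]) for k in range(3)]
print("per-component reproducibility:", np.round(reproducibility, 3))
```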

  16. What difference a decade? The costs of psychosis in Australia in 2000 and 2010: comparative results from the first and second Australian national surveys of psychosis.

    PubMed

    Neil, Amanda L; Carr, Vaughan J; Mihalopoulos, Cathrine; Mackinnon, Andrew; Lewin, Terry J; Morgan, Vera A

    2014-03-01

    To assess differences in costs of psychosis between the first and second Australian national surveys of psychosis and examine them in light of policy developments. Cost differences due to changes in resource use and/or real price rises were assessed by minimizing differences in recruitment and costing methodologies between the two surveys. For each survey, average annual societal costs of persons recruited through public specialized mental health services in the census month were assessed through prevalence-based, bottom-up cost-of-illness analyses. The first survey costing methodology was employed as the reference approach. Unit costs were specific to each time period (2000, 2010) and expressed in 2010 Australian dollars. There was minimal change in the average annual costs of psychosis between the surveys, although newly included resources in the second survey's analysis cost AUD$3183 per person. Among resources common to each analysis were significant increases in the average annual cost per person for ambulatory care of AUD$7380, non-government services AUD$2488 and pharmaceuticals AUD$1892, and an upward trend in supported accommodation costs. These increases were offset by a more-than-50% reduction in mental health inpatient costs (AUD$11,790 per person) and an 84.6% (AUD$604) decrease in crisis accommodation costs. Productivity losses, the greatest component cost, changed minimally, reflecting the magnitude and constancy of reduced employment levels of individuals with psychosis across the surveys. Between 2000 and 2010 there was little change in total average annual costs of psychosis for individuals receiving treatment at public specialized mental health services. However, there was a significant redistribution of costs within and away from the health sector in line with government initiatives arising from the Second and Third National Mental Health Plans. Non-health sector costs are now a critical component of cost-of-illness analyses of mental illnesses reflecting

  17. Averages of $b$-hadron, $c$-hadron, and $\tau$-lepton properties as of summer 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amhis, Y.; et al.

    2014-12-23

    This article reports world averages of measurements of $b$-hadron, $c$-hadron, and $\tau$-lepton properties obtained by the Heavy Flavor Averaging Group (HFAG) using results available through summer 2014. For the averaging, common input parameters used in the various analyses are adjusted (rescaled) to common values, and known correlations are taken into account. The averages include branching fractions, lifetimes, neutral meson mixing parameters, $CP$ violation parameters, parameters of semileptonic decays and CKM matrix elements.

  18. African Americans and Mathematics Outcomes on National Assessment of Educational Progress: Parental and Individual Influences

    ERIC Educational Resources Information Center

    Noble, Richard, III; Morton, Crystal Hill

    2013-01-01

    This study investigated within group differences between African American female and male students who participated in the 2009 National Assessment of Educational Progress mathematics assessment. Using results from participating states, we compare average scale scores of African American students based on home regulatory environment and interest…

  19. National Audit of Seizure management in Hospitals (NASH): results of the national audit of adult epilepsy in the UK

    PubMed Central

    Dixon, Peter A; Kirkham, Jamie J; Marson, Anthony G; Pearson, Mike G

    2015-01-01

    Objectives About 100 000 people present to hospitals each year in England with an epileptic seizure. How they are managed is unknown; thus, the National Audit of Seizure management in Hospitals (NASH) set out to assess prior care, management of the acute event and follow-up of these patients. This paper describes the data from the second audit conducted in 2013. Setting 154 emergency departments (EDs) across the UK. Participants Data from 4544 attendances (median age of 45 years, 57% men) showed that 61% had a prior diagnosis of epilepsy, 12% other neurological problems and 22% were first seizure cases. Each ED identified 30 consecutive adult cases presenting due to a seizure. Primary and secondary outcome measures Details of the patient's prior care, management at hospital and onward referral to neurological specialists were recorded in an online database. Descriptive results are reported at national level. Results Of those with epilepsy, 498 (18%) were on no antiepileptic drug therapy and 1330 (48%) were on monotherapy. Assessments were often incomplete: witness histories were sought in only 759 (75%) of first seizure patients, 58% were seen by a senior doctor and 57% were admitted. For first seizure patients, advice on further seizure management was given to 264 (27%) and only 55% were referred to a neurologist or epilepsy specialist. For each variable, there was wide variability among sites that was not explicable. For the sites that took part in both audits, there was a trend towards better care in 2013, but this was small and dwarfed by the intersite variability. Conclusions These results have parallels with the Sentinel Audit of Stroke performed a decade earlier. There is wide intersite variability in care covering the entire care pathway, and a need for better organised and accessible care for these patients. PMID:25829372

  20. Violence Directed against Teachers: Results from a National Survey

    ERIC Educational Resources Information Center

    Mcmahon, Susan D.; Martinez, Andrew; Espelage, Dorothy; Rose, Chad; Reddy, Linda A.; Lane, Kathleen; Anderman, Eric M.; Reynolds, Cecil R.; Jones, Abraham; Brown, Veda

    2014-01-01

    Teachers in U.S. schools report high rates of victimization, yet previous studies focus on select types of victimization and student perpetrators, which may underestimate the extent of the problem. This national study was based on work conducted by the American Psychological Association Classroom Violence Directed Against Teachers Task Force and…

  1. A comparison of average wages with age-specific wages for assessing indirect productivity losses: analytic simplicity versus analytic precision.

    PubMed

    Connolly, Mark P; Tashjian, Cole; Kotsopoulos, Nikolaos; Bhatt, Aomesh; Postma, Maarten J

    2017-07-01

    Numerous approaches are used to estimate indirect productivity losses by applying various wage estimates to poor health in working-age adults. Considering the different wage estimation approaches observed in the published literature, we sought to assess variation in productivity loss estimates when using average wages compared with age-specific wages. Published estimates of average and age-specific wages, combined for males and females, were obtained from the UK Office for National Statistics. A polynomial interpolation was used to convert 5-year age-banded wage data into annual age-specific wage estimates. To compare indirect cost estimates, average wages and age-specific wages were used to project productivity losses at various stages of life based on the human capital approach. Discount rates of 0, 3, and 6 % were applied to projected age-specific and average wage losses. Using average wages was found to overestimate lifetime wages in conditions afflicting those aged 1-27 and 57-67, while underestimating lifetime wages in those aged 27-57. The difference was greatest for children, for whom the average wage overestimated wages by 15 %, and for 40-year-olds, for whom it underestimated wages by 14 %. Large differences in projected productivity losses exist when the average wage is applied over a lifetime. Specifically, use of average wages overestimates productivity losses by 8-15 % for childhood illnesses. Furthermore, during prime working years, use of average wages will underestimate productivity losses by 14 %. We suggest that to achieve more precise estimates of productivity losses, age-specific wages should become the standard analytic approach.
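
    As a rough illustration of the comparison described in this abstract, the sketch below computes human-capital productivity losses with a flat average wage versus a crudely invented hump-shaped age-specific wage profile; the wage figures, profile shape, discount rate, and retirement age are assumptions, not the published ONS data.

```python
def lifetime_loss(onset_age, wage_at_age, retirement_age=65, discount=0.03):
    """Present value of earnings lost from onset_age until retirement (human capital approach)."""
    return sum(wage_at_age(age) / (1.0 + discount) ** t
               for t, age in enumerate(range(onset_age, retirement_age)))

def average_wage(age, flat=30000):
    return flat                                    # single average wage applied at every age

def age_specific_wage(age):
    if age < 18:
        return 0.0                                 # no earnings before working age
    return max(0.0, 42000 - 25 * (age - 45) ** 2)  # invented hump-shaped earnings profile

for onset in (5, 40):                              # childhood onset vs prime working years
    flat = lifetime_loss(onset, average_wage)
    specific = lifetime_loss(onset, age_specific_wage)
    print(f"onset at {onset}: average-wage loss {flat:,.0f}, "
          f"age-specific loss {specific:,.0f}, ratio {flat / specific:.2f}")
```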

  2. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
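
    The weighting idea is straightforward to sketch: turn a per-model skill metric into normalized weights and compare the weighted ensemble mean against the equal-weight mean. The projections and skill scores below are invented placeholders, not CMIP5 or CERES-derived values.

```python
import numpy as np

projections = np.array([2.8, 3.4, 4.1, 2.2, 3.0])   # per-model regional projections (e.g., degC)
skill       = np.array([0.9, 0.4, 0.7, 0.2, 0.8])   # process-based metric, higher = better

equal_weighted  = projections.mean()
weights         = skill / skill.sum()               # normalize the metric into weights
metric_weighted = np.sum(weights * projections)

print(f"equal-weighted mean : {equal_weighted:.2f}")
print(f"metric-weighted mean: {metric_weighted:.2f}")
```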

  3. Effects of temporal averaging on short-term irradiance variability under mixed sky conditions

    NASA Astrophysics Data System (ADS)

    Lohmann, Gerald M.; Monahan, Adam H.

    2018-05-01

    Characterizations of short-term variability in solar radiation are required to successfully integrate large numbers of photovoltaic power systems into the electrical grid. Previous studies have used ground-based irradiance observations with a range of different temporal resolutions and a systematic analysis of the effects of temporal averaging on the representation of variability is lacking. Using high-resolution surface irradiance data with original temporal resolutions between 0.01 and 1 s from six different locations in the Northern Hemisphere, we characterize the changes in representation of temporal variability resulting from time averaging. In this analysis, we condition all data to states of mixed skies, which are the most potentially problematic in terms of local PV power volatility. Statistics of clear-sky index k* and its increments Δk*τ (i.e., normalized surface irradiance and changes therein over specified intervals of time) are considered separately. Our results indicate that a temporal averaging time scale of around 1 s marks a transition in representing single-point irradiance variability, such that longer averages result in substantial underestimates of variability. Higher-resolution data increase the complexity of data management and quality control without appreciably improving the representation of variability. The results do not show any substantial discrepancies between locations or seasons.
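
    A minimal way to see the averaging effect described here is to block-average a high-resolution clear-sky-index series and compare the spread of its increments over a fixed lag. The series below is a synthetic bounded random walk standing in for measured irradiance; the window lengths and lag are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
k_star = np.clip(0.8 + np.cumsum(rng.normal(0.0, 0.02, 3600)), 0.1, 1.2)  # synthetic 1 Hz clear-sky index

def increment_std(series, window, lag_seconds):
    """Std of clear-sky-index increments over lag_seconds after block-averaging to `window` samples."""
    m = len(series) // window
    averaged = series[: m * window].reshape(m, window).mean(axis=1)
    lag = max(1, lag_seconds // window)
    inc = averaged[lag:] - averaged[:-lag]
    return inc.std()

for window in (1, 10, 60):                     # averaging scales in seconds (1 Hz data)
    print(f"{window:>3} s averaging -> std of 60 s increments = {increment_std(k_star, window, 60):.4f}")
```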

  4. Preliminary results of the large experimental wind turbine phase of the national wind energy program

    NASA Technical Reports Server (NTRS)

    Thomas, R. L.; Sholes, T.; Sholes, J. E.

    1975-01-01

    The preliminary results of two projects in the development phase of reliable wind turbines designed to supply cost-competitive electrical energy were discussed. An experimental 100 kW wind turbine design and its status are first reviewed. The results of two parallel design studies for determining the configurations and power levels for wind turbines with minimum energy costs are also discussed. These studies predict wind energy costs of 1.5 to 7 cents per kW-h for wind turbines produced in quantities of 100 to 1000 per year and located at sites having average winds of 12 to 18 mph.

  5. Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato

    2018-02-01

    This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise while the number of image data available for image analysis is decreased. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. According to the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.

  6. In-situ and path-averaged measurements of aerosol optical properties

    NASA Astrophysics Data System (ADS)

    van Binsbergen, Sven A.; Grossmann, Peter; February, Faith J.; Cohen, Leo H.; van Eijk, Alexander M. J.; Stein, Karin U.

    2017-09-01

    This paper compares in-situ and path-averaged measurements of the electro-optical transmission, with emphasis on aerosol effects. The in-situ sensors consisted of optical particle counters (OPC); the path-averaged data were provided by a 7-wavelength transmissometer (MSRT) and scintillometers (BLS). Data were collected at two sites: a homogeneous test site in Northern Germany, and over the inhomogeneous False Bay near Cape Town, South Africa. A retrieval algorithm was developed to infer characteristics of the aerosol size distribution (Junge approximation) from the MSRT data. A comparison of the various sensors suggests that the optical particle counters are overly optimistic in their estimate of the transmission. For the homogeneous test site, in-situ and path-averaged sensors yield similar results. For the inhomogeneous test site, sensors may react differently, or at different times, to meteorological events such as a change in wind speed and/or direction.

  7. National Hospital Input Price Index

    PubMed Central

    Freeland, Mark S.; Anderson, Gerard; Schendler, Carol Ellen

    1979-01-01

    The national community hospital input price index presented here isolates the effects of prices of goods and services required to produce hospital care and measures the average percent change in prices for a fixed market basket of hospital inputs. Using the methodology described in this article, weights for various expenditure categories were estimated and proxy price variables associated with each were selected. The index is calculated for the historical period 1970 through 1978 and forecast for 1979 through 1981. During the historical period, the input price index increased an average of 8.0 percent a year, compared with an average rate of increase of 6.6 percent for overall consumer prices. For the period 1979 through 1981, the average annual increase is forecast at between 8.5 and 9.0 percent. Using the index to deflate growth in expenses, the level of real growth in expenditures per inpatient day (net service intensity growth) averaged 4.5 percent per year with considerable annual variation related to government and hospital industry policies. PMID:10309052
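
    The fixed-market-basket construction summarized above is, in essence, a Laspeyres-type index: base-period expenditure shares are held fixed and applied to price relatives for each input category. The categories, shares, and price changes below are invented for illustration and are not the index's actual weights.

```python
weights = {"wages": 0.55, "supplies": 0.20, "energy": 0.05, "capital": 0.20}          # base-period expenditure shares
price_relatives = {"wages": 1.07, "supplies": 1.09, "energy": 1.15, "capital": 1.06}  # P_t / P_0 per category

index = 100.0 * sum(weights[c] * price_relatives[c] for c in weights)  # fixed-weight index, base = 100
print(f"hospital input price index: {index:.1f}")
```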

  8. National hospital input price index.

    PubMed

    Freeland, M S; Anderson, G; Schendler, C E

    1979-01-01

    The national community hospital input price index presented here isolates the effects of prices of goods and services required to produce hospital care and measures the average percent change in prices for a fixed market basket of hospital inputs. Using the methodology described in this article, weights for various expenditure categories were estimated and proxy price variables associated with each were selected. The index is calculated for the historical period 1970 through 1978 and forecast for 1979 through 1981. During the historical period, the input price index increased an average of 8.0 percent a year, compared with an average rate of increase of 6.6 percent for overall consumer prices. For the period 1979 through 1981, the average annual increase is forecast at between 8.5 and 9.0 per cent. Using the index to deflate growth in expenses, the level of real growth in expenditures per inpatient day (net service intensity growth) averaged 4.5 percent per year with considerable annual variation related to government and hospital industry policies.

  9. Aperture averaging in strong oceanic turbulence

    NASA Astrophysics Data System (ADS)

    Gökçe, Muhsin Caner; Baykal, Yahya

    2018-04-01

    Receiver aperture averaging is employed in underwater wireless optical communication (UWOC) systems to mitigate the effects of oceanic turbulence and thus improve system performance. The irradiance flux variance is a measure of the intensity fluctuations on a lens of the receiver aperture. Using the modified Rytov theory, which uses small-scale and large-scale spatial filters, and our previously presented expression that gives the atmospheric structure constant in terms of oceanic turbulence parameters, we evaluate the irradiance flux variance and the aperture averaging factor of a spherical wave in strong oceanic turbulence. Variations in the irradiance flux variance are examined versus the oceanic turbulence parameters and the receiver aperture diameter. The effect of the receiver aperture diameter on the aperture averaging factor is also presented for strong oceanic turbulence.
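
    For reference, the aperture averaging factor discussed here is conventionally defined as the ratio of the irradiance flux variance collected by an aperture of diameter D to the scintillation index of a point receiver (a standard definition, not a result specific to this paper):

```latex
A(D) = \frac{\sigma_I^{2}(D)}{\sigma_I^{2}(0)}, \qquad 0 < A(D) \le 1 ,
```

    so smaller values of A(D) indicate stronger mitigation of intensity fluctuations by the larger aperture.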

  10. Averages of b-hadron, c-hadron, and τ-lepton properties as of summer 2016

    DOE PAGES

    Amhis, Y.; Banerjee, Sw.; Ben-Haim, E.; ...

    2017-12-21

    This article reports world averages of measurements of b-hadron, c-hadron, and τ-lepton properties obtained by the Heavy Flavor Averaging Group using results available through summer 2016. For the averaging, common input parameters used in the various analyses are adjusted (rescaled) to common values, and known correlations are taken into account. The averages include branching fractions, lifetimes, neutral meson mixing parameters, CP violation parameters, parameters of semileptonic decays, and Cabibbo–Kobayashi–Maskawa matrix elements.

  11. Averages of b-hadron, c-hadron, and τ-lepton properties as of summer 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amhis, Y.; Banerjee, Sw.; Ben-Haim, E.

    This article reports world averages of measurements of b-hadron, c-hadron, and τ-lepton properties obtained by the Heavy Flavor Averaging Group using results available through summer 2016. For the averaging, common input parameters used in the various analyses are adjusted (rescaled) to common values, and known correlations are taken into account. The averages include branching fractions, lifetimes, neutral meson mixing parameters, CP violation parameters, parameters of semileptonic decays, and Cabibbo–Kobayashi–Maskawa matrix elements.

  12. Cycle-averaged dynamics of a periodically driven, closed-loop circulation model

    NASA Technical Reports Server (NTRS)

    Heldt, T.; Chang, J. L.; Chen, J. J. S.; Verghese, G. C.; Mark, R. G.

    2005-01-01

    Time-varying elastance models have been used extensively in the past to simulate the pulsatile nature of cardiovascular waveforms. Frequently, however, one is interested in dynamics that occur over longer time scales, in which case a detailed simulation of each cardiac contraction becomes computationally burdensome. In this paper, we apply circuit-averaging techniques to a periodically driven, closed-loop, three-compartment recirculation model. The resultant cycle-averaged model is linear and time invariant, and greatly reduces the computational burden. It is also amenable to systematic order reduction methods that lead to further efficiencies. Despite its simplicity, the averaged model captures the dynamics relevant to the representation of a range of cardiovascular reflex mechanisms. © 2004 Elsevier Ltd. All rights reserved.

  13. Heuristic approach to capillary pressures averaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coca, B.P.

    1980-10-01

    Several methods are available for averaging capillary pressure curves. Among these are the J-curve and regression equations of the wetting-fluid saturation on porosity and permeability (capillary pressure held constant). While the regression equations seem completely empirical, the J-curve method seems theoretically sound because its expression is based on a relation between the average capillary radius and the permeability-porosity ratio. An analysis is given of each of these methods.

  14. Holter Ridge Thinning Study, Redwood National Park: Preliminary Results of a 25-Year Retrospective

    Treesearch

    Andrew J. Chittick; Christopher R. Keyes

    2007-01-01

    Redwood National Park comprises large areas of overstocked stands resulting from harvest of the old-growth stands from the late 1940s to the 1970s. The Holter Ridge Thinning Study was initiated in 1978 to address this problem and examine the effects that thinning to varying spacings would have on forest development. Densities following thinning in 1979 ranged from...

  15. Regretting Ever Starting to Smoke: Results from a 2014 National Survey.

    PubMed

    Nayak, Pratibha; Pechacek, Terry F; Slovic, Paul; Eriksen, Michael P

    2017-04-06

    Background: The majority of smokers regret ever starting to smoke, yet the vast majority continue to smoke despite the fact that smoking kills nearly 50% of lifetime users. This study examined the relationships between regret and smoker characteristics, quit history, risk perceptions, experiential thinking, and beliefs and intentions at time of smoking initiation. Methods: Data from the 2014 Tobacco Products and Risk Perceptions Survey, a nationally representative survey of United States adults, were analyzed to provide the latest prevalence estimates of regret and potential predictors. Relationships among predictor variables and regret were analyzed using correlations, t-tests, and multinomial logistic regression. Results: The majority of smokers (71.5%) regretted starting to smoke. Being older and non-Hispanic white were significant predictors of regret. Smokers having a high intention to quit, having made quit attempts in the past year, worrying about getting lung cancer, believing smoking every day can be risky for your health, perceiving a risk of being diagnosed with lung cancer during one's lifetime, and considering themselves addicted to cigarettes were significant predictors of regret for smoking initiation. Conclusions: This study provides updated prevalence data on regret using a national sample, and confirms that regret is associated with perceived risk. The findings from this study can be used to inform smoking intervention programs and support the inclusion of smoker regret in cost-benefit analyses of the economic impact of tobacco regulations.

  16. Community attitudes to remunerated blood donation in Australia: results from a national telephone survey.

    PubMed

    Bambrick, Hilary; Gallego, Gisselle

    2013-10-01

    Blood in Australia is sourced through voluntary, non-remunerated donations. With periodic shortages in supply, increasing demand for blood products and a donor base that is perceived to be unsustainable, remuneration has been proposed as a means to improve donation rates. To examine community attitudes to remunerated blood donation in Australia. A national random telephone survey of Australian adults aged 18-70 was conducted (n = 1024). Associations were tested using a chi-square (χ²) test for linear distribution. Reimbursement for the cost of travel to donate blood was supported by more respondents (46%) than reimbursement for time (19%). Non-donors were more likely to support a payment compared to donors (P = 0.002). Twelve per cent of respondents thought they would be more likely to donate if remunerated while 10% thought they would be less likely. The majority (76%) thought that a payment would not change whether or not they would donate, while 85% thought that it would make other people more likely to donate. The average amount considered to be reasonable reimbursement was AU$30. Despite the common perception that other people would be motivated to donate blood with the introduction of a financial incentive, remuneration may provide minimal incentive in Australia and is unlikely to increase donor participation for the time being. © 2013 The Authors. Transfusion Medicine © 2013 British Blood Transfusion Society.

  17. The Speech Intelligibility Index and the Pure-Tone Average as Predictors of Lexical Ability in Children Fit with Hearing Aids

    ERIC Educational Resources Information Center

    Stiles, Derek J.; Bentler, Ruth A.; McGregor, Karla K.

    2012-01-01

    Purpose: To determine whether a clinically obtainable measure of audibility, the aided Speech Intelligibility Index (SII; American National Standards Institute, 2007), is more sensitive than the pure-tone average (PTA) at predicting the lexical abilities of children who wear hearing aids (CHA). Method: School-age CHA and age-matched children with…

  18. Improving clinical practice in stroke through audit: results of three rounds of National Stroke Audit.

    PubMed

    Irwin, P; Hoffman, A; Lowe, D; Pearson, M; Rudd, A G

    2005-08-01

    The results of three rounds of National Stroke Audit in England, Wales and Northern Ireland are compared. Audit of the organization of stroke services and retrospective case-note audit of up to 40 consecutive cases admitted per hospital over a 3-month period was conducted in each of 1998, 1999 and 2001/02. The changes in the organizational, case-mix and process results of the hospitals that had participated in all three rounds were analysed. 60% of all eligible trusts from England, Wales and Northern Ireland took part in all three audits in 1998, 1999 and 2001/02. Total numbers of cases were 4996, 4841 and 5152, respectively. Case-mix variables were similar over the three rounds. Mortality at 7 and 30 days fell by 3% and 5%, respectively. The proportion of hospitals with a stroke unit rose from 48% to 77%. The proportion of patients spending most of their stay in a stroke unit rose from 17% in 1998 to 26% in 1999 and 29% in 2001/02. Improvements achieved in process standards of care between 1998 and 1999 (median change was a gain of 9%) failed to improve further by 2001/02 (median change was 0%). In all three rounds process standards of care tended to be better in stroke units. Three rounds of national audit of stroke care have shown standards of care on stroke units were notably higher than on general wards. Slowing in the rise of the proportion managed on stroke units mirrors the slow down in improvement to overall national standards of care. To further improve outcomes and national standards of stroke care a much higher proportion of patients needs to be managed in stroke units.

  19. National Enhanced Elevation Assessment at a glance

    USGS Publications Warehouse

    Snyder, Gregory I.

    2012-01-01

    Elevation data are essential for hazards mitigation, conservation, infrastructure development, national security, and many other applications. Under the leadership of the U.S. Geological Survey and the member States of the National Digital Elevation Program (NDEP), Federal agencies, State agencies, and others work together to acquire high-quality elevation data for the United States and its territories. New elevation data are acquired using modern technology to replace elevation data that are, on average, more than 30 years old. Through the efforts of the NDEP, a project-by-project data acquisition approach resulted in improved, publicly available data for 28 percent of the conterminous United States and 15 percent of Alaska over the past 15 years. Although the program operates efficiently, the rate of data collection and the typical project specifications are currently insufficient to address the needs of government, the private sector, and other organizations. The National Enhanced Elevation Assessment was conducted to (1) document national-level requirements for improved elevation data, (2) estimate the benefits and costs of meeting those requirements, and (3) evaluate multiple national-level program-implementation scenarios. The assessment was sponsored by the NDEP's member agencies. The study participants came from 34 Federal agencies, agencies from all 50 States, selected local government and Tribal offices, and private and not-for-profit organizations. A total of 602 mission-critical activities were identified that need significantly more accurate data than are currently available. The results of the assessment indicate that a national-level enhanced-elevation-data program has the potential to generate from $1.2 billion to $13 billion in new benefits annually.

  20. Growth of Jobs with Above Average Earnings Projected at All Education Levels. Issues in Labor Statistics. Summary 94-2.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, Washington, DC.

    The Bureau of Labor Statistics projects national employment to grow by almost 26.4 million over the 1992-2005 period. The majority of these new jobs will be in higher-paying occupations. Entry requirements of the new jobs in occupations having above-average earnings will range from no more than a high school education to a bachelor's degree or…

  1. Comparison of food consumption in Indian adults between national and sub-national dietary data sources.

    PubMed

    Aleksandrowicz, Lukasz; Tak, Mehroosh; Green, Rosemary; Kinra, Sanjay; Haines, Andy

    2017-04-01

    Accurate data on dietary intake are important for public health, nutrition and agricultural policy. The National Sample Survey is widely used by policymakers in India to estimate nutritional outcomes in the country, but has not been compared with other dietary data sources. To assess relative differences across available Indian dietary data sources, we compare intake of food groups across six national and sub-national surveys between 2004 and 2012, representing various dietary intake estimation methodologies, including Household Consumption Expenditure Surveys (HCES), FFQ, food balance sheets (FBS), and 24-h recall (24HR) surveys. We matched data for relevant years, regions and economic groups, for ages 16-59. One set of national HCES and the 24HR showed a decline in food intake in India between 2004-2005 and 2011-2012, whereas another HCES and FBS showed an increase. Differences in intake were smallest between the two HCES (1 % relative difference). Relative to these, FFQ and FBS had higher intake (13 and 35 %), and the 24HR lower intake (-9 %). Cereal consumption had high agreement across comparisons (average 5 % difference), whereas fruit and nuts, eggs, meat and fish and sugar had the least (120, 119, 56 and 50 % average differences, respectively). Spearman's coefficients showed high correlation of ranked food group intake across surveys. The underlying methods of the compared data highlight possible sources of under- or over-estimation, and influence their relevance for addressing various research questions and programmatic needs.

  2. 7 CFR 1410.44 - Average adjusted gross income.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Average adjusted gross income. 1410.44 Section 1410... Average adjusted gross income. (a) Benefits under this part will not be available to persons or legal entities whose average adjusted gross income exceeds $1,000,000 or as further specified in part 1400...

  3. 7 CFR 1410.44 - Average adjusted gross income.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 10 2011-01-01 2011-01-01 false Average adjusted gross income. 1410.44 Section 1410... Average adjusted gross income. (a) Benefits under this part will not be available to persons or legal entities whose average adjusted gross income exceeds $1,000,000 or as further specified in part 1400...

  4. Hydrologic response of desert wetlands to Holocene climate change: preliminary results from the Soda Springs area, Mojave National Preserve, California

    USGS Publications Warehouse

    Pigati, Jeffrey S.; Reheis, Marith C.; McGeehin, John P.; Honke, Jeffrey S.; Bright, J.

    2016-01-01

    Desert wetlands are common features in arid environments and include a variety of hydrologic facies, including seeps, springs, marshes, wet meadows, ponds, and spring pools. Wet ground conditions and dense stands of vegetation in these settings combine to trap eolian, alluvial, and fluvial sediments that accumulate over time. The resulting deposits are collectively called ground-water discharge (GWD) deposits, and contain information on how small desert watersheds responded to climate change in the past. Most GWD studies in the southwestern U.S. have focused on the late Pleistocene because the Holocene was too dry to support the extensive wetland systems that were so pervasive just a few millennia earlier. Here we describe the results of a pilot project that involves coring extant wetlands and analyzing the sedimentology and microfauna of the recovered sediment to infer Holocene hydrologic conditions. In 2011, a series of cores were taken near wetlands situated along the western margin of the Soda Lake basin in the Mojave National Preserve of southern California. The core sediments appear to show that the wetlands responded to the relatively minor climate fluctuations that characterized the Holocene. However, our analysis was limited by relatively low sediment recovery (which averaged only 70-80%) and a general paucity of datable materials in the cores. Additional studies aimed at improving recovery and developing new techniques for concentrating plant microfossils (plant remains that are <150 μm in diameter) for radiocarbon dating are ongoing.

  5. A straightforward method to compute average stochastic oscillations from data samples.

    PubMed

    Júlvez, Jorge

    2015-10-19

    Many biological systems exhibit sustained stochastic oscillations in their steady state. Assessing these oscillations is usually a challenging task due to the potential variability of the amplitude and frequency of the oscillations over time. As a result of this variability, when several stochastic replications are averaged, the oscillations are flattened and can be overlooked. This can easily lead to the erroneous conclusion that the system reaches a constant steady state. This paper proposes a straightforward method to detect and assess stochastic oscillations. The basis of the method is the use of polar coordinates for systems with two species, and cylindrical coordinates for systems with more than two species. By slightly modifying these coordinate systems, it is possible to compute the total angular distance run by the system and the average Euclidean distance to a reference point. This allows us to compute confidence intervals, both for the average angular speed and for the distance to a reference point, from a set of replications. The use of polar (or cylindrical) coordinates provides a new perspective of the system dynamics. The mean trajectory that can be obtained by averaging the usual cartesian coordinates of the samples informs about the trajectory of the center of mass of the replications. In contrast to such a mean cartesian trajectory, the mean polar trajectory can be used to compute the average circular motion of those replications, and therefore, can yield evidence about sustained steady state oscillations. Both the coordinate transformation and the computation of confidence intervals can be carried out efficiently. This results in an efficient method to evaluate stochastic oscillations.
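
    A compact version of the polar-coordinate idea can be sketched as follows: generate replications of a noisy two-species oscillation with random phases, note that the point-by-point cartesian mean flattens the oscillation, and then recover the average radius and angular speed about a reference point. The oscillator, noise level, and reference point below are invented; the paper's confidence-interval machinery is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 2000)
reference = np.array([5.0, 5.0])                       # reference point (e.g., deterministic steady state)

replications = []
for _ in range(50):                                    # noisy oscillations with random phases
    phase = rng.uniform(0.0, 2.0 * np.pi)
    x = 5.0 + 2.0 * np.cos(t + phase) + rng.normal(0.0, 0.2, t.size)
    y = 5.0 + 2.0 * np.sin(t + phase) + rng.normal(0.0, 0.2, t.size)
    replications.append(np.stack([x, y], axis=1))
replications = np.array(replications)                  # shape: (replications, time, species)

cartesian_mean = replications.mean(axis=0)             # oscillation is averaged away
print("std of cartesian-mean x(t):", round(float(cartesian_mean[:, 0].std()), 3))

rel = replications - reference                         # polar view about the reference point
radius = np.linalg.norm(rel, axis=2)
angle = np.unwrap(np.arctan2(rel[..., 1], rel[..., 0]), axis=1)
angular_speed = (angle[:, -1] - angle[:, 0]) / (t[-1] - t[0])

print("mean radius        :", round(float(radius.mean()), 3))
print("mean angular speed :", round(float(angular_speed.mean()), 3))
```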

  6. The Inaccuracy of National Character Stereotypes

    PubMed Central

    McCrae, Robert R.; Chan, Wayne; Jussim, Lee; De Fruyt, Filip; Löckenhoff, Corinna E.; De Bolle, Marleen; Costa, Paul T.; Hřebíčková, Martina; Graf, Sylvie; Realo, Anu; Allik, Jüri; Nakazato, Katsuharu; Shimonaka, Yoshiko; Yik, Michelle; Ficková, Emília; Brunner-Sciarra, Marina; Reátigui, Norma; de Figueora, Nora Leibovich; Schmidt, Vanina; Ahn, Chang-kyu; Ahn, Hyun-nie; Aguilar-Vafaie, Maria E.; Siuta, Jerzy; Szmigielska, Barbara; Cain, Thomas R.; Crawford, Jarret T.; Mastor, Khairul Anwar; Rolland, Jean-Pierre; Nansubuga, Florence; Miramontez, Daniel R.; Benet-Martínez, Veronica; Rossier, Jérôme; Bratko, Denis; Marušić, Iris; Halberstadt, Jamin; Yamaguchi, Mami; Knežević, Goran; Purić, Danka; Martin, Thomas A.; Gheorghiu, Mirona; Smith, Peter B.; Barbaranelli, Claudio; Wang, Lei; Shakespeare-Finch, Jane; Lima, Margarida P.; Klinkosz, Waldemar; Sekowski, Andrzej; Alcalay, Lidia; Simonetti, Franco; Avdeyeva, Tatyana V.; Pramila, V. S.; Terracciano, Antonio

    2013-01-01

    Consensual stereotypes of some groups are relatively accurate, whereas others are not. Previous work suggesting that national character stereotypes are inaccurate has been criticized on several grounds. In this article we (a) provide arguments for the validity of assessed national mean trait levels as criteria for evaluating stereotype accuracy; and (b) report new data on national character in 26 cultures from descriptions (N=3,323) of the typical male or female adolescent, adult, or old person in each. The average ratings were internally consistent and converged with independent stereotypes of the typical culture member, but were weakly related to objective assessments of personality. We argue that this conclusion is consistent with the broader literature on the inaccuracy of national character stereotypes. PMID:24187394

  7. Averaging in SU(2) open quantum random walk

    NASA Astrophysics Data System (ADS)

    Clement, Ampadu

    2014-03-01

    We study the average position and the symmetry of the distribution in the SU(2) open quantum random walk (OQRW). We show that the average position in the central limit theorem (CLT) is non-uniform compared with the average position in the non-CLT. The symmetry of distribution is shown to be even in the CLT.

  8. Kinetic energy equations for the average-passage equation system

    NASA Technical Reports Server (NTRS)

    Johnson, Richard W.; Adamczyk, John J.

    1989-01-01

    Important kinetic energy equations derived from the average-passage equation sets are documented, with a view to their interrelationships. These kinetic equations may be used for closing the average-passage equations. The turbulent kinetic energy transport equation used is formed by subtracting the mean kinetic energy equation from the averaged total instantaneous kinetic energy equation. The aperiodic kinetic energy equation, averaged steady kinetic energy equation, averaged unsteady kinetic energy equation, and periodic kinetic energy equation, are also treated.
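
    In the simplest Reynolds-averaged setting, the subtraction described above (mean kinetic energy removed from the averaged total kinetic energy) corresponds to the familiar decomposition below; this is a generic statement of the idea, not the average-passage formulation itself:

```latex
u_i = \overline{u}_i + u_i', \qquad
\overline{\tfrac{1}{2} u_i u_i}
  = \tfrac{1}{2}\,\overline{u}_i\,\overline{u}_i
  + \underbrace{\tfrac{1}{2}\,\overline{u_i' u_i'}}_{k}
```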

  9. Determining average yarding distance.

    Treesearch

    Roger H. Twito; Charles N. Mann

    1979-01-01

    Emphasis on environmental and esthetic quality in timber harvesting has brought about increased use of complex boundaries of cutting units and a consequent need for a rapid and accurate method of determining the average yarding distance and area of these units. These values, needed for evaluation of road and landing locations in planning timber harvests, are easily and...

  10. Average Revisited in Context

    ERIC Educational Resources Information Center

    Watson, Jane; Chick, Helen

    2012-01-01

    This paper analyses the responses of 247 middle school students to items requiring the concept of average in three different contexts: a city's weather reported in maximum daily temperature, the number of children in a family, and the price of houses. The mixed but overall disappointing performance on the six items in the three contexts indicates…

  11. Fractures from trampolines: results from a national database, 2002 to 2011.

    PubMed

    Loder, Randall T; Schultz, William; Sabatino, Meagan

    2014-01-01

    No study specifically analyzes trampoline fracture patterns across a large population. The purpose of this study was to determine such patterns. We queried the National Electronic Injury Surveillance System database for trampoline injuries between 2002 and 2011, and the patients were analyzed by age, sex, race, anatomic location of the injury, geographical location of the injury, and disposition from the emergency department (ED). Statistical analyses were performed with SUDAAN 10 software. Estimated expenses were determined using 2010 data. There were an estimated 1,002,735 ED visits for trampoline-related injuries; 288,876 (29.0%) sustained fractures. The average age for those with fractures was 9.5 years; 92.7% were aged 16 years or younger; 51.7% were male, 95.1% occurred at home, and 9.9% were admitted. The fractures were located in the upper extremity (59.9%), lower extremity (35.7%), and axial skeleton (spine, skull/face, rib/sternum) (4.4%-spine 1.0%, skull/face 2.9%, rib/sternum 0.5%). Those in the axial skeleton were older (16.5 y) than the upper extremity (8.7 y) or lower extremity (10.0 y) (P<0.0001) and more frequently male (67.9%). Lower extremity fractures were more frequently female (54.0%) (P<0.0001). The forearm (37%) and elbow (19%) were most common in the upper extremity; elbow fractures were most frequently admitted (20.0%). The tibia/fibula (39.5%) and ankle (31.5%) were most common in the lower extremity; femur fractures were most frequently admitted (57.9%). Cervical (36.4%) and lumbar (24.7%) were most common locations in the spine; cervical fractures were the most frequently admitted (75.6%). The total ED expense for all trampoline injuries over this 10-year period was $1.002 billion and $408 million for fractures. Trampoline fractures most frequently involve the upper extremity followed by the lower extremity, >90% occur in children. The financial burden to society is large. Further efforts for prevention are needed.

  12. Influence of wind speed averaging on estimates of dimethylsulfide emission fluxes

    DOE PAGES

    Chapman, E. G.; Shaw, W. J.; Easter, R. C.; ...

    2002-12-03

    The effect of various wind-speed-averaging periods on calculated DMS emission fluxes is quantitatively assessed. Here, a global climate model and an emission flux module were run in stand-alone mode for a full year. Twenty-minute instantaneous surface wind speeds and related variables generated by the climate model were archived, and corresponding 1-hour-, 6-hour-, daily-, and monthly-averaged quantities calculated. These various time-averaged, model-derived quantities were used as inputs in the emission flux module, and DMS emissions were calculated using two expressions for the mass transfer velocity commonly used in atmospheric models. Results indicate that the time period selected for averaging wind speeds can affect the magnitude of calculated DMS emission fluxes. A number of individual marine cells within the global grid show DMS emission fluxes that are 10-60% higher when emissions are calculated using 20-minute instantaneous model time step winds rather than monthly-averaged wind speeds, and at some locations the differences exceed 200%. Many of these cells are located in the southern hemisphere where anthropogenic sulfur emissions are low and changes in oceanic DMS emissions may significantly affect calculated aerosol concentrations and aerosol radiative forcing.
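
    The sensitivity to the averaging period comes from the nonlinear dependence of the transfer velocity on wind speed: averaging the wind first and then computing the flux is not the same as averaging the instantaneous fluxes. The sketch below assumes a generic quadratic wind-speed dependence (in the spirit of commonly used parameterizations) and a synthetic Weibull wind distribution; it is not either of the paper's two transfer-velocity expressions.

```python
import numpy as np

rng = np.random.default_rng(2)
winds = rng.weibull(2.0, 10000) * 8.0           # synthetic instantaneous 10-m wind speeds (m/s)

def transfer_velocity(u):
    return 0.31 * u ** 2                        # assumed quadratic dependence (illustrative)

mean_of_k = transfer_velocity(winds).mean()     # average of instantaneous transfer velocities
k_of_mean = transfer_velocity(winds.mean())     # transfer velocity from the time-averaged wind

print(f"mean of k(u): {mean_of_k:.2f}")
print(f"k(mean of u): {k_of_mean:.2f}")
print(f"flux underestimate from pre-averaging winds: {1 - k_of_mean / mean_of_k:.0%}")
```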

  13. Model averaging techniques for quantifying conceptual model uncertainty.

    PubMed

    Singh, Abhishek; Mishra, Srikanta; Ruskauff, Greg

    2010-01-01

    In recent years a growing understanding has emerged regarding the need to expand the modeling paradigm to include conceptual model uncertainty for groundwater models. Conceptual model uncertainty is typically addressed by formulating alternative model conceptualizations and assessing their relative likelihoods using statistical model averaging approaches. Several model averaging techniques and likelihood measures have been proposed in the recent literature for this purpose with two broad categories--Monte Carlo-based techniques such as Generalized Likelihood Uncertainty Estimation or GLUE (Beven and Binley 1992) and criterion-based techniques that use metrics such as the Bayesian and Kashyap Information Criteria (e.g., the Maximum Likelihood Bayesian Model Averaging or MLBMA approach proposed by Neuman 2003) and Akaike Information Criterion-based model averaging (AICMA) (Poeter and Anderson 2005). These different techniques can often lead to significantly different relative model weights and ranks because of differences in the underlying statistical assumptions about the nature of model uncertainty. This paper provides a comparative assessment of the four model averaging techniques (GLUE, MLBMA with KIC, MLBMA with BIC, and AIC-based model averaging) mentioned above for the purpose of quantifying the impacts of model uncertainty on groundwater model predictions. Pros and cons of each model averaging technique are examined from a practitioner's perspective using two groundwater modeling case studies. Recommendations are provided regarding the use of these techniques in groundwater modeling practice.
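
    For the criterion-based branch described above, the mechanics reduce to converting information-criterion values into model weights and averaging predictions with them. The AIC values and per-model predictions below are invented; KIC or BIC values would be handled the same way.

```python
import numpy as np

aic = np.array([212.4, 215.1, 210.9, 219.8])    # one criterion value per conceptual model (invented)
predictions = np.array([3.1, 2.6, 3.4, 4.0])    # per-model prediction of a quantity of interest (invented)

delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                         # Akaike-style weights, summing to 1

print("model weights            :", weights.round(3))
print("model-averaged prediction:", round(float(np.sum(weights * predictions)), 2))
```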

  14. Synoptic observations of Jupiter's radio emissions: Average Statistical properties observed by Voyager

    NASA Technical Reports Server (NTRS)

    Alexander, J. K.; Carr, T. D.; Thieman, J. R.; Schauble, J. J.; Riddle, A. C.

    1980-01-01

    Observations of Jupiter's low frequency radio emissions collected over one month intervals before and after each Voyager encounter were analyzed. Compilations of occurrence probability, average power flux density and average sense of circular polarization are presented as a function of central meridian longitude, phase of Io, and frequency. The results are compared with ground based observations. The necessary geometrical conditions and preferred polarization sense for Io-related decametric emission observed by Voyager from above both the dayside and nightside hemispheres are found to be essentially the same as those observed in Earth-based studies. On the other hand, there is a clear local time dependence in the Io-independent decametric emission. Io appears to have an influence on the average flux density of the emission down to below 2 MHz. The average power flux density spectrum of Jupiter's emission has a broad peak near 9 MHz. Integration of the average spectrum over all frequencies gives a total radiated power for an isotropic source of 4 x 10^11 W.

  15. Conversion of National Health Insurance Service-National Sample Cohort (NHIS-NSC) Database into Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM).

    PubMed

    You, Seng Chan; Lee, Seongwon; Cho, Soo-Yeon; Park, Hojun; Jung, Sungjae; Cho, Jaehyeong; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    It is increasingly necessary to generate medical evidence applicable to Asian people compared to those in Western countries. Observational Health Data Sciences and Informatics (OHDSI) is an international collaborative that aims to facilitate generating high-quality evidence by creating and applying open-source data analytic solutions to a large network of health databases across countries. We aimed to incorporate Korean nationwide cohort data into the OHDSI network by converting the national sample cohort into the Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM). The data of 1.13 million subjects were converted to OMOP-CDM, with an average conversion rate of 99.1%. ACHILLES, an open-source OMOP-CDM-based data profiling tool, was run on the converted database to visualize data-driven characterization and assess the quality of the data. The OMOP-CDM version of the National Health Insurance Service-National Sample Cohort (NHIS-NSC) can be a valuable tool for multiple aspects of medical research through incorporation into the OHDSI research network.

  16. Sample Size Bias in Judgments of Perceptual Averages

    ERIC Educational Resources Information Center

    Price, Paul C.; Kimura, Nicole M.; Smith, Andrew R.; Marshall, Lindsay D.

    2014-01-01

    Previous research has shown that people exhibit a sample size bias when judging the average of a set of stimuli on a single dimension. The more stimuli there are in the set, the greater people judge the average to be. This effect has been demonstrated reliably for judgments of the average likelihood that groups of people will experience negative,…

  17. Historical back-barrier shoreline changes, Padre Island National Seashore, Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prouty, J.S.

    1989-09-01

    Historical progradation rates of the Laguna Madre shoreline at Padre Island National Seashore have varied considerably, largely in response to rainfall fluctuations and perhaps grazing pressure. Analysis of aerial photographs indicates that near the northern margin of the National Seashore the shoreline prograded at an average rate of 26 ft/year between 1941 and 1950. The average rate of progradation increased to 42 ft/year between 1950 and 1964. Average rates then slowed in the late 1960s, and since 1969 the shoreline has prograded at an average rate of only 1 ft/year. Some areas of the shoreline are presently eroding. Early maps and accounts suggest that a century ago North Padre Island was largely vegetated. Overgrazing and a series of droughts in the late 19th and early 20th centuries denuded the island, and prevailing winds blew sand westward across the island into Laguna Madre. With higher than average rainfall in the past 2 decades and less grazing, the island has significantly revegetated. Winds now carry less sand to Laguna Madre; reduced sand supply is a major cause of present-day shoreline retreat.

  18. How much e-waste is there in US basements and attics? Results from a national survey.

    PubMed

    Saphores, Jean-Daniel M; Nixon, Hilary; Ogunseitan, Oladele A; Shapiro, Andrew A

    2009-08-01

    The fate of used electronic products (e-waste) is of increasing concern because of their toxicity and the growing volume of e-waste. Addressing these concerns requires developing the recycling infrastructure, but good estimates of the volume of e-waste stored by US households are still unavailable. In this context, we make two contributions based on a national random survey of 2136 US households. First, we explain how much e-waste is stored by US households using count models. Significant explanatory variables include age, marital and employment status, ethnicity, household size, previous e-waste recycling behavior, and to some extent education, home ownership, and understanding the consequences of recycling, but neither income nor knowledge of e-waste recycling laws. Second, we estimate that on average, each US household has 4.1 small (

  19. Primary School Principals and the Purposes of Education in Australia: Results of a National Survey

    ERIC Educational Resources Information Center

    Cranston, Neil; Mulford, Bill; Keating, Jack; Reid, Alan

    2010-01-01

    Purpose: The purpose of this paper is to report the results of a national survey of government primary school principals in Australia, investigating the purposes of education, in terms of the importance and level of enactment of those purposes in schools. Design/methodology/approach: In 2009, an electronic survey was distributed to government…

  20. Wildland fire, risk, and recovery: results of a national survey with regional and racial perspectives

    Treesearch

    J. Michael Bowker; Siew Hoon Lim; H. Ken Cordell; Gary T. Green; Sandra Rideout-Hanzak; Cassandra Y. Johnson

    2008-01-01

    We used a national household survey to examine knowledge, attitudes, and preferences pertaining to wildland fire. First, we present nationwide results and trends. Then, we examine opinions across region and race. Despite some regional variation, respondents are fairly consistent in their beliefs about assuming personal responsibility for living in fire-prone areas and...

  1. Postpartum Depressive symptomatology: results from a two-stage US national survey.

    PubMed

    Tatano Beck, Cheryl; Gable, Robert K; Sakala, Carol; Declercq, Eugene R

    2011-01-01

    Up to 19% of new mothers have major or minor depression sometime during the first 3 months after birth. This article reports on the prevalence of postpartum depressive symptoms and risk factors obtained from a 2-stage US national survey conducted by Childbirth Connection: Listening to Mothers II (LTM II) and Listening to Mothers II Postpartum Survey. The weighted survey results are based on an initial sample of 1573 women (1373 online, 200 telephone interviews) who had given birth in the year prior to the survey and repeat interviews with 902 women (859 online, 44 telephone) 6 months later. Three main instruments were used to collect data: the Postpartum Depression Screening Scale (PDSS), the Patient Health Questionnaire-2 (PHQ-2), and the Posttraumatic Stress Disorder Symptom Scale-Self Report (PSS-SR). Sixty-three percent of the women in the LTM II sample screened positive for elevated postpartum depressive symptoms with the PDSS, and 6 months later 42% of the women in this sample screened positive for elevated postpartum depressive symptoms with the PHQ-2. A stepwise, multiple regression revealed 2 variables that significantly explained 54% of the variance in postpartum depressive symptom scores: posttraumatic stress symptom scores on the PSS-SR and health promoting behaviors of healthy diet, managing stress, rest, and exercise. The high percentage of mothers who screened positive for elevated postpartum depressive symptoms in this 2-stage national survey highlights the need for prevention and routine screening during the postpartum period and follow-up treatment. © 2011 by the American College of Nurse‐Midwives.

  2. What Are Kids Vaping? Results from a National Survey of U.S. Adolescents

    PubMed Central

    Miech, Richard; Patrick, Megan E.; O’Malley, Patrick M.; Johnston, Lloyd D.

    2016-01-01

    Objective To examine what substances U.S. youth vape. Methods Data come from Monitoring the Future, an annual, nationally-representative survey of U.S. 12th, 10th, and 8th grade students. Respondents reported what substance they vaped the last time they used a vaporizer such as an e-cigarette. Results Among students who had ever used a vaporizer, 65–66% last used “just flavoring” in 12th, 10th, and 8th grade, more than all other responses combined. In all three grades the percentage using “just flavoring” was above 57% for males, females, African-Americans, Hispanics, whites, and students both with and without a parent with a college degree. Nicotine use came in a distant second, at about 20% in 12th and 10th grade and 13% in 8th grade. Taking into account youth who vaped nicotine at last use increases national estimates of tobacco/nicotine prevalence in the past 30 days by 24%–38% above and beyond cigarette smoking, which is substantial but far less than estimates that assume all vaporizer users inhale nicotine. Conclusions These results challenge the common assumption that all vaporizer users inhale nicotine. They (a) call into question the designation of vaporizers and e-cigarettes as ENDS (“Electronic Nicotine Delivery System”), (b) suggest that the recent rise in adolescent vaporizer use does not necessarily indicate a nicotine epidemic, and (c) indicate that vaporizer users can be candidates for primary prevention programs. Finally, the results suggest the importance of developing different rationales for the regulation of vaporizer devices as compared to regulation of substances marketed for vaporizer use. PMID:27562412

  3. Hybrid Reynolds-Averaged/Large Eddy Simulation of a Cavity Flameholder; Assessment of Modeling Sensitivities

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.

    2015-01-01

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. The cases simulated corresponded to those used to examine this flowfield experimentally using particle image velocimetry. A variety of turbulence models were used for the steady-state Reynolds-averaged simulations which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged / large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to formally assess the performance of the hybrid Reynolds-averaged / large eddy simulation modeling approach in a flowfield of interest to the scramjet research community. The numerical errors were quantified for both the steady-state and scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results showed a high degree of variability when comparing the predictions obtained from each turbulence model, with the non-linear eddy viscosity model (an explicit algebraic stress model) providing the most accurate prediction of the measured values. The hybrid Reynolds-averaged/large eddy simulation results were carefully scrutinized to ensure that even the coarsest grid had an acceptable level of resolution for large eddy simulation, and that the time-averaged statistics were acceptably accurate. The autocorrelation and its Fourier transform were the primary tools used for this assessment. The statistics extracted from the hybrid simulation strategy proved to be more accurate than the Reynolds-averaged results obtained using the linear eddy viscosity models. However, there was no predictive improvement noted over the results obtained from the explicit

  4. The National Visitor Use Monitoring methodology and final results for round 1

    Treesearch

    S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold

    2011-01-01

    A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...

  5. Using Flexible Busing to Meet Average Class Size Targets

    ERIC Educational Resources Information Center

    Felt, Andrew J.; Koelemay, Ryan; Richter, Alexander

    2008-01-01

    This article describes a method of flexible redistricting for K-12 public school districts that allows students from the same geographical region to be bused to different schools, with the goal of meeting average class size (ACS) target ranges. Results of a case study on a geographically large school district comparing this method to a traditional…

  6. Hurricanes and anchors: preliminary results from the National Park Service regional reef assessment program

    USGS Publications Warehouse

    Rogers, Caroline S.

    1994-01-01

    The U.S. National Park Service (NPS) began a Regional Assessment Program for coral reefs in the U.S. Virgin Islands and Florida in 1988. Scientists from NPS and six other institutions have now established long-term monitoring sites at Virgin Islands National Park (St. John, USVI), Buck Island Reef National Monument (St. Croix, USVI), Biscayne National Park (Florida), and Fort Jefferson National Monument (Florida). Hurricane Hugo passed through the USVI in 1989, causing severe destruction in some reef areas while leaving others untouched. Patchy damage to reefs in Florida was also noted after Hurricane Andrew; damage from this August 1992 storm is still being assessed. Fort Jefferson National Monument escaped the onslaught of Andrew. No significant recovery in live coral cover has been evident at the Buck Island or Virgin Islands National Park (VINP) study sites 3.5 years after Hurricane Hugo. Similarly, no recovery was evident at another site in St. John which was destroyed by a large anchor 4.5 years ago.

  7. Dynamic time warping-based averaging framework for functional near-infrared spectroscopy brain imaging studies

    NASA Astrophysics Data System (ADS)

    Zhu, Li; Najafizadeh, Laleh

    2017-06-01

    We investigate the problem related to the averaging procedure in functional near-infrared spectroscopy (fNIRS) brain imaging studies. Typically, to reduce noise and to empower the signal strength associated with task-induced activities, recorded signals (e.g., in response to repeated stimuli or from a group of individuals) are averaged through a point-by-point conventional averaging technique. However, due to the existence of variable latencies in recorded activities, the use of the conventional averaging technique can lead to inaccuracies and loss of information in the averaged signal, which may result in inaccurate conclusions about the functionality of the brain. To improve the averaging accuracy in the presence of variable latencies, we present an averaging framework that employs dynamic time warping (DTW) to account for the temporal variation in the alignment of fNIRS signals to be averaged. As a proof of concept, we focus on the problem of localizing task-induced active brain regions. The framework is extensively tested on experimental data (obtained from both block design and event-related design experiments) as well as on simulated data. In all cases, it is shown that the DTW-based averaging technique outperforms the conventional averaging technique in estimating the location of task-induced active regions in the brain, suggesting that such advanced averaging methods should be employed in fNIRS brain imaging studies.
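    The contrast between point-by-point averaging and DTW-based averaging can be sketched as follows. This is a minimal illustration of the general idea, not the authors' framework: the textbook O(nm) DTW, the choice of reference trial, and the toy signals with artificial latency shifts are all assumptions.

```python
# Minimal sketch of DTW-based averaging: each trial is aligned to a reference trial with
# dynamic time warping before averaging, instead of point-by-point averaging of
# latency-shifted responses.
import numpy as np

def dtw_path(x, y):
    """Classic O(len(x)*len(y)) DTW; returns the optimal warping path as (i, j) pairs."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (x[i - 1] - y[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack from the end of both signals
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def dtw_average(trials, reference):
    """Warp every trial onto the reference's time axis, then average."""
    aligned = []
    for trial in trials:
        path = dtw_path(reference, trial)
        warped = np.zeros_like(reference, dtype=float)
        counts = np.zeros_like(reference, dtype=float)
        for i, j in path:
            warped[i] += trial[j]
            counts[i] += 1
        aligned.append(warped / np.maximum(counts, 1))
    return np.mean(aligned, axis=0)

# Toy example: the same response with variable latencies
t = np.linspace(0, 1, 200)
trials = [np.exp(-((t - 0.5 - lag) ** 2) / 0.005) for lag in (-0.08, 0.0, 0.08)]
naive = np.mean(trials, axis=0)                  # point-by-point average (broadened, lower peak)
dtw_avg = dtw_average(trials, reference=trials[1])
print(naive.max(), dtw_avg.max())                # the DTW average preserves peak amplitude better
```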

  8. Socio-demographic predictors and average annual rates of caesarean section in Bangladesh between 2004 and 2014

    PubMed Central

    Khan, Md. Nuruzzaman; Islam, M. Mofizul; Shariff, Asma Ahmad; Alam, Md. Mahmudul; Rahman, Md. Mostafizur

    2017-01-01

    Background Globally, the rates of caesarean section (CS) have steadily increased in recent decades. This rise is not fully accounted for by increases in clinical factors which indicate the need for CS. We investigated the socio-demographic predictors of CS and the average annual rates of CS in Bangladesh between 2004 and 2014. Methods Data were derived from four waves of the nationally representative Bangladesh Demographic and Health Survey (BDHS) conducted between 2004 and 2014. Rate of change analysis was used to calculate the average annual rate of increase in CS from 2004 to 2014, by socio-demographic categories. Multi-level logistic regression was used to identify the socio-demographic predictors of CS in a cross-sectional analysis of the 2014 BDHS data. Results CS rates increased from 3.5% in 2004 to 23% in 2014. The average annual rate of increase in CS was higher among women of advanced maternal age (≥35 years), from urban areas and relatively high socio-economic backgrounds, with higher education, and who regularly accessed antenatal services. The multi-level logistic regression model indicated that lower (≤19) and advanced maternal age (≥35), urban location, relatively high socio-economic status, higher education, having few children (≤2), antenatal healthcare visits, and being overweight or obese were the key factors associated with increased utilization of CS. Underweight was a protective factor for CS. Conclusion The use of CS has increased considerably in Bangladesh over the survey years. This rising trend and the risk of having CS vary significantly across regions and socio-economic status. Very high use of CS among women of relatively high socio-economic status and substantial urban-rural difference call for public awareness and practice guideline enforcement aimed at optimizing the use of CS. PMID:28493956
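    The paper's exact rate-of-change analysis is not reproduced here; one common way to express an average annual rate of increase between two survey waves is a compound (geometric) annual growth rate, sketched below using the national figures quoted above.

```python
# Illustrative only: a compound (geometric) annual growth rate between two survey waves.
# The "rate of change analysis" used in the paper may differ in its exact form.
def average_annual_rate(p_start, p_end, years):
    """Average annual growth rate implied by prevalence p_start -> p_end over `years`."""
    return (p_end / p_start) ** (1.0 / years) - 1.0

# National CS rates reported above: 3.5% in 2004, 23% in 2014
rate = average_annual_rate(0.035, 0.23, 10)
print(f"average annual rate of increase ~ {rate:.1%} per year")   # ~20.7% per year
```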

  9. The First National Study of Neighborhood Parks

    PubMed Central

    Cohen, Deborah A.; Han, Bing; Nagel, Catherine; Harnik, Peter; McKenzie, Thomas L.; Evenson, Kelly R.; Marsh, Terry; Williamson, Stephanie; Vaughan, Christine; Katta, Sweatha

    2016-01-01

    Introduction An extensive infrastructure of neighborhood parks supports leisure time physical activity in most U.S. cities; yet, most Americans do not meet national guidelines for physical activity. Neighborhood parks have never been assessed nationally to identify their role in physical activity. Methods Using a stratified multistage sampling strategy, a representative sample of 174 neighborhood parks in 25 major cities (population >100,000) across the U.S. was selected. Park use, park-based physical activity, and park conditions were observed during a typical week using systematic direct observation during spring/summer of 2014. Park administrators were interviewed to assess policies and practices. Data were analyzed in 2014–2015 using repeated-measure negative binomial regressions to estimate weekly park use and park-based physical activity. Results Nationwide, the average neighborhood park of 8.8 acres averaged 23 users/hour or an estimated 1,533 person hours of weekly use. Walking loops and gymnasia each generated 221 hours/week of moderate to vigorous physical activity. Seniors represented 4% of park users, but 20% of the general population. Parks were used less in low-income than in high-income neighborhoods, largely explained by fewer supervised activities and marketing/outreach efforts. Programming and marketing were associated with 37% and 63% more hours of moderate to vigorous physical activity/week in parks, respectively. Conclusions The findings establish national benchmarks for park use, which can guide future park investments and management practices to improve population health. Offering more programming, using marketing tools like banners and posters, and installing facilities like walking loops may help currently underutilized parks increase population physical activity. PMID:27209496

  10. Runoff and leaching of metolachlor from Mississippi River alluvial soil during seasons of average and below-average rainfall.

    PubMed

    Southwick, Lloyd M; Appelboom, Timothy W; Fouss, James L

    2009-02-25

    The movement of the herbicide metolachlor [2-chloro-N-(2-ethyl-6-methylphenyl)-N-(2-methoxy-1-methylethyl)acetamide] via runoff and leaching from 0.21 ha plots planted to corn on Mississippi River alluvial soil (Commerce silt loam) was measured for a 6-year period, 1995-2000. The first three years received normal rainfall (30 year average); the second three years experienced reduced rainfall. The 4-month periods prior to application plus the following 4 months after application were characterized by 1039 +/- 148 mm of rainfall for 1995-1997 and by 674 +/- 108 mm for 1998-2000. During the normal rainfall years 216 +/- 150 mm of runoff occurred during the study seasons (4 months following herbicide application), accompanied by 76.9 +/- 38.9 mm of leachate. For the low-rainfall years these amounts were 16.2 +/- 18.2 mm of runoff (92% less than the normal years) and 45.1 +/- 25.5 mm of leachate (41% less than the normal seasons). Runoff of metolachlor during the normal-rainfall seasons was 4.5-6.1% of application, whereas leaching was 0.10-0.18%. For the below-normal periods, these losses were 0.07-0.37% of application in runoff and 0.22-0.27% in leachate. When averages over the three normal and the three less-than-normal seasons were taken, a 35% reduction in rainfall was characterized by a 97% reduction in runoff loss and a 71% increase in leachate loss of metolachlor on a percent of application basis. The data indicate an increase in preferential flow in the leaching movement of metolachlor from the surface soil layer during the reduced rainfall periods. Even with increased preferential flow through the soil during the below-average rainfall seasons, leachate loss (percent of application) of the herbicide remained below 0.3%. Compared to the average rainfall seasons of 1995-1997, the below-normal seasons of 1998-2000 were characterized by a 79% reduction in total runoff and leachate flow and by a 93% reduction in corresponding metolachlor movement via these routes

  11. Declining average daily census. Part 1: Implications and options.

    PubMed

    Weil, T P

    1985-12-01

    A national trend toward declining average daily (inpatient) census (ADC) started in late 1982 even before the Medicare prospective payment system began. The decrease in total days will continue despite an increasing number of aged persons in the U.S. population. This decline could have been predicted from trends during 1978 to 1983, such as increasing available beds but decreasing occupancy, 100 percent increases in hospital expenses, and declining lengths of stay. Assuming that health care costs will remain a relatively fixed part of the gross national product and no major medical advances will occur in the next five years, certain implications and options exist for facilities experiencing a declining ADC. This article discusses several considerations: Attempts to improve market share; Reduction of full-time equivalent employees; Impact of greater acuity of illness among remaining inpatients; Implications of increasing the number of physicians on medical staffs; Option of a closed medical staff by clinical specialty; Unbundling with not-for-profit and profit-making corporations; Review of mergers, consolidations, and multihospital systems to decide when this option is most appropriate; Sale of a not-for-profit hospital to an investor-owned chain, with implications facing Catholic hospitals choosing this option; Impact and difficulty of developing meaningful alternative health care systems with the hospital's medical staff; Special problems of teaching hospitals; The social issue of the hospital shifting from the community's health center to a cost center; and Increased turnover of hospital CEOs. With these in mind, institutions can then focus on solutions that can sometimes be used in tandem to resolve this problem's impact. The second part of this article will discuss some of them.

  12. Mothers Working Outside the Home: What Do National Assessment Results Tell Us?

    ERIC Educational Resources Information Center

    Anderson, Bernice; And Others

    National Assessment of Educational Progress (NAEP) data show that children in grades 4, 8, and 11 whose mothers work outside the home read better than children whose mothers do not work outside the home--but the difference is small. This conclusion represents one segment of the findings of the 1983-84 National Assessment, which focused on reading…

  13. Public library consumer health information pilot project: results of a National Library of Medicine evaluation

    PubMed Central

    Wood, Fred B.; Lyon, Becky; Schell, Mary Beth; Kitendaugh, Paula; Cid, Victor H.; Siegel, Elliot R.

    2000-01-01

    In October 1998, the National Library of Medicine (NLM) launched a pilot project to learn about the role of public libraries in providing health information to the public and to generate information that would assist NLM and the National Network of Libraries of Medicine (NN/LM) in learning how best to work with public libraries in the future. Three regional medical libraries (RMLs), eight resource libraries, and forty-one public libraries or library systems from nine states and the District of Columbia were selected for participation. The pilot project included an evaluation component that was carried out in parallel with project implementation. The evaluation ran through September 1999. The results of the evaluation indicated that participating public librarians were enthusiastic about the training and information materials provided as part of the project and that many public libraries used the materials and conducted their own outreach to local communities and groups. Most libraries applied the modest funds to purchase additional Internet-accessible computers and/or upgrade their health-reference materials. However, few of the participating public libraries had health information centers (although health information was perceived as a top-ten or top-five topic of interest to patrons). Also, the project generated only minimal usage of NLM's consumer health database, known as MEDLINEplus, from the premises of the monitored libraries (patron usage from home or office locations was not tracked). The evaluation results suggested a balanced follow-up by NLM and the NN/LM, with a few carefully selected national activities, complemented by a package of targeted activities that, as of January 2000, are being planned, developed, or implemented. The results also highlighted the importance of building an evaluation component into projects like this one from the outset, to assure that objectives were met and that evaluative information was available on a timely basis, as was

  14. Public library consumer health information pilot project: results of a National Library of Medicine evaluation.

    PubMed

    Wood, F B; Lyon, B; Schell, M B; Kitendaugh, P; Cid, V H; Siegel, E R

    2000-10-01

    In October 1998, the National Library of Medicine (NLM) launched a pilot project to learn about the role of public libraries in providing health information to the public and to generate information that would assist NLM and the National Network of Libraries of Medicine (NN/LM) in learning how best to work with public libraries in the future. Three regional medical libraries (RMLs), eight resource libraries, and forty-one public libraries or library systems from nine states and the District of Columbia were selected for participation. The pilot project included an evaluation component that was carried out in parallel with project implementation. The evaluation ran through September 1999. The results of the evaluation indicated that participating public librarians were enthusiastic about the training and information materials provided as part of the project and that many public libraries used the materials and conducted their own outreach to local communities and groups. Most libraries applied the modest funds to purchase additional Internet-accessible computers and/or upgrade their health-reference materials. However, few of the participating public libraries had health information centers (although health information was perceived as a top-ten or top-five topic of interest to patrons). Also, the project generated only minimal usage of NLM's consumer health database, known as MEDLINEplus, from the premises of the monitored libraries (patron usage from home or office locations was not tracked). The evaluation results suggested a balanced follow-up by NLM and the NN/LM, with a few carefully selected national activities, complemented by a package of targeted activities that, as of January 2000, are being planned, developed, or implemented. The results also highlighted the importance of building an evaluation component into projects like this one from the outset, to assure that objectives were met and that evaluative information was available on a timely basis, as was

  15. National stereotypes of older people's competence are related to older adults' participation in paid and volunteer work.

    PubMed

    Bowen, Catherine E; Skirbekk, Vegard

    2013-11-01

    Why are older people perceived as more competent in some countries relative to others? In the current study, we investigate the extent to which national variation in perceptions of older people's competence is systematically related to national variation in the extent to which older people participate in paid and volunteer work. We used multilevel regression to analyze data from the European Social Survey and test the relationship between perceptions of older people's competence and older people's participation in paid and volunteer work across 28 countries. We controlled for a number of potentially confounding variables, including life expectancy as well as the gender ratio and average education of the older population in each country. We controlled for the average objective cognitive abilities of the older population in a subsample of 11 countries. Older people were perceived as more competent in countries in which more older people participated in paid or volunteer work, independent of life expectancy and the average education, gender makeup, and average cognitive abilities of the older population. The results suggest that older people's participation in paid and volunteer work is related to perceptions of older people's competence independent of older people's actual competence.

  16. LANDSAT-4 horizon scanner full orbit data averages

    NASA Technical Reports Server (NTRS)

    Stanley, J. P.; Bilanow, S.

    1983-01-01

    Averages taken over full orbit data spans of the pitch and roll residual measurement errors of the two conical Earth sensors operating on the LANDSAT 4 spacecraft are described. The variability of these full orbit averages over representative data throughout the year is analyzed to demonstrate the long-term stability of the sensor measurements. The data analyzed consist of 23 segments of sensor measurements made at 2 to 4 week intervals. Each segment is roughly 24 hours in length. The variation of the full orbit average as a function of orbit within a day and as a function of day of year is examined. The dependence on day of year is based on associating the start date of each segment with the mean full orbit average for the segment. The peak-to-peak and standard deviation values of the averages for each data segment are computed and their variation with day of year is also examined.

  17. Similarity-based distortion of visual short-term memory is due to perceptual averaging.

    PubMed

    Dubé, Chad; Zhou, Feng; Kahana, Michael J; Sekuler, Robert

    2014-03-01

    A task-irrelevant stimulus can distort recall from visual short-term memory (VSTM). Specifically, reproduction of a task-relevant memory item is biased in the direction of the irrelevant memory item (Huang & Sekuler, 2010a). The present study addresses the hypothesis that such effects reflect the influence of neural averaging under conditions of uncertainty about the contents of VSTM (Alvarez, 2011; Ball & Sekuler, 1980). We manipulated subjects' attention to relevant and irrelevant study items whose similarity relationships were held constant, while varying how similar the study items were to a subsequent recognition probe. On each trial, subjects were shown one or two Gabor patches, followed by the probe; their task was to indicate whether the probe matched one of the study items. A brief cue told subjects which Gabor, first or second, would serve as that trial's target item. Critically, this cue appeared either before, between, or after the study items. A distributional analysis of the resulting mnemometric functions showed an inflation in probability density in the region spanning the spatial frequency of the average of the two memory items. This effect, due to an elevation in false alarms to probes matching the perceptual average, was diminished when cues were presented before both study items. These results suggest that (a) perceptual averages are computed obligatorily and (b) perceptual averages are relied upon to a greater extent when item representations are weakened. Implications of these results for theories of VSTM are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks

    PubMed Central

    Lin, Lin; Ma, Shiwei; Ma, Maode

    2014-01-01

    Clock synchronization is a very important issue for the applications of wireless sensor networks. The sensors need to keep a strict clock so that users can know exactly what happens in the monitoring area at the same time. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood averaging. Each sensor node collects the offset and skew rate of its neighbors. Group averages of the offset and skew rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated values back to the neighbors. The propagation delay is considered and compensated. An analytical analysis of offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock. PMID:25120163
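    A minimal sketch of the group-averaging idea follows. It is not the protocol's message exchange or compensation scheme: the node structure, the propagation-delay handling, and the toy three-node topology are all simplifying assumptions.

```python
# Minimal sketch: each node collects clock offset and skew estimates from its neighbors,
# forms a group average, and nudges its own clock toward that average, rather than
# averaging pairwise with one neighbor at a time.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    offset: float              # clock offset [s] relative to an ideal reference
    skew: float                # clock skew rate [ppm]
    neighbors: List["Node"] = field(default_factory=list)

    def sync_step(self, prop_delay=0.0):
        """One synchronization round using the group neighborhood average."""
        group = self.neighbors + [self]
        # Compensate an (assumed known) propagation delay on neighbor readings
        avg_offset = sum(n.offset - (prop_delay if n is not self else 0.0) for n in group) / len(group)
        avg_skew = sum(n.skew for n in group) / len(group)
        self.offset, self.skew = avg_offset, avg_skew

# Toy network: a line of three nodes
a, b, c = Node(0.010, 40.0), Node(-0.004, -10.0), Node(0.002, 5.0)
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]
for _ in range(10):
    for node in (a, b, c):
        node.sync_step()
print([round(n.offset, 6) for n in (a, b, c)])   # offsets converge toward a consensus value
```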

  19. Appropriateness of selecting different averaging times for modelling chronic and acute exposure to environmental odours

    NASA Astrophysics Data System (ADS)

    Drew, G. H.; Smith, R.; Gerard, V.; Burge, C.; Lowe, M.; Kinnersley, R.; Sneath, R.; Longhurst, P. J.

    Odour emissions are episodic, characterised by periods of high emission rates, interspersed with periods of low emissions. It is frequently the short-term, high-concentration peaks that result in annoyance in the surrounding population. Dispersion modelling is accepted as a useful tool for odour impact assessment, and two approaches can be adopted. The first approach, modelling the hourly average concentration, can underestimate the short-term concentration peaks that cause annoyance and complaints. The second modelling approach involves the use of short averaging times. This study assesses the appropriateness of using different averaging times to model the dispersion of odour from a landfill site. We also examine perception of odour in the community in conjunction with the modelled odour dispersal, by using community monitors to record incidents of odour. The results show that with the shorter averaging times, the modelled pattern of dispersal reflects the pattern of observed odour incidents recorded in the community monitoring database, with the modelled odour dispersing further in a north-easterly direction. Therefore, the current regulatory method of dispersion modelling, using hourly averaging times, is less successful at capturing peak concentrations, and does not capture the pattern of odour emission as indicated by the community monitoring database. The use of short averaging times is therefore of greater value in predicting the likely nuisance impact of an odour source and in framing appropriate regulatory controls.
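    As a rough illustration of why the averaging time matters, a power-law peak-to-mean correction is often used to relate concentrations at different averaging times; this relation and the exponent below are generic assumptions, not the modelling setup used in the study.

```python
# Illustrative only: a power-law peak-to-mean relation sometimes used to convert a modelled
# hourly mean concentration to a shorter averaging time. The exponent is site- and
# stability-dependent; 0.2 is a placeholder assumption, not a value from this study.
def short_term_concentration(c_long, t_long_s, t_short_s, exponent=0.2):
    """Estimate the concentration at a shorter averaging time from a longer-time mean."""
    return c_long * (t_long_s / t_short_s) ** exponent

c_hourly = 1.0                                        # odour units per m^3, modelled hourly mean
c_peak = short_term_concentration(c_hourly, 3600, 1)
print(f"estimated 1-s peak ~ {c_peak:.2f} ou/m^3")    # ~5.1x the hourly mean with exponent 0.2
```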

  20. Inverse methods for estimating primary input signals from time-averaged isotope profiles

    NASA Astrophysics Data System (ADS)

    Passey, Benjamin H.; Cerling, Thure E.; Schuster, Gerard T.; Robinson, Todd F.; Roeder, Beverly L.; Krueger, Stephen K.

    2005-08-01

    Mammalian teeth are invaluable archives of ancient seasonality because they record along their growth axes an isotopic record of temporal change in environment, plant diet, and animal behavior. A major problem with the intra-tooth method is that intra-tooth isotope profiles can be extremely time-averaged compared to the actual pattern of isotopic variation experienced by the animal during tooth formation. This time-averaging is a result of the temporal and spatial characteristics of amelogenesis (tooth enamel formation), and also results from laboratory sampling. This paper develops and evaluates an inverse method for reconstructing original input signals from time-averaged intra-tooth isotope profiles. The method requires that the temporal and spatial patterns of amelogenesis are known for the specific tooth and uses a minimum length solution of the linear system Am = d, where d is the measured isotopic profile, A is a matrix describing temporal and spatial averaging during amelogenesis and sampling, and m is the input vector that is sought. Accuracy is dependent on several factors, including the total measurement error and the isotopic structure of the measured profile. The method is shown to accurately reconstruct known input signals for synthetic tooth enamel profiles and the known input signal for a rabbit that underwent controlled dietary changes. Application to carbon isotope profiles of modern hippopotamus canines reveals detailed dietary histories that are not apparent from the measured data alone. Inverse methods show promise as an effective means of dealing with the time-averaging problem in studies of intra-tooth isotopic variation.
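    A minimal numerical sketch of the inversion described above is given below. The averaging matrix A here is a simple moving-average operator and the input signal is synthetic; the paper's A encodes the temporal and spatial pattern of amelogenesis and sampling, which is not reproduced.

```python
# Minimal sketch of the inverse problem d = A m: d is the measured (time-averaged) profile,
# A is an averaging matrix, and m is recovered as a minimum-norm least-squares solution.
import numpy as np

n = 60                                    # samples along the tooth growth axis
window = 12                               # assumed averaging window (placeholder)
A = np.zeros((n, n))
for i in range(n):
    j0, j1 = max(0, i - window // 2), min(n, i + window // 2)
    A[i, j0:j1] = 1.0 / (j1 - j0)         # each measurement is a moving average of the input

m_true = 1.0 + 0.5 * np.sign(np.sin(2 * np.pi * np.arange(n) / 30))   # seasonal input signal
d = A @ m_true + np.random.default_rng(1).normal(0, 0.01, n)          # time-averaged, noisy profile

# Minimum-length (minimum-norm) least-squares solution of A m = d
m_est, *_ = np.linalg.lstsq(A, d, rcond=None)
# Compare recovered input with the true input (recovery quality depends on conditioning and noise)
print(float(np.abs(m_est - m_true).mean()))
```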

  1. Statistical strategies for averaging EC50 from multiple dose-response experiments.

    PubMed

    Jiang, Xiaoqi; Kopp-Schneider, Annette

    2015-11-01

    In most dose-response studies, repeated experiments are conducted to determine the EC50 value for a chemical, requiring averaging EC50 estimates from a series of experiments. Two statistical strategies, mixed-effects modeling and the meta-analysis approach, can be applied to estimate average behavior of EC50 values over all experiments by considering the variabilities within and among experiments. We investigated these two strategies in two common cases of multiple dose-response experiments: (a) complete and explicit dose-response relationships are observed in all experiments, and (b) they are observed in only a subset of experiments. In case (a), the meta-analysis strategy is a simple and robust method to average EC50 estimates. In case (b), all experimental data sets can be first screened using the dose-response screening plot, which allows visualization and comparison of multiple dose-response experimental results. As long as more than three experiments provide information about complete dose-response relationships, the experiments that cover incomplete relationships can be excluded from the meta-analysis strategy of averaging EC50 estimates. If there are only two experiments containing complete dose-response information, the mixed-effects model approach is suggested. We subsequently provided a web application for non-statisticians to implement the proposed meta-analysis strategy of averaging EC50 estimates from multiple dose-response experiments.
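    A minimal sketch of the meta-analysis strategy is shown below, using fixed-effect inverse-variance weighting of log-scale EC50 estimates. The numerical values are hypothetical, and the paper's web application may implement additional details (for example, random-effects weighting).

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of EC50 estimates on the log scale.
import numpy as np

# Hypothetical EC50 estimates and standard errors (log10 scale) from four experiments
log_ec50 = np.array([-6.10, -5.95, -6.25, -6.05])
se = np.array([0.08, 0.12, 0.10, 0.09])

weights = 1.0 / se**2
pooled = np.sum(weights * log_ec50) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled log10(EC50) = {pooled:.3f} +/- {pooled_se:.3f}")
print(f"pooled EC50 ~ {10**pooled:.2e} M")
```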

  2. Thermal motion in proteins: Large effects on the time-averaged interaction energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goethe, Martin, E-mail: martingoethe@ub.edu; Rubi, J. Miguel; Fita, Ignacio

    As a consequence of thermal motion, inter-atomic distances in proteins fluctuate strongly around their average values, and hence, also interaction energies (i.e. the pair-potentials evaluated at the fluctuating distances) are not constant in time but exhibit pronounced fluctuations. These fluctuations cause that time-averaged interaction energies do generally not coincide with the energy values obtained by evaluating the pair-potentials at the average distances. More precisely, time-averaged interaction energies behave typically smoother in terms of the average distance than the corresponding pair-potentials. This averaging effect is referred to as the thermal smoothing effect. Here, we estimate the strength of the thermal smoothing effect on the Lennard-Jones pair-potential for globular proteins at ambient conditions using x-ray diffraction and simulation data of a representative set of proteins. For specific atom species, we find a significant smoothing effect where the time-averaged interaction energy of a single atom pair can differ by various tens of cal/mol from the Lennard-Jones potential at the average distance. Importantly, we observe a dependency of the effect on the local environment of the involved atoms. The effect is typically weaker for bulky backbone atoms in beta sheets than for side-chain atoms belonging to other secondary structure on the surface of the protein. The results of this work have important practical implications for protein software relying on free energy expressions. We show that the accuracy of free energy expressions can largely be increased by introducing environment specific Lennard-Jones parameters accounting for the fact that the typical thermal motion of protein atoms depends strongly on their local environment.

  3. Thermal motion in proteins: Large effects on the time-averaged interaction energies

    NASA Astrophysics Data System (ADS)

    Goethe, Martin; Fita, Ignacio; Rubi, J. Miguel

    2016-03-01

    As a consequence of thermal motion, inter-atomic distances in proteins fluctuate strongly around their average values, and hence, also interaction energies (i.e. the pair-potentials evaluated at the fluctuating distances) are not constant in time but exhibit pronounced fluctuations. These fluctuations cause that time-averaged interaction energies do generally not coincide with the energy values obtained by evaluating the pair-potentials at the average distances. More precisely, time-averaged interaction energies behave typically smoother in terms of the average distance than the corresponding pair-potentials. This averaging effect is referred to as the thermal smoothing effect. Here, we estimate the strength of the thermal smoothing effect on the Lennard-Jones pair-potential for globular proteins at ambient conditions using x-ray diffraction and simulation data of a representative set of proteins. For specific atom species, we find a significant smoothing effect where the time-averaged interaction energy of a single atom pair can differ by various tens of cal/mol from the Lennard-Jones potential at the average distance. Importantly, we observe a dependency of the effect on the local environment of the involved atoms. The effect is typically weaker for bulky backbone atoms in beta sheets than for side-chain atoms belonging to other secondary structure on the surface of the protein. The results of this work have important practical implications for protein software relying on free energy expressions. We show that the accuracy of free energy expressions can largely be increased by introducing environment specific Lennard-Jones parameters accounting for the fact that the typical thermal motion of protein atoms depends strongly on their local environment.
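    The thermal smoothing effect can be illustrated numerically with a short sketch: for a fluctuating inter-atomic distance, the time-averaged Lennard-Jones energy ⟨U(r)⟩ is compared with U(⟨r⟩). The Gaussian fluctuation model and all parameter values below are placeholder assumptions, not values from the paper.

```python
# Minimal numerical illustration of the thermal smoothing effect: the average of the
# Lennard-Jones energy over fluctuating distances, <U(r)>, differs from U(<r>).
import numpy as np

def lennard_jones(r, eps=0.1, sigma=3.5):
    """LJ pair potential; eps in kcal/mol, sigma and r in Angstrom (placeholder values)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6**2 - sr6)

rng = np.random.default_rng(0)
r_mean, r_std = 4.0, 0.3                  # assumed average distance and fluctuation amplitude
r_samples = rng.normal(r_mean, r_std, 100_000)
r_samples = r_samples[r_samples > 3.0]    # keep physically reasonable separations

u_of_mean = lennard_jones(r_mean)
mean_of_u = lennard_jones(r_samples).mean()
print(f"U(<r>) = {u_of_mean:8.4f} kcal/mol")
print(f"<U(r)> = {mean_of_u:8.4f} kcal/mol   (the two generally differ)")
```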

  4. Disparities in tobacco marketing and product availability at the point of sale: Results of a national study.

    PubMed

    Ribisl, Kurt M; D'Angelo, Heather; Feld, Ashley L; Schleicher, Nina C; Golden, Shelley D; Luke, Douglas A; Henriksen, Lisa

    2017-12-01

    Neighborhood socioeconomic and racial/ethnic disparities exist in the amount and type of tobacco marketing at retail, but most studies are limited to a single city or state, and few have examined flavored little cigars. Our purpose is to describe tobacco product availability, marketing, and promotions in a national sample of retail stores and to examine associations with neighborhood characteristics. At a national sample of 2230 tobacco retailers in the contiguous US, we collected in-person store audit data on: Availability of products (e.g., flavored cigars), quantity of interior and exterior tobacco marketing, presence of price promotions, and marketing with youth appeal. Observational data were matched to census tract demographics. Over 95% of stores displayed tobacco marketing; the average store featured 29.5 marketing materials. 75.1% of stores displayed at least one tobacco product price promotion, including 87.2% of gas/convenience stores and 85.5% of pharmacies. 16.8% of stores featured marketing below three feet, and 81.3% of stores sold flavored cigars, both of which appeal to youth. Stores in neighborhoods with the highest (vs. lowest) concentration of African-American residents had more than two times greater odds of displaying a price promotion (OR=2.1) and selling flavored cigars (OR=2.6). Price promotions were also more common in stores located in neighborhoods with more residents under age 18. Tobacco companies use retail marketing extensively to promote their products to current customers and youth, with disproportionate targeting of African Americans. Local, state, and federal policies are needed to counteract this unhealthy retail environment. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Thermal averages in a quantum point contact with a single coherent wave packet.

    PubMed

    Heller, E J; Aidala, K E; LeRoy, B J; Bleszynski, A C; Kalben, A; Westervelt, R M; Maranowski, K D; Gossard, A C

    2005-07-01

    A novel formal equivalence between thermal averages of coherent properties (e.g., conductance) and time averages of a single wave packet arises for Fermi gases and certain geometries. In the case of one open channel in a quantum point contact (QPC), only one wave packet history, with the wave packet width equal to the thermal length, completely determines the thermally averaged conductance. The formal equivalence moreover allows very simple physical interpretations of interference features surviving under thermal averaging. Simply put, pieces of the thermal wave packet returning to the QPC along independent paths must arrive at the same time in order to interfere. Remarkably, one immediate result of this approach is that higher temperature leads to narrower wave packets and therefore better resolution of events in the time domain. In effect, experiments at 4.2 K are performing time-gated experiments at better than a gigahertz. Experiments involving thermally averaged ballistic conductance in 2DEGS are presented as an application of this picture.

  6. Improved simulation of group averaged CO2 surface concentrations using GEOS-Chem and fluxes from VEGAS

    NASA Astrophysics Data System (ADS)

    Chen, Z. H.; Zhu, J.; Zeng, N.

    2013-01-01

    CO2 measurements have been combined with simulated CO2 distributions from a transport model in order to produce the optimal estimates of CO2 surface fluxes in inverse modeling. However, one persistent problem in using model-observation comparisons for this goal relates to the issue of compatibility. Observations at a single site reflect all underlying processes of various scales that usually cannot be fully resolved by model simulations at the grid points nearest the site due to lack of spatial or temporal resolution or missing processes in models. In this article we group site observations of multiple stations according to atmospheric mixing regimes and surface characteristics. The group averaged values of CO2 concentration from model simulations and observations are used to evaluate the regional model results. Using the group averaged measurements of CO2 reduces the noise of individual stations. The difference of group averaged values between observation and modeled results reflects the uncertainties of the large scale flux in the region where the grouped stations are. We compared group averaged values from model simulations driven by two biospheric flux estimates, from the Carnegie-Ames-Stanford-Approach (CASA) model and the VEgetation-Global-Atmosphere-Soil (VEGAS) model, against observations to evaluate the regional model results. Results show that the modeled group averaged CO2 concentrations obtained with fluxes from VEGAS improve significantly for most regions. There are still large differences between the two model results and observations for grouped average values in the North Atlantic, Indian Ocean, and South Pacific Tropics. This implies possible large uncertainties in the fluxes there.
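    The grouping-and-averaging step can be sketched as follows. The station names, group labels, and concentration values are invented for illustration only; the study's mixing-regime definitions are not reproduced here.

```python
# Minimal sketch: stations are grouped by region/mixing regime, and model-minus-observation
# differences are compared on group averages rather than station by station.
import pandas as pd

data = pd.DataFrame({
    "station": ["MLO", "SMO", "BRW", "SPO", "ALT", "KUM"],
    "group":   ["Pacific", "Pacific", "Arctic", "Antarctic", "Arctic", "Pacific"],
    "obs_co2":   [398.1, 396.4, 400.2, 395.8, 400.9, 397.5],   # ppm, hypothetical values
    "model_co2": [397.3, 396.9, 401.5, 395.2, 401.8, 397.0],
})

group_means = data.groupby("group")[["obs_co2", "model_co2"]].mean()
group_means["model_minus_obs"] = group_means["model_co2"] - group_means["obs_co2"]
print(group_means)
```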

  7. German "National Cancer Aid Monitoring" 2015-2019 - study protocol and initial results.

    PubMed

    Schneider, Sven; Görig, Tatiana; Schilling, Laura; Breitbart, Eckhard W; Greinert, Rüdiger; Diehl, Katharina

    2017-09-01

    The National Cancer Aid Monitoring of Tanning Bed Use (NCAM) project is a major German study that aims to observe the most significant risk factors for skin cancer: natural sunlight and artificial UV radiation. NCAM is a nationwide cross-sectional survey that will initially involve four rounds of data collection (so-called waves) between 2015 and 2018. Every year, a representative nationwide sample consisting of 3,000 individuals aged between 14 and 45 years will be surveyed. The cross-sectional survey will be complemented by a panel of n = 450 current tanning bed users. The initial wave in 2015 shows an overall prevalence of tanning bed use of 29.5%. Eleven percent of all participants had used a tanning bed within the past twelve months. Determinants of current tanning bed use included younger age, female gender, and full-time/part-time employment. The main motivations for tanning bed use reported were relaxation and increased attractiveness. NCAM is the first study worldwide to monitor skin cancer risk factors at one-year intervals using a large, nationally representative sample. Initial results indicate that, despite WHO warnings, millions of Germans use tanning beds, and that many of these users are adolescents despite legal restrictions aimed at preventing minors from using tanning beds. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  8. What are kids vaping? Results from a national survey of US adolescents.

    PubMed

    Miech, Richard; Patrick, Megan E; O'Malley, Patrick M; Johnston, Lloyd D

    2017-07-01

    To examine what substances US youth vape. Data come from Monitoring the Future, an annual, nationally representative survey of USA 12th-grade, 10th-grade and 8th-grade students. Respondents reported what substance they vaped the last time they used a vaporiser such as an e-cigarette. Among students who had ever used a vaporiser, 65-66% last used 'just flavouring' in 12th, in 10th and in 8th grade, more than all other responses combined. In all three grades, the percentage using 'just flavouring' was above 57% for males, females, African-Americans, Hispanics, Whites, and students both with and without a parent with a college degree. Nicotine use came in a distant second, at about 20% in 12th and 10th grade and 13% in 8th grade. Taking into account youth who vaped nicotine at last use increases national estimates of tobacco/nicotine prevalence in the past 30 days by 24-38% above and beyond cigarette smoking, which is substantial but far less than estimates that assume all vaporiser users inhale nicotine. These results challenge the common assumption that all vaporiser users inhale nicotine. They (a) call into question the designation of vaporisers and e-cigarettes as ENDS ('Electronic Nicotine Delivery System'), (b) suggest that the recent rise in adolescent vaporiser use does not necessarily indicate a nicotine epidemic, and (c) indicate that vaporiser users can be candidates for primary prevention programmes. Finally, the results suggest the importance of developing different rationales for the regulation of vaporiser devices as compared to the regulation of substances marketed for vaporiser use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. FEL system with homogeneous average output

    DOEpatents

    Douglas, David R.; Legg, Robert; Whitney, R. Roy; Neil, George; Powers, Thomas Joseph

    2018-01-16

    A method of varying the output of a free electron laser (FEL) on very short time scales to produce a slightly broader, but smooth, time-averaged wavelength spectrum. The method includes injecting into an accelerator a sequence of bunch trains at phase offsets from crest and accelerating the particles to full energy, resulting in distinct phase-energy correlations (chirps) on each bunch train that are independently controlled by the choice of phase offset. The earlier trains will be more strongly chirped, and the later trains less chirped. For an energy recovered linac (ERL), the beam may be recirculated using a transport system with linear and nonlinear momentum compactions M.sub.56, which are selected to compress all three bunch trains at the FEL with higher order terms managed.

  10. Preliminary results of the large experimental wind turbine phase of the national wind energy program

    NASA Technical Reports Server (NTRS)

    Thomas, R. L.; Sholes, J. E.

    1975-01-01

    A major phase of the wind energy program is the development of reliable wind turbines for supplying cost-competitive electrical energy. This paper discusses the preliminary results of two projects in this phase of the program. First, an experimental 100 kW wind turbine design and its status are reviewed. Also discussed are the results of two parallel design studies for determining the configurations and power levels for wind turbines with minimum energy costs. These studies show wind energy costs of 7 to 1.5 cents/kWh for wind turbines produced in quantities of 100 to 1000 a year and located at sites having average winds of 12 to 18 mph.

  11. Forecasting sex differences in mortality in high income nations: The contribution of smoking

    PubMed Central

    Pampel, Fred

    2011-01-01

    To address the question of whether sex differences in mortality will in the future rise, fall, or stay the same, this study uses relative smoking prevalence among males and females to forecast future changes in relative smoking-attributed mortality. Data on 21 high income nations from 1975 to 2000 and a lag between smoking prevalence and mortality allow forecasts up to 2020. Averaged across nations, the results for logged male/female ratios in smoking mortality reveal equalization of the sex differential. However, continued divergence in non-smoking mortality rates would counter convergence in smoking mortality rates and lead to future increases in the female advantage overall, particularly in nations at late stages of the cigarette epidemic (such as the United States and the United Kingdom). PMID:21874120

  12. Fundamental techniques for resolution enhancement of average subsampled images

    NASA Astrophysics Data System (ADS)

    Shen, Day-Fann; Chiu, Chui-Wen

    2012-07-01

    Although single image resolution enhancement, otherwise known as super-resolution, is widely regarded as an ill-posed inverse problem, we re-examine the fundamental relationship between a high-resolution (HR) image acquisition module and its low-resolution (LR) counterpart. Analysis shows that partial HR information is attenuated but still exists, in its LR version, through the fundamental averaging-and-subsampling process. As a result, we propose a modified Laplacian filter (MLF) and an intensity correction process (ICP) as the pre- and post-process, respectively, with an interpolation algorithm to partially restore the attenuated information in a super-resolution (SR) enhanced image. Experiments show that the proposed MLF and ICP provide significant and consistent quality improvements on all 10 test images with three well-known interpolation methods including bilinear, bi-cubic, and the SR graphical user interface program provided by Ecole Polytechnique Federale de Lausanne. The proposed MLF and ICP are simple to implement and generally applicable to all average-subsampled LR images. MLF and ICP, separately or together, can be integrated into most interpolation methods that attempt to restore the original HR contents. Finally, the idea of MLF and ICP can also be applied to average-subsampled one-dimensional signals.
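    A minimal sketch of the averaging-and-subsampling model discussed above is given below, together with a generic Laplacian-style pre-emphasis before interpolation. The paper's MLF and ICP are specific designs that are not reproduced here; the pre-emphasis strength, the use of scipy, and the test image are assumptions.

```python
# Minimal sketch: an LR image is formed by block-averaging an HR image; a generic
# high-frequency pre-emphasis is applied before interpolating back to the HR grid.
# The paper's MLF (pre) and ICP (post) would occupy these processing slots.
import numpy as np
from scipy.ndimage import zoom, laplace

def average_subsample(hr, factor=2):
    """LR image formed by block-averaging the HR image (the degradation model above)."""
    h, w = hr.shape
    h, w = h - h % factor, w - w % factor
    return hr[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Smooth synthetic HR test image (a placeholder, not one of the paper's 10 test images)
y, x = np.mgrid[0:64, 0:64]
hr = np.sin(x / 5.0) * np.cos(y / 7.0)

lr = average_subsample(hr, 2)
lr_pre = lr - 0.5 * laplace(lr)            # generic sharpening; 0.5 is an assumed strength
sr = zoom(lr_pre, 2, order=1)              # bilinear-style upsampling back to the HR size

print("reconstruction MSE:", float(np.mean((sr - hr) ** 2)))
```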

  13. Detection of microcystin and other cyanotoxins in lakes at Isle Royale National Park, Pictured Rocks National Lakeshore, and Sleeping Bear Dunes National Lakeshore, northern Michigan, 2012–13

    USGS Publications Warehouse

    Fuller, Lori M.; Brennan, Angela K.; Fogarty, Lisa R.; Loftin, Keith A.; Johnson, Heather E.; VanderMeulen, David D.; Lafrancois, Brenda Moraska

    2017-12-05

    is filtered surface water.Approximately 18 percent of the samples (39 of 211 samples) were analyzed by LC/MS/MS to confirm the ELISA results and to evaluate the samples for a larger suite of algal toxins. In general, the microcystin results between the ELISA and LC/MS/MS methods were similar; although, the ELISA results tended to be slightly higher than the summation of LC/MS/MS microcystin congeners. The slightly higher ELISA results might be because the ELISA microcystin method is reactive with the ADDA functional group common to all microcystins, and because not all microcystin congeners are included in the LC/MS/MS method. The LC/MS/MS method indicated that the congener microcystin-LR was the most frequently detected, followed by microcystin-WR and microcystin-YR.Sixteen of the lakes included in this study also were monitored by the NPS for nutrients. Total phosphorus (TP) concentrations were, on average, highest at the ISRO lakes, whereas total nitrogen (TN) concentrations were highest at SLBE. The average annual TN:TP ratios for the 16 lakes within the national park and national lakeshores ranged from ratios of 20 to 89. Overall, results indicated a slight increase in percentage of microcystin detections with an increase in the TN:TP ratio (R-squared 0.269 and 0.340, respectively [2012 and 2013 combined dataset] derived from linear regression).This study also indicated that even in the absence of visible algal blooms, microcystin may be present. Most microcystin concentrations did not exceed the EPA’s 10-day health advisory drinking-water benchmark. In general, these results provide a useful baseline with which to evaluate potential future changes in algal toxin concentrations.

  14. Drug resistance in Mexico: results from the National Survey on Drug-Resistant Tuberculosis.

    PubMed

    Bojorquez-Chapela, I; Bäcker, C E; Orejel, I; López, A; Díaz-Quiñonez, A; Hernández-Serrato, M I; Balandrano, S; Romero, M; Téllez-Rojo Solís, M M; Castellanos, M; Alpuche, C; Hernández-Ávila, M; López-Gatell, H

    2013-04-01

    To present estimations obtained from a population-level survey conducted in Mexico of prevalence rates of mono-, poly- and multidrug-resistant strains among newly diagnosed cases of pulmonary tuberculosis (TB), as well as the main factors associated with multidrug resistance (combined resistance to isoniazid and rifampicin). Study data came from the National Survey on TB Drug Resistance (ENTB-2008), a nationally representative survey conducted during 2008-2009 in nine states with a stratified cluster sampling design. Samples were obtained for all newly diagnosed cases of pulmonary TB in selected sites. Drug susceptibility testing (DST) was performed for anti-tuberculosis drugs. DST results were obtained for 75% of the cases. Of these, 82.2% (95%CI 79.5-84.7) were susceptible to all drugs. The prevalence of multidrug-resistant TB (MDR-TB) was estimated at 2.8% (95%CI 1.9-4.0). MDR-TB was associated with previous treatment (OR 3.3, 95%CI 1.1-9.4). The prevalence of drug resistance is relatively low in Mexico. ENTB-2008 can be used as a baseline for future follow-up of drug resistance.

  15. Childhood adversity and personality disorders: results from a nationally representative population-based study.

    PubMed

    Afifi, Tracie O; Mather, Amber; Boman, Jonathon; Fleisher, William; Enns, Murray W; Macmillan, Harriet; Sareen, Jitender

    2011-06-01

    Although a large population-based literature exists on the relationship between childhood adversity and Axis I mental disorders, research on the link between childhood adversity and Axis II personality disorders (PDs) relies mainly on clinical samples. The purpose of the current study was to examine the relationship between a range of childhood adversities and PDs in a nationally representative sample while adjusting for Axis I mental disorders. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC; n=34,653; data collection 2004-2005), a nationally representative sample of the United States population aged 20 years and older. The results indicated that many types of childhood adversity were highly prevalent among individuals with PDs in the general population and childhood adversity was most consistently associated with schizotypal, antisocial, borderline, and narcissistic PDs. The most robust childhood adversity findings were for child abuse and neglect with cluster A and cluster B PDs after adjusting for all other types of childhood adversity, mood disorders, anxiety disorders, substance use disorders, other PD clusters, and sociodemographic variables (Odds Ratios ranging from 1.22 to 1.63). In these models, mood disorders, anxiety disorders, and substance use disorders also remained significantly associated with PD clusters (Odds Ratios ranging from 1.26 to 2.38). Further research is necessary to understand whether such exposure has a causal role in the association with PDs. In addition to preventing child maltreatment, it is important to determine ways to prevent impairment among those exposed to adversity, as this may reduce the development of PDs. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Early Impact of a National Multi-Faceted Road Safety Intervention Program in Mexico: Results of a Time-Series Analysis

    PubMed Central

    Chandran, Aruna; Pérez-Núñez, Ricardo; Bachani, Abdulgafoor M.; Híjar, Martha; Salinas-Rodríguez, Aarón; Hyder, Adnan A.

    2014-01-01

    Background In January 2008, a national multifaceted road safety intervention program (IMESEVI) funded by the Bloomberg Philanthropies was launched in Mexico. Two years later, in 2010, IMESEVI was refocused as part of a 10-country international consortium demonstration project (IMESEVI/RS10). We evaluate the initial effects of each phase of the road safety intervention project on numbers of RT crashes, injuries and deaths in Mexico and in the two main target cities of Guadalajara-Zapopan and León. Methods An interrupted time series analysis using autoregressive integrated moving average (ARIMA) modeling was performed using monthly data of rates of RT crashes and injuries (police data), as well as deaths (mortality system data) from 1999–2011 with dummy variables representing each intervention phase. Results In the period following the first intervention phase at the country level and in the city of León, the rate of RT crashes decreased significantly (p<0.05). Notably, following the second intervention phase, although there was no reduction at the country level, there was a decrease in the RT crash rate in both Guadalajara-Zapopan (p = 0.029) and León (p = 0.029). There were no significant differences in the RT injury or death rates following either intervention phase in either city. Conclusion These initial results suggest that a multi-faceted road safety intervention program appears to be effective in reducing road crashes in a middle-income country setting. Further analysis is needed to differentiate the effects of various interventions, and to determine what other economic and political factors might have affected this change. PMID:24498114
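    A minimal sketch of an interrupted time series with ARIMA errors and intervention dummies is shown below using statsmodels. The data are synthetic and the model order is a placeholder; this is not the study's dataset or exact specification.

```python
# Minimal sketch: monthly series with two intervention dummies, fitted with an ARIMA model;
# the coefficients on the dummies estimate the level change after each intervention phase.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
months = pd.period_range("1999-01", "2011-12", freq="M")
phase1 = (months >= pd.Period("2008-01", freq="M")).astype(float)   # first intervention phase
phase2 = (months >= pd.Period("2010-01", freq="M")).astype(float)   # second intervention phase

# Synthetic monthly crash-rate series with a level drop after each phase
y = 10 + rng.normal(0, 0.5, len(months)) - 1.0 * phase1 - 0.8 * phase2

model = ARIMA(y, exog=np.column_stack([phase1, phase2]), order=(1, 0, 0))
res = model.fit()
print(res.params)   # exogenous coefficients recover the assumed level changes approximately
```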

  17. Improved diabetes management in Swedish schools: results from two national surveys.

    PubMed

    Särnblad, Stefan; Åkesson, Karin; Fernström, Lillemor; Ilvered, Rosita; Forsander, Gun

    2017-09-01

    Support in diabetes self-care in school is essential to achieve optimal school performance and metabolic control. Swedish legislation regulating support to children with chronic diseases was strengthened 2009. To compare the results of a national survey conducted 2008 and 2015 measuring parents' and diabetes specialist teams' perceptions of support in school. All pediatric diabetes centers in Sweden were invited to participate in the 2015 study. In each center, families with a child being treated for T1DM and attending preschool class or compulsory school were eligible. The parents' and the diabetes teams' opinions were collected in two separate questionnaires. Forty-one out of 42 eligible diabetes centers participated and 568 parents answered the parental questionnaire in 2015. Metabolic control had improved since the 2008 survey (55.2 ± 10.6 mmol/mol, 7.2% ± 1.0%, in 2015 compared with 61.8 ± 12.4 mmol/mol, 7.8% ± 1.1% in 2008). The proportion of children with a designated staff member responsible for supporting the child's self-care increased from 43% to 59%, (P < .01). An action plan to treat hypoglycemia was present for 65% of the children in 2015 compared with 55% in 2008 (P < .01). More parents were satisfied with the support in 2015 (65% compared with 55%, P < .01). This study shows that staff support has increased and that more parents were satisfied with the support for self-care in school in 2015 compared with 2008. More efforts are needed to implement the national legislation to achieve equal support in all Swedish schools. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Multi-Model Combination techniques for Hydrological Forecasting: Application to Distributed Model Intercomparison Project Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ajami, N K; Duan, Q; Gao, X

    2005-04-11

    This paper examines several multi-model combination techniques: the Simple Multi-model Average (SMA), the Multi-Model Super Ensemble (MMSE), Modified Multi-Model Super Ensemble (M3SE) and the Weighted Average Method (WAM). These model combination techniques were evaluated using the results from the Distributed Model Intercomparison Project (DMIP), an international project sponsored by the National Weather Service (NWS) Office of Hydrologic Development (OHD). All of the multi-model combination results were obtained using uncalibrated DMIP model outputs and were compared against the best uncalibrated as well as the best calibrated individual model results. The purpose of this study is to understand how different combination techniques affect the skill levels of the multi-model predictions. This study revealed that the multi-model predictions obtained from uncalibrated single model predictions are generally better than any single member model predictions, even the best calibrated single model predictions. Furthermore, more sophisticated multi-model combination techniques that incorporated bias correction steps work better than simple multi-model average predictions or multi-model predictions without bias correction.
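    Two of the combination techniques named above, the simple multi-model average and a weighted average fitted by least squares, can be sketched as follows on synthetic data. The weighting scheme (least squares with an intercept) is an assumption; the DMIP implementation details are not reproduced.

```python
# Minimal sketch: combine three biased, noisy "model" simulations into a Simple
# Multi-model Average (SMA) and a Weighted Average (WAM) fitted on a training period.
import numpy as np

rng = np.random.default_rng(0)
obs = np.sin(np.linspace(0, 6, 200)) + 1.5                     # "observed" streamflow proxy
models = np.stack([obs + rng.normal(b, 0.3, obs.size)          # three biased, noisy models
                   for b in (0.4, -0.2, 0.1)])

sma = models.mean(axis=0)                                      # Simple Multi-model Average

# Weighted average: least-squares weights (with intercept) estimated on a training period
train = slice(0, 100)
X = np.vstack([np.ones(100), models[:, train]]).T
coef, *_ = np.linalg.lstsq(X, obs[train], rcond=None)
wam = coef[0] + coef[1:] @ models                              # apply the weights everywhere

rmse = lambda sim: float(np.sqrt(np.mean((sim - obs) ** 2)))
print({"best single": min(rmse(m) for m in models), "SMA": rmse(sma), "WAM": rmse(wam)})
```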

  19. Impact of B-Scan Averaging on Spectralis Optical Coherence Tomography Image Quality before and after Cataract Surgery

    PubMed Central

    Podkowinski, Dominika; Sharian Varnousfaderani, Ehsan; Simader, Christian; Bogunovic, Hrvoje; Philip, Ana-Maria; Gerendas, Bianca S.

    2017-01-01

    Background and Objective To determine optimal image averaging settings for Spectralis optical coherence tomography (OCT) in patients with and without cataract. Study Design/Material and Methods In a prospective study, the eyes were imaged before and after cataract surgery using seven different image averaging settings. Image quality was quantitatively evaluated using signal-to-noise ratio, distinction between retinal layer image intensity distributions, and retinal layer segmentation performance. Measures were compared pre- and postoperatively across different degrees of averaging. Results 13 eyes of 13 patients were included and 1092 layer boundaries analyzed. Preoperatively, increasing image averaging led to a logarithmic growth in all image quality measures up to 96 frames. Postoperatively, increasing averaging beyond 16 images resulted in a plateau without further benefits to image quality. Averaging 16 frames postoperatively provided comparable image quality to 96 frames preoperatively. Conclusion In patients with clear media, averaging 16 images provided optimal signal quality. A further increase in averaging was only beneficial in the eyes with senile cataract. However, prolonged acquisition time and possible loss of details have to be taken into account. PMID:28630764
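    The basic reason frame averaging helps can be illustrated with a short sketch: with N frames of independent noise, the residual noise falls roughly as 1/sqrt(N). The simulation below is an idealization and does not model the clinically observed plateau (eye motion, media opacity, loss of fine detail).

```python
# Minimal illustration: averaging N noisy frames of a constant signal improves SNR ~ sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
signal = np.ones(1000)                            # idealized A-scan reflectivity profile

def averaged_snr(n_frames, noise_sd=0.5):
    frames = signal + rng.normal(0, noise_sd, (n_frames, signal.size))
    avg = frames.mean(axis=0)
    return float(signal.mean() / avg.std())       # residual noise std ~ noise_sd / sqrt(N)

for n in (1, 4, 16, 96):
    print(f"N = {n:3d}  SNR ~ {averaged_snr(n):5.1f}")
```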

  20. 1973 U.S. national roadside breathtesting survey : procedures and results

    DOT National Transportation Integrated Search

    1974-10-01

    Author's abstract: This first U.S. national roadside breathtesting survey was conducted at 185 roadside locations in 18 states. Random samples of 3,698 motorists were stopped between 10PM and 3AM on eight weekends in the fall of 1973. From these driv...

  1. The tidally averaged momentum balance in a partially and periodically stratified estuary

    USGS Publications Warehouse

    Stacey, M.T.; Brennan, Matthew L.; Burau, J.R.; Monismith, Stephen G.

    2010-01-01

    Observations of turbulent stresses and mean velocities over an entire spring-neap cycle are used to evaluate the dynamics of tidally averaged flows in a partially stratified estuarine channel. In a depth-averaged sense, the net flow in this channel is up estuary due to interaction of tidal forcing with the geometry of the larger basin. The depth-variable tidally averaged flow has the form of an estuarine exchange flow (downstream at the surface, upstream at depth) and varies in response to the neap-spring transition. The weakening of the tidally averaged exchange during the spring tides appears to be a result of decreased stratification on the tidal time scale rather than changes in bed stress. The dynamics of the estuarine exchange flow are defined by a balance between the vertical divergence of the tidally averaged turbulent stress and the tidally averaged pressure gradient in the lower water column. In the upper water column, tidal stresses are important contributors, particularly during the neap tides. The usefulness of an effective eddy viscosity in the tidally averaged momentum equation is explored, and it is seen that the effective eddy viscosity on the subtidal time scale would need to be negative to close the momentum balance. This is due to the dominant contribution of tidally varying turbulent momentum fluxes, which have no specific relation to the subtidal circulation. Using a water column model, the validity of an effective eddy viscosity is explored; for periodically stratified water columns, a negative effective viscosity is required. © 2010 American Meteorological Society.
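
    The lower-water-column balance described above can be written compactly. The following LaTeX sketch uses generic notation (angle brackets for the tidal average, primes for turbulent fluctuations) and is a schematic of the stated balance rather than the paper's exact equation set.

```latex
% Tidally averaged along-channel momentum balance in the lower water column:
% the tidally averaged pressure gradient balances the vertical divergence of
% the tidally averaged turbulent stress.
\[
  0 \;\approx\; -\frac{1}{\rho}\,\frac{\partial \langle p \rangle}{\partial x}
  \;-\; \frac{\partial \langle u'w' \rangle}{\partial z}
\]
% The "effective eddy viscosity" on the subtidal time scale discussed in the
% abstract is defined through
\[
  \langle u'w' \rangle \;=\; -\,\nu_{\mathrm{eff}}\,
  \frac{\partial \langle u \rangle}{\partial z},
\]
% and the abstract's finding is that closing the subtidal balance requires a
% negative \nu_{\mathrm{eff}}.
```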

  2. Average expansion rate and light propagation in a cosmological Tardis spacetime

    NASA Astrophysics Data System (ADS)

    Lavinto, Mikko; Räsänen, Syksy; Szybka, Sebastian J.

    2013-12-01

    We construct the first exact statistically homogeneous and isotropic cosmological solution in which inhomogeneity has a significant effect on the expansion rate. The universe is modelled as a Swiss Cheese, with dust FRW background and inhomogeneous holes. We show that if the holes are described by the quasispherical Szekeres solution, their average expansion rate is close to the background under certain rather general conditions. We specialise to spherically symmetric holes and violate one of these conditions. As a result, the average expansion rate at late times grows relative to the background, i.e., backreaction is significant. The holes fit smoothly into the background, but are larger on the inside than a corresponding background domain: we call them Tardis regions. We study light propagation, find the effective equations of state and consider the relation of the spatially averaged expansion rate to the redshift and the angular diameter distance.

  3. On the average configuration of the geomagnetic tail

    NASA Technical Reports Server (NTRS)

    Fairfield, D. H.

    1978-01-01

    Over 3000 hours of IMP-6 magnetic field data obtained between 20 and 33 R_E in the geomagnetic tail have been used in a statistical study of the tail configuration. A distribution of 2.5 minute averages of B_Z as a function of position across the tail reveals that more flux crosses the equatorial plane near the dawn and dusk flanks than near midnight. The tail field projected in the solar magnetospheric equatorial plane deviates from the X axis due to flaring and solar wind aberration by an angle alpha = -0.9 Y_SM - 1.7, where Y_SM is in earth radii and alpha is in degrees. After removing these effects, the Y component of the tail field is found to depend on interplanetary sector structure. During an away sector the B_Y component of the tail field is on average 0.5 gamma greater than that during a toward sector, a result that is true in both tail lobes and is independent of location across the tail.

  4. Conceptual Analysis of System Average Water Stability

    NASA Astrophysics Data System (ADS)

    Zhang, H.

    2016-12-01

    Averaged over time and area, the precipitation in an ecosystem (SAP - system average precipitation) depends on the average surface temperature and relative humidity (RH) in the system if uniform convection is assumed. RH depends on the evapotranspiration of the system (SAE - system average evapotranspiration). There is a non-linear relationship between SAP and SAE. Studying this relationship can lead to a mechanistic understanding of the ecosystem's health status and trend under different setups. If SAP is higher than SAE, the system will have a water runoff which flows out through rivers. If SAP is lower than SAE, irrigation is needed to maintain the vegetation status. This presentation will give a conceptual analysis of the stability of this relationship under different assumed areas, water or forest coverages, elevations and latitudes. The analysis shows that desert is a stable system. Water circulation in basins is also stabilized at a specific SAP based on the basin profile. It further shows that deforestation will reduce SAP, and can flip the system to an irrigation-required status. If no irrigation is provided, the system will automatically reduce to its stable point, desert, which is extremely difficult to turn around.
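
    The classification used in this argument reduces to a comparison of SAP and SAE. The sketch below is purely illustrative (the numbers, units, and function name are assumptions, and the real analysis concerns the stability of the coupled SAP-SAE relationship rather than a single comparison).

```python
def water_balance_status(sap_mm_per_yr, sae_mm_per_yr):
    """Classify a system by comparing system average precipitation (SAP)
    with system average evapotranspiration (SAE)."""
    if sap_mm_per_yr > sae_mm_per_yr:
        return "runoff-producing: excess water leaves through rivers"
    if sap_mm_per_yr < sae_mm_per_yr:
        return "irrigation-dependent: vegetation cannot be sustained by SAP alone"
    return "marginally balanced"

print(water_balance_status(800, 600))   # e.g., a humid, forested basin
print(water_balance_status(300, 450))   # e.g., after deforestation reduces SAP
```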

  5. Making the Transition: Interim Results of the National Guard Youth ChalleNGe Evaluation. [Executive Summary]

    ERIC Educational Resources Information Center

    Millenky, Megan; Bloom, Dan; Dillon, Colleen

    2010-01-01

    Young people who drop out of high school face long odds of success in a labor market that increasingly values education and skills. This report presents interim results from a rigorous, ongoing evaluation of the National Guard Youth ChalleNGe Program, which aims to "reclaim the lives of at-risk youth" who have dropped out of high school. ChalleNGe…

  6. Variation of socioeconomic gradients in children's developmental health across advanced Capitalist societies: analysis of 22 OECD nations.

    PubMed

    Siddiqi, Arjumand; Kawachi, Ichiro; Berkman, Lisa; Subramanian, S V; Hertzman, Clyde

    2007-01-01

    Within societies, there is a well-established relation between socioeconomic position and a wide range of outcomes related to well-being, and this relation is known to vary in magnitude across countries. Using a large sample of nations, the authors explored whether differences in social policies explain differences in socioeconomic gradients across nations. Analyses were conducted on reading literacy in 15-year-olds, as an outcome related to cognitive development and to a host of factors that contribute to future well-being, including educational attainment and health. The results show a systematic variation in socioeconomic gradients and average scores across countries. Scores were favorable in countries with a long history of welfare state regimes, but countries where institutional change unfolded more recently and rapidly, or where welfare states are less well developed, clustered at the bottom of the rankings. Strong support was found for the "flattening up" hypothesis, which suggests that nations with higher average scores have less socioeconomic inequality in scores (or flatter gradients). Potential explanations for the observed patterns include differences between nations in the extent and distribution of income and social goods important for children's development.

  7. A comparative analysis of 9 multi-model averaging approaches in hydrological continuous streamflow simulation

    NASA Astrophysics Data System (ADS)

    Arsenault, Richard; Gatien, Philippe; Renaud, Benoit; Brissette, François; Martel, Jean-Luc

    2015-10-01

    This study aims to test whether a weighted combination of several hydrological models can simulate flows more accurately than the models taken individually. In addition, the project attempts to identify the most efficient model averaging method and the optimal number of models to include in the weighting scheme. In order to address the first objective, streamflow was simulated using four lumped hydrological models (HSAMI, HMETS, MOHYSE and GR4J-6), each of which was calibrated with three different objective functions on 429 watersheds. The resulting 12 hydrographs (4 models × 3 metrics) were weighted and combined with the help of 9 averaging methods: the simple arithmetic mean (SAM), Akaike information criterion (AICA), Bates-Granger (BGA), Bayes information criterion (BICA), Bayesian model averaging (BMA), Granger-Ramanathan average variants A, B and C (GRA, GRB and GRC), and the average by SCE-UA optimization (SCA). The same weights were then applied to the hydrographs in validation mode, and the Nash-Sutcliffe Efficiency metric was measured between the averaged and observed hydrographs. Statistical analyses were performed to compare the accuracy of the weighted methods to that of the individual models. A Kruskal-Wallis test and a multi-objective optimization algorithm were then used to identify the most efficient weighted method and the optimal number of models to integrate. Results suggest that the GRA, GRB, GRC and SCA weighted methods perform better than the individual members. Model averages from these four methods were superior to the best of the individual members in 76% of the cases. Optimal combinations on all watersheds included at least one of each of the four hydrological models. None of the optimal combinations included all members of the ensemble of 12 hydrographs. The Granger-Ramanathan average variant C (GRC) is recommended as the best compromise between accuracy, speed of execution, and simplicity.
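
    The Granger-Ramanathan family of methods obtains the member weights by regressing the observed hydrograph on the member simulations over a calibration period. The sketch below shows an unconstrained least-squares variant and a Nash-Sutcliffe score for the combined validation hydrograph; the synthetic data and the exact variant details (intercept and constraints distinguishing GRA, GRB, and GRC) are assumptions for illustration.

```python
import numpy as np

def granger_ramanathan_weights(simulations, observations):
    """Least-squares weights: regress the observed flows on the member
    simulations (no intercept, unconstrained) over a calibration period."""
    X = np.asarray(simulations).T          # shape (n_times, n_members)
    w, *_ = np.linalg.lstsq(X, np.asarray(observations), rcond=None)
    return w

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe Efficiency of a simulated hydrograph."""
    obs = np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical members with different biases and noise levels
rng = np.random.default_rng(42)
obs = 5 + 3 * np.sin(np.linspace(0, 12, 730))
members = np.stack([obs * b + rng.normal(0, s, obs.size)
                    for b, s in ((0.9, 0.6), (1.1, 0.8), (1.0, 1.2))])
w = granger_ramanathan_weights(members[:, :365], obs[:365])   # calibration half
combined = members[:, 365:].T @ w                             # validation half
print("weights:", np.round(w, 2),
      " validation NSE:", round(nash_sutcliffe(combined, obs[365:]), 3))
```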

  8. Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from the ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0, ROSFOND-2010, CENDL-3.1 and EAF-2010 Evaluated Data Libraries

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Mughabghab, S. F.

    2012-12-01

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
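
    For reference, a Maxwellian-averaged cross section at thermal energy kT is conventionally computed from the pointwise evaluated cross section as shown below; the notation is generic and not necessarily that of the report.

```latex
% Maxwellian-averaged cross section (MACS) at thermal energy kT:
\[
  \langle \sigma \rangle_{kT}
  \;=\; \frac{2}{\sqrt{\pi}\,(kT)^{2}}
  \int_{0}^{\infty} \sigma(E)\, E \, e^{-E/kT}\, \mathrm{d}E
\]
% The corresponding astrophysical reaction rate per particle pair scales as
% v_T \langle \sigma \rangle_{kT}, with thermal velocity v_T = \sqrt{2kT/\mu}
% for reduced mass \mu.
```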

  9. 20 CFR 404.221 - Computing your average monthly wage.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... your average monthly wage, we consider all the wages, compensation, self-employment income, and deemed... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Computing your average monthly wage. 404.221... DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Average-Monthly-Wage Method of Computing...

  10. 20 CFR 404.221 - Computing your average monthly wage.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... your average monthly wage, we consider all the wages, compensation, self-employment income, and deemed... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Computing your average monthly wage. 404.221... DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Average-Monthly-Wage Method of Computing...

  11. Model averaging in linkage analysis.

    PubMed

    Matthysse, Steven

    2006-06-05

    Methods for genetic linkage analysis are traditionally divided into "model-dependent" and "model-independent," but there may be a useful place for an intermediate class, in which a broad range of possible models is considered as a parametric family. It is possible to average over model space with an empirical Bayes prior that weights models according to their goodness of fit to epidemiologic data, such as the frequency of the disease in the population and in first-degree relatives (and correlations with other traits in the pleiotropic case). For averaging over high-dimensional spaces, Markov chain Monte Carlo (MCMC) has great appeal, but it has a near-fatal flaw: it is not possible, in most cases, to provide rigorous sufficient conditions to permit the user safely to conclude that the chain has converged. A way of overcoming the convergence problem, if not of solving it, rests on a simple application of the principle of detailed balance. If the starting point of the chain has the equilibrium distribution, so will every subsequent point. The first point is chosen according to the target distribution by rejection sampling, and subsequent points by an MCMC process that has the target distribution as its equilibrium distribution. Model averaging with an empirical Bayes prior requires rapid estimation of likelihoods at many points in parameter space. Symbolic polynomials are constructed before the random walk over parameter space begins, to make the actual likelihood computations at each step of the random walk very fast. Power analysis in an illustrative case is described. (c) 2006 Wiley-Liss, Inc.
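
    The convergence workaround described above amounts to drawing the chain's first state exactly from the target by rejection sampling and then updating with a transition kernel that satisfies detailed balance, so every subsequent state is also distributed according to the target. The toy sketch below illustrates the idea with a one-dimensional stand-in density; the density, proposal width, and bounds are assumptions and have nothing to do with the actual linkage likelihoods.

```python
import numpy as np

rng = np.random.default_rng(7)

def target_density(x):
    """Unnormalized stand-in for the empirical-Bayes-weighted posterior."""
    return np.exp(-0.5 * (x - 1.0) ** 2) + 0.5 * np.exp(-2.0 * (x + 2.0) ** 2)

def rejection_sample(bound=1.1, lo=-6.0, hi=6.0):
    """Draw the chain's starting point exactly from the target distribution."""
    while True:
        x = rng.uniform(lo, hi)
        if rng.uniform(0.0, bound) < target_density(x):
            return x

def metropolis_step(x, step=0.8):
    """Metropolis update with a symmetric proposal; detailed balance holds,
    so a chain started in equilibrium stays in equilibrium."""
    proposal = x + rng.normal(0.0, step)
    if rng.uniform() < target_density(proposal) / target_density(x):
        return proposal
    return x

x = rejection_sample()
samples = []
for _ in range(5000):
    x = metropolis_step(x)
    samples.append(x)
print("mean of the target estimated from the chain:",
      round(float(np.mean(samples)), 3))
```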

  12. Cruise survey of oxidant air pollution injury to Pinus ponderosa and Pinus jeffreyi in Saguaro National Monument, Yosemite National Park, and Sequoia and Kings Canyon National Parks. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duriscoe, D.M.

    1990-08-01

    The yellow pine populations in Saguaro National Monument, Yosemite National Park, and Sequoia and Kings Canyon National Parks were surveyed in 1986 to evaluate and quantify the extent and severity of ozone injury (chlorotic mottle) to foliage of ponderosa and Jeffrey pines. A total of 3780 trees were observed. Severity of ozone injury was quantified, using an approximate square root transformation of the percentage of foliage exhibiting chlorotic mottle in branches pruned from each tree. Foliage of different ages was examined separately. Of all trees examined at Saguaro National Monument, 15% had visible chlorotic mottle; at Yosemite, 28%; and at Sequoia and Kings Canyon, 39%. Severity of injury averaged very slight for all three parks, with least injury at Saguaro and greatest at Sequoia and Kings Canyon.

  13. Quasi-analytical treatment of spatially averaged radiation transfer in complex terrain

    NASA Astrophysics Data System (ADS)

    Löwe, H.; Helbig, N.

    2012-10-01

    We provide a new quasi-analytical method to compute the subgrid topographic influences on the shortwave radiation fluxes and the effective albedo in complex terrain as required for large-scale meteorological, land surface, or climate models. We investigate radiative transfer in complex terrain via the radiosity equation on isotropic Gaussian random fields. Under controlled approximations we derive expressions for domain-averaged fluxes of direct, diffuse, and terrain radiation and the sky view factor. Domain-averaged quantities can be related to a type of level-crossing probability of the random field, which is approximated by long-standing results developed for acoustic scattering at ocean boundaries. This allows us to express all nonlocal horizon effects in terms of a local terrain parameter, namely, the mean-square slope. Emerging integrals are computed numerically, and fit formulas are given for practical purposes. As an implication of our approach, we provide an expression for the effective albedo of complex terrain in terms of the Sun elevation angle, mean-square slope, the area-averaged surface albedo, and the ratio of atmospheric direct beam to diffuse radiation. For demonstration we compute the decrease of the effective albedo relative to the area-averaged albedo in Switzerland for idealized snow-covered and clear-sky conditions at noon in winter. We find an average decrease of 5.8% and spatial patterns which originate from characteristics of the underlying relief. Limitations and possible generalizations of the method are discussed.

  14. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies.

    PubMed

    Chan Phooi M'ng, Jacinta; Zainudin, Rozaimah

    2016-01-01

    The objective of this research is to examine the trends in the exchange rate markets of the ASEAN-5 countries (Indonesia (IDR), Malaysia (MYR), the Philippines (PHP), Singapore (SGD), and Thailand (THB)) through the application of dynamic moving average trading systems. This research offers evidence of the usefulness of the time-varying volatility technical analysis indicator, Adjustable Moving Average (AMA') in deciphering trends in these ASEAN-5 exchange rate markets. This time-varying volatility factor, referred to as the Efficacy Ratio in this paper, is embedded in AMA'. The Efficacy Ratio adjusts the AMA' to the prevailing market conditions by avoiding whipsaws (losses due, in part, to acting on wrong trading signals, which generally occur when there is no general direction in the market) in range trading and by entering early into new trends in trend trading. The efficacy of AMA' is assessed against other popular moving-average rules. Based on the January 2005 to December 2014 dataset, our findings show that the moving averages and AMA' are superior to the passive buy-and-hold strategy. Specifically, AMA' outperforms the other models for the United States Dollar against PHP (USD/PHP) and USD/THB currency pairs. The results show that different length moving averages perform better in different periods for the five currencies. This is consistent with our hypothesis that a dynamic adjustable technical indicator is needed to cater for different periods in different markets.
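
    The abstract does not define the Efficacy Ratio, so the sketch below substitutes a well-known adaptive scheme (a Kaufman-style efficiency ratio driving the smoothing constant) purely to illustrate what a time-varying-volatility moving average looks like next to a fixed-length simple moving average. The parameter values and synthetic series are assumptions, and this is not the paper's AMA'.

```python
import numpy as np

def simple_moving_average(prices, window=20):
    """Fixed-length simple moving average (one of the popular MA rules)."""
    return np.convolve(prices, np.ones(window) / window, mode="valid")

def adaptive_moving_average(prices, fast=2, slow=30, er_window=10):
    """Adaptive MA: the smoothing constant follows an efficiency-style ratio
    (net move / path length), so it reacts quickly in trends and damps
    whipsaws in range-bound markets."""
    p = np.asarray(prices, dtype=float)
    ama = p.copy()
    sc_fast, sc_slow = 2.0 / (fast + 1), 2.0 / (slow + 1)
    for t in range(er_window, len(p)):
        net_move = abs(p[t] - p[t - er_window])
        path = np.sum(np.abs(np.diff(p[t - er_window:t + 1]))) + 1e-12
        er = net_move / path                       # ~0 ranging, ~1 trending
        sc = (er * (sc_fast - sc_slow) + sc_slow) ** 2
        ama[t] = ama[t - 1] + sc * (p[t] - ama[t - 1])
    return ama

# Hypothetical daily exchange-rate levels
rng = np.random.default_rng(3)
rates = 45.0 + np.cumsum(rng.normal(0.0, 0.2, 500))
print("last SMA:", round(simple_moving_average(rates)[-1], 3),
      " last adaptive MA:", round(adaptive_moving_average(rates)[-1], 3))
```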

  15. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies

    PubMed Central

    2016-01-01

    The objective of this research is to examine the trends in the exchange rate markets of the ASEAN-5 countries (Indonesia (IDR), Malaysia (MYR), the Philippines (PHP), Singapore (SGD), and Thailand (THB)) through the application of dynamic moving average trading systems. This research offers evidence of the usefulness of the time-varying volatility technical analysis indicator, Adjustable Moving Average (AMA′) in deciphering trends in these ASEAN-5 exchange rate markets. This time-varying volatility factor, referred to as the Efficacy Ratio in this paper, is embedded in AMA′. The Efficacy Ratio adjusts the AMA′ to the prevailing market conditions by avoiding whipsaws (losses due, in part, to acting on wrong trading signals, which generally occur when there is no general direction in the market) in range trading and by entering early into new trends in trend trading. The efficacy of AMA′ is assessed against other popular moving-average rules. Based on the January 2005 to December 2014 dataset, our findings show that the moving averages and AMA′ are superior to the passive buy-and-hold strategy. Specifically, AMA′ outperforms the other models for the United States Dollar against PHP (USD/PHP) and USD/THB currency pairs. The results show that different length moving averages perform better in different periods for the five currencies. This is consistent with our hypothesis that a dynamic adjustable technical indicator is needed to cater for different periods in different markets. PMID:27574972

  16. Measuring Time-Averaged Blood Pressure

    NASA Technical Reports Server (NTRS)

    Rothman, Neil S.

    1988-01-01

    Device measures time-averaged component of absolute blood pressure in artery. Includes compliant cuff around artery and external monitoring unit. Ceramic construction in monitoring unit suppresses ebb and flow of pressure-transmitting fluid in sensor chamber. Transducer measures only static component of blood pressure.

  17. Limitations of signal averaging due to temporal correlation in laser remote-sensing measurements.

    PubMed

    Menyuk, N; Killinger, D K; Menyuk, C R

    1982-09-15

    Laser remote sensing involves the measurement of laser-beam transmission through the atmosphere and is subject to uncertainties caused by strong fluctuations due primarily to speckle, glint, and atmospheric-turbulence effects. These uncertainties are generally reduced by taking average values of increasing numbers of measurements. An experiment was carried out to directly measure the effect of signal averaging on back-scattered laser return signals from a diffusely reflecting target using a direct-detection differential-absorption lidar (DIAL) system. The improvement in accuracy obtained by averaging over increasing numbers of data points was found to be smaller than that predicted for independent measurements. The experimental results are shown to be in excellent agreement with a theoretical analysis which considers the effect of temporal correlation. The analysis indicates that small but long-term temporal correlation severely limits the improvement available through signal averaging.
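
    The statistical point can be made with a short calculation: the variance of the mean of n measurements whose fluctuations are correlated is sigma^2/n * (1 + 2 * sum_{k=1}^{n-1} (1 - k/n) * rho_k), which reduces to the familiar sigma^2/n only when all lag correlations rho_k vanish. The sketch below evaluates this for an assumed AR(1)-style correlation; the correlation model and numbers are illustrative, not those measured with the DIAL system.

```python
import numpy as np

def std_of_mean(noise_sd, n, rho):
    """Standard deviation of the average of n measurements whose
    fluctuations have correlation rho**k at lag k (AR(1)-style)."""
    k = np.arange(1, n)
    corr_sum = np.sum((1.0 - k / n) * rho ** k)
    return np.sqrt((noise_sd ** 2 / n) * (1.0 + 2.0 * corr_sum))

sd = 0.30   # illustrative single-shot fractional uncertainty
for n in (1, 10, 100, 1000):
    print(f"n={n:5d}  independent: {sd / np.sqrt(n):.4f}"
          f"   correlated (rho=0.9): {std_of_mean(sd, n, 0.9):.4f}")
```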

  18. Thermal effects in high average power optical parametric amplifiers.

    PubMed

    Rothhardt, Jan; Demmler, Stefan; Hädrich, Steffen; Peschel, Thomas; Limpert, Jens; Tünnermann, Andreas

    2013-03-01

    Optical parametric amplifiers (OPAs) have the reputation of being average power scalable due to the instantaneous nature of the parametric process (zero quantum defect). This Letter reveals serious challenges originating from thermal load in the nonlinear crystal caused by absorption. We investigate these thermal effects in high average power OPAs based on beta barium borate. Absorption of both pump and idler waves is identified to contribute significantly to heating of the nonlinear crystal. A temperature increase of up to 148 K with respect to the environment is observed and mechanical tensile stress up to 40 MPa is found, indicating a high risk of crystal fracture under such conditions. By restricting the idler to a wavelength range far from absorption bands and removing the crystal coating we reduce the peak temperature and the resulting temperature gradient significantly. Guidelines for further power scaling of OPAs and other nonlinear devices are given.

  19. Tropospheric ozone in the Nisqually River Drainage, Mount Rainier National Park

    USGS Publications Warehouse

    Peterson, D.L.; Bowers, Darci

    1999-01-01

    We quantified the summertime distribution of tropospheric ozone in the topographically complex Nisqually River drainage of Mount Rainier National Park from 1994 to 1997. Passive ozone samplers were deployed along an elevational transect ranging from 570 m to 2040 m to measure weekly average ozone concentrations. Weekly average ozone concentrations were positively correlated with elevation, with the highest concentrations consistently measured at the highest sampling site (Panorama Point). Weekly average ozone concentrations at Mount Rainier National Park are considerably higher than those in the Seattle-Tacoma metropolitan area to the west. The anthropogenic contribution to ozone within the Nisqually drainage was evaluated by comparing measurements at this location with measurements from a 'reference' site in the western Olympic Mountains. The comparison suggests there is a significant anthropogenic source of ozone reaching the Cascade Range via atmospheric transport from urban areas to the west. In addition, temporal (week-to-week) variation in ozone distribution is synchronous within the Nisqually drainage, which indicates that subregional patterns are detectable with weekly averages. The Nisqually drainage is likely the 'hot spot' for air pollution in Mount Rainier National Park. By using passive ozone samplers in this drainage in conjunction with a limited number of continuous analyzers, the park will have a robust monitoring approach for measuring tropospheric ozone over time and protecting vegetative and human health.

  20. Succession planning in local health departments: results from a national survey.

    PubMed

    Darnell, Julie S; Campbell, Richard T

    2015-01-01

    Succession planning has received scant attention in the public health sector, despite its potential to generate operational efficiencies in a sector facing chronic budgetary pressures and an aging workforce. We examined the extent to which local health departments (LHDs) are engaged in succession planning and assessed the factors associated with having a succession plan. We conducted a national cross-sectional Web-based survey of workforce recruitment and retention activities in a sample of LHDs responding to the National Association of County & City Health Officials' 2010 Profile Study and then linked these data sets to fit a multivariable logistic regression model to explain why some LHDs have succession plans and others do not. Participants were top executives in a national sample of LHDs, and the main outcome measure was the presence or absence of succession planning. Two hundred twenty-five LHDs responded to the survey, yielding a 43.3% response rate; no statistically significant differences between respondents and nonrespondents were detected. Only 39.5% reported having a succession plan. Performance evaluation activities are more common in LHDs with a succession plan than in LHDs without a plan. In adjusted analyses, the largest LHDs were 7 times more likely to have a succession plan than the smallest. Compared with state-governed LHDs, locally governed LHDs were 3.5 times more likely, and shared-governance LHDs were 6 times more likely, to have a succession plan. Every additional year of experience by the top executive was associated with a 5% increase in the odds of having a succession plan. Local health departments that reported high levels of concern about retaining staff (vs low concern) had 2.5 times higher adjusted odds of having a succession plan. This study provides the first national data on succession planning in LHDs and sheds light on LHDs' readiness to meet the workforce-related accreditation standards.