Sample records for validation study table

  1. Validation of a Task Network Human Performance Model of Driving

    DTIC Science & Technology

    2007-04-01

    Table 23. NASA-TLX scores for study conditions. Table 24. ANOVA for NASA-TLX scores for study conditions (α = 0.05). Table 25. Significant differences between conditions for NASA-TLX in the simulator study. Table 26. ANOVA table for mental demand subscale of NASA-TLX ...

  2. The Dutch motor skills assessment as tool for talent development in table tennis: a reproducibility and validity study.

    PubMed

    Faber, Irene R; Nijhuis-Van Der Sanden, Maria W G; Elferink-Gemser, Marije T; Oosterveld, Frits G J

    2015-01-01

    A motor skills assessment could be helpful in talent development by estimating essential perceptuo-motor skills of young players, which are considered requisite to develop excellent technical and tactical qualities. The Netherlands Table Tennis Association uses a motor skills assessment in its talent development programme consisting of eight items measuring perceptuo-motor skills specific to table tennis under varying conditions. This study aimed to investigate this assessment regarding its reproducibility, internal consistency, underlying dimensions and concurrent validity in 113 young table tennis players (6-10 years). Intraclass correlation coefficients of six test items met the criterion of 0.7, with coefficients of variation between 3% and 8%. Cronbach's alpha was 0.853 for internal consistency. The principal components analysis distinguished two conceptually meaningful factors: "ball control" and "gross motor function." Concurrent validity analyses demonstrated moderate associations between the motor skills assessment's results and national ranking; boys r = -0.53 (P < 0.001) and girls r = -0.45 (P = 0.015). In conclusion, this evaluation demonstrated six test items with acceptable reproducibility, good internal consistency and good prospects for validity. Two test items need revision to upgrade reproducibility. Since the motor skills assessment seems to be a reproducible, objective part of a talent development programme, more longitudinal studies are required to investigate its predictive validity.
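    The internal-consistency statistic reported here (Cronbach's alpha = 0.853) can be computed directly from raw item scores. A minimal sketch in Python; the item scores below are invented for illustration, not taken from the study:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-item score lists: each inner list
    holds one item's scores across all subjects (same subject order)."""
    k = len(item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    item_var_sum = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Two perfectly covarying items give an alpha of (essentially) 1.0
print(cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5]]))
```

    With k items, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); values near 1 indicate the items vary together.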

  3. Does an eye-hand coordination test have added value as part of talent identification in table tennis? A validity and reproducibility study.

    PubMed

    Faber, Irene R; Oosterveld, Frits G J; Nijhuis-Van der Sanden, Maria W G

    2014-01-01

    This study investigated the added value, i.e. discriminative and concurrent validity and reproducibility, of an eye-hand coordination test relevant to table tennis as part of talent identification. Forty-three table tennis players (7-12 years) from national (n = 13), regional (n = 11) and local training centres (n = 19) participated. During the eye-hand coordination test, children needed to throw a ball against a vertically positioned table tennis table with one hand and to catch the ball correctly with the other hand as frequently as possible in 30 seconds. Four different test versions were assessed, varying the distance to the table (1 or 2 meters) and using a tennis or table tennis ball. 'Within session' reproducibility was estimated from the two attempts of the initial tests, and ten youngsters were retested after 4 weeks to estimate 'between sessions' reproducibility. Validity analyses using age as a covariate showed that players from the national and regional centres scored significantly higher than players from the local centre in all test versions (p < 0.05). The tests at 1 meter demonstrated better discriminative ability than those at 2 meters. While all tests but one had a significant positive association with competition outcome after correcting for age influences, the version with a table tennis ball at 1 meter showed the highest association (r = 0.54; p = 0.001). Differences between the first and second attempts were comparable for all test versions (between -8 and +7 repetitions), with ICCs ranging from 0.72 to 0.87. The smallest differences were found for the test with a table tennis ball at 1 meter (between -3 and +3 repetitions). Regarding the psychometric characteristics evaluated, the best test version for talent identification appears to be the one with a table tennis ball at 1 meter. Longitudinal studies are necessary to evaluate the predictive value of this test.

  4. A multisite validation of whole slide imaging for primary diagnosis using standardized data collection and analysis.

    PubMed

    Wack, Katy; Drogowski, Laura; Treloar, Murray; Evans, Andrew; Ho, Jonhan; Parwani, Anil; Montalto, Michael C

    2016-01-01

    Text-based reporting and manual arbitration for whole slide imaging (WSI) validation studies are labor intensive and do not allow for consistent, scalable, and repeatable data collection or analysis. The objective of this study was to establish a method of data capture and analysis using standardized codified checklists and predetermined synoptic discordance tables and to use these methods in a pilot multisite validation study. Fifteen case report form checklists were generated from the College of American Pathology cancer protocols. Prior to data collection, all hypothetical pairwise comparisons were generated, and a level of harm was determined for each possible discordance. Four sites with four pathologists each generated 264 independent reads of 33 cases. Preestablished discordance tables were applied to determine site by site and pooled accuracy, intrareader/intramodality, and interreader intramodality error rates. Over 10,000 hypothetical pairwise comparisons were evaluated and assigned harm in discordance tables. The average difference in error rates between WSI and glass, as compared to ground truth, was 0.75% with a lower bound of 3.23% (95% confidence interval). Major discordances occurred on challenging cases, regardless of modality. The average inter-reader agreement across sites for glass was 76.5% (weighted kappa of 0.68) and for digital it was 79.1% (weighted kappa of 0.72). These results demonstrate the feasibility and utility of employing standardized synoptic checklists and predetermined discordance tables to gather consistent, comprehensive diagnostic data for WSI validation studies. This method of data capture and analysis can be applied in large-scale multisite WSI validations.
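    The inter-reader agreement in this study is summarized with weighted kappa. A minimal sketch of linearly weighted Cohen's kappa in Python; the ratings are invented, and quadratic weights (another common choice) would simply square the weight term:

```python
def weighted_kappa(ratings_a, ratings_b, categories):
    """Cohen's kappa with linear disagreement weights w = |i - j| / (k - 1)."""
    k, n = len(categories), len(ratings_a)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed confusion matrix of paired ratings
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(ratings_a, ratings_b):
        obs[idx[a]][idx[b]] += 1
    # Expected counts from the marginal rating frequencies
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    exp = [[row[i] * col[j] / n for j in range(k)] for i in range(k)]
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w[i][j] * exp[i][j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp

print(weighted_kappa([0, 0, 1, 1], [0, 1, 1, 1], [0, 1]))  # 0.5
```

    Kappa of 1 means perfect agreement; 0 means agreement no better than chance given the two readers' marginal rating frequencies.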

  5. Dietary Screener in the 2009 CHIS: Validation

    Cancer.gov

    In the Eating at America's Table Study and the Observing Protein and Energy Nutrition Study, Risk Factors Branch staff assessed the validity of aggregate variables created from the 2009 CHIS Dietary Screener.

  6. 40 CFR Table 3 of Subpart Aaaa to... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission Monitoring Systems (CEMS) 3 Table 3 of Subpart AAAA to Part 60 Protection of Environment... SOURCES Pt. 60, Subpt. AAAA, Table 3 Table 3 of Subpart AAAA to Part 60—Requirements for Validating... following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1. Nitrogen Oxides (Class...

  7. Does an Eye-Hand Coordination Test Have Added Value as Part of Talent Identification in Table Tennis? A Validity and Reproducibility Study

    PubMed Central

    Faber, Irene R.; Oosterveld, Frits G. J.; Nijhuis-Van der Sanden, Maria W. G.

    2014-01-01

    This study investigated the added value, i.e. discriminative and concurrent validity and reproducibility, of an eye-hand coordination test relevant to table tennis as part of talent identification. Forty-three table tennis players (7–12 years) from national (n = 13), regional (n = 11) and local training centres (n = 19) participated. During the eye-hand coordination test, children needed to throw a ball against a vertically positioned table tennis table with one hand and to catch the ball correctly with the other hand as frequently as possible in 30 seconds. Four different test versions were assessed, varying the distance to the table (1 or 2 meters) and using a tennis or table tennis ball. ‘Within session’ reproducibility was estimated from the two attempts of the initial tests, and ten youngsters were retested after 4 weeks to estimate ‘between sessions’ reproducibility. Validity analyses using age as a covariate showed that players from the national and regional centres scored significantly higher than players from the local centre in all test versions (p < 0.05). The tests at 1 meter demonstrated better discriminative ability than those at 2 meters. While all tests but one had a significant positive association with competition outcome after correcting for age influences, the version with a table tennis ball at 1 meter showed the highest association (r = 0.54; p = 0.001). Differences between the first and second attempts were comparable for all test versions (between −8 and +7 repetitions), with ICCs ranging from 0.72 to 0.87. The smallest differences were found for the test with a table tennis ball at 1 meter (between −3 and +3 repetitions). Regarding the psychometric characteristics evaluated, the best test version for talent identification appears to be the one with a table tennis ball at 1 meter. Longitudinal studies are necessary to evaluate the predictive value of this test. PMID:24465638

  8. Development and Preliminary Validation of the Strategic Thinking Mindset Test (STMT)

    DTIC Science & Technology

    2017-06-01

    reliability. The test’s three subscales (intellectual flexibility, inclusiveness, and humility) each correlated significantly with alternative measures of ... Table 9. Stage 4 sample demographics. Table 10. Interitem correlation matrix (all items). Table 11. Item-scale and validity correlations (all items).

  9. Ecological periodic tables for benthic macrofaunal usage of estuarine habitats : Insights from a case study in Tillamook bay, Oregon, USA

    EPA Science Inventory

    This study validates the ecological relevance of estuarine habitat types to the benthic macrofaunal community and, together with previous similar studies, suggests they can serve as elements in ecological periodic tables of benthic macrofaunal usage in the bioregion. We compared...

  10. Five-Factor Screener in the 2005 National Health Interview Survey Cancer Control Supplement: Validation Results

    Cancer.gov

    Risk Factor Assessment Branch staff have indirectly assessed the validity of parts of the Five-Factor Screener in two studies: NCI's Observing Protein and Energy (OPEN) Study and the Eating at America's Table Study (EATS). In both studies, multiple 24-hour recalls in conjunction with a measurement error model were used to assess validity.

  11. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Validation Results

    Cancer.gov

    Risk Factor Assessment Branch (RFAB) staff have assessed the validity of the Multifactor Screener in several studies: NCI's Observing Protein and Energy (OPEN) Study, the Eating at America's Table Study (EATS), and the joint NIH-AARP Diet and Health Study.

  12. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    PubMed

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins depend entirely on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the underestimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, bootstrap confidence interval methods, and bootstrap hypothesis-testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
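    The framework above pairs its sampling distribution with bootstrap confidence intervals. A generic percentile-bootstrap sketch in Python; the data and the resampled statistic (a sample mean) are purely illustrative and much simpler than the incomplete-table setting of the paper:

```python
import random
from statistics import mean

def bootstrap_ci(data, stat=mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap: resample with replacement, recompute the
    statistic, and take the alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lower = reps[int(n_boot * alpha / 2)]
    upper = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lower, upper

lower, upper = bootstrap_ci([4, 5, 6, 5, 7, 4, 6, 5, 5, 6])
print(lower, upper)  # the bounds bracket the sample mean of 5.3
```

    The point of the paper is that resampling must follow the correct joint sampling distribution; a percentile interval computed under a wrong independence assumption inherits that error.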

  13. Hazing DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    Hazing DEOCS 4.1 Construct Validity Summary. DEFENSE EQUAL OPPORTUNITY MANAGEMENT INSTITUTE, DIRECTORATE OF ... the analysis. Tables 4-6 provide additional information regarding the descriptive statistics and reliability of the Hazing items. Table 7 provides ...

  14. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use the following methods in appendix A of this part to validate pollutant concentration ...

  15. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use the following methods in appendix A of this part to validate pollutant concentration ...

  16. 40 CFR Table 6 to Subpart Jjj of... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 8 2011-07-01 2011-07-01 false Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart JJJ of Part 62 Protection of Environment... of Part 62—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) ER31JA03.012 ...

  17. 40 CFR Table 6 to Subpart Jjj of... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart JJJ of Part 62 Protection of Environment... of Part 62—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) ER31JA03.012 ...

  18. Some applications of categorical data analysis to epidemiological studies.

    PubMed Central

    Grizzle, J E; Koch, G G

    1979-01-01

    Several examples of categorized data from epidemiological studies are analyzed to illustrate that more informative analyses than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework and can be performed by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-square tests. The examples presented are analysis of relative risks estimated from several 2 x 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
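    The first example mentioned, relative risk from a 2 x 2 table, reduces to a ratio of two rates. A minimal sketch with invented cell counts:

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk: the outcome rate in the exposed group divided
    by the outcome rate in the unexposed group."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# 20/100 exposed cases vs. 10/100 unexposed cases: RR = 2.0
print(relative_risk(20, 100, 10, 100))
```

    The model-fitting approach of the paper goes further, pooling such estimates across several tables with weighted least squares rather than reporting each ratio in isolation.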

  19. Validating and comparing GNSS antenna calibrations

    NASA Astrophysics Data System (ADS)

    Kallio, Ulla; Koivula, Hannu; Lahtinen, Sonja; Nikkonen, Ville; Poutanen, Markku

    2018-03-01

    GNSS antennas have no fixed electrical reference point. The variation of the phase centre is modelled and tabulated in antenna calibration tables, which include the offset vector (PCO) and phase centre variation (PCV) for each frequency according to the elevations and azimuths of the incoming signal. Used together, PCV and PCO reduce the phase observations to the antenna reference point. The remaining biases, called the residual offsets, can be revealed by circulating and rotating the antennas on pillars. The residual offsets are estimated as additional parameters when combining the daily GNSS network solutions with full covariance matrix. We present a procedure for validating the antenna calibration tables. The dedicated test field, called Revolver, was constructed at Metsähovi. We used the procedure to validate the calibration tables of 17 antennas. Tables from the IGS and three different calibration institutions were used. The tests show that we were able to separate the residual offsets at the millimetre level. We also investigated the influence of the calibration tables from the different institutions on site coordinates by performing kinematic double-difference baseline processing of the data from one site with different antenna tables. We found small but significant differences between the tables.
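    The PCO portion of such a correction is the projection of the offset vector onto the receiver-to-satellite line of sight. A minimal sketch in Python; the east-north-up convention and the illustrative offset are assumptions here, and a full correction would also interpolate PCV values over elevation and azimuth:

```python
import math

def pco_range_correction(pco_enu, elevation_deg, azimuth_deg):
    """Project a phase-centre offset vector (east, north, up, in metres)
    onto the line-of-sight unit vector toward a satellite."""
    el, az = math.radians(elevation_deg), math.radians(azimuth_deg)
    line_of_sight = (math.cos(el) * math.sin(az),  # east
                     math.cos(el) * math.cos(az),  # north
                     math.sin(el))                 # up
    return sum(o * u for o, u in zip(pco_enu, line_of_sight))

# A purely vertical 10 cm offset seen at zenith contributes the full 0.10 m
print(pco_range_correction((0.0, 0.0, 0.10), 90.0, 0.0))
```

    Sign conventions for the correction differ between processing packages, which is one reason residual offsets like those estimated in this study are worth checking at the millimetre level.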

  20. The Development and Validation of the Social Privilege Measure

    ERIC Educational Resources Information Center

    Black, Linda L.; Stone, David A.; Hutchinson, Susan R.; Suarez, Elisabeth C.

    2007-01-01

    Privilege and oppression have an impact on society in numerous ways. Although studied in many disciplines, few empirical measures of these social constructs exist for educators or researchers. The 2 studies presented in this article describe the development and validation of the scores yielded by the Social Privilege Measure. (Contains 4 tables.)

  1. Pretreatment tables predicting pathologic stage of locally advanced prostate cancer.

    PubMed

    Joniau, Steven; Spahn, Martin; Briganti, Alberto; Gandaglia, Giorgio; Tombal, Bertrand; Tosco, Lorenzo; Marchioro, Giansilvio; Hsu, Chao-Yu; Walz, Jochen; Kneitz, Burkhard; Bader, Pia; Frohneberg, Detlef; Tizzani, Alessandro; Graefen, Markus; van Cangh, Paul; Karnes, R Jeffrey; Montorsi, Francesco; van Poppel, Hein; Gontero, Paolo

    2015-02-01

    Pretreatment tables for the prediction of pathologic stage have been published and validated for localized prostate cancer (PCa). No such tables are available for locally advanced (cT3a) PCa. To construct tables predicting pathologic outcome after radical prostatectomy (RP) for patients with cT3a PCa with the aim to help guide treatment decisions in clinical practice. This was a multicenter retrospective cohort study including 759 consecutive patients with cT3a PCa treated with RP between 1987 and 2010. Retropubic RP and pelvic lymphadenectomy. Patients were divided into pretreatment prostate-specific antigen (PSA) and biopsy Gleason score (GS) subgroups. These parameters were used to construct tables predicting pathologic outcome and the presence of positive lymph nodes (LNs) after RP for cT3a PCa using ordinal logistic regression. In the model predicting pathologic outcome, the main effects of biopsy GS and pretreatment PSA were significant. A higher GS and/or higher PSA level was associated with a more unfavorable pathologic outcome. The validation procedure, using a repeated split-sample method, showed good predictive ability. Regression analysis also showed an increasing probability of positive LNs with increasing PSA levels and/or higher GS. Limitations of the study are the retrospective design and the long study period. These novel tables predict pathologic stage after RP for patients with cT3a PCa based on pretreatment PSA level and biopsy GS. They can be used to guide decision making in men with locally advanced PCa. Our study might provide physicians with a useful tool to predict pathologic stage in locally advanced prostate cancer that might help select patients who may need multimodal treatment. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  2. Evaluations of the psychometric properties of the Recovery-Stress Questionnaire for Athletes among a sample of young French table tennis players.

    PubMed

    Martinent, Guillaume; Decret, Jean-Claude; Isoard-Gautheur, Sandrine; Filaire, Edith; Ferrand, Claude

    2014-04-01

    This study used confirmatory factor analyses (CFAs) among a sample of young French table tennis players to test: (a) the original 19-factor structure, (b) the 14-factor structure recently suggested in the literature, and (c) the hierarchical factor structure of the Recovery-Stress Questionnaire for Athletes (RESTQ-Sport). 148 table tennis players completed the RESTQ-Sport and other self-report questionnaires on one to five occasions, with a delay of 1 month between each completion. Results of the CFAs showed: (a) evidence for the relative superiority of the original model in comparison to an alternative model recently proposed in the literature, (b) a good fit of the data for the 67-item 17-factor model of the RESTQ-Sport, and (c) an acceptable fit of the data for the hierarchical model of the RESTQ-Sport. Correlations between RESTQ-Sport subscales and burnout and motivation subscales also provided evidence for the criterion-related validity of the RESTQ-Sport. This study provided support for the reliability and validity of the RESTQ-Sport.

  3. Group Cohesion DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    Group Cohesion DEOCS 4.1 Construct Validity Summary. DEFENSE EQUAL OPPORTUNITY MANAGEMENT INSTITUTE, DIRECTORATE ... See Table 4 for more information regarding item reliabilities. The relationship between the original four-point scale (Organizational Cohesion) and ... future analyses, including those using the seven-point scale. Tables 4 and 5 provide additional information regarding the reliability and descriptive ...

  4. 40 CFR Table 3 of Subpart Aaaa to... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 3 Table 3 of Subpart AAAA to Part 60 Protection of Environment... Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use...

  5. 40 CFR Table 3 of Subpart Aaaa to... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 3 Table 3 of Subpart AAAA to Part 60 Protection of Environment... Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use...

  6. Connectedness DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    Connectedness DEOCS 4.1 Construct Validity Summary. DEFENSE EQUAL OPPORTUNITY MANAGEMENT INSTITUTE, DIRECTORATE OF ... appropriate statistical method to analyze these data. This EFA yielded a single-factor solution. Refer to Table 6 for more information. Table 6 ... top management team perceptions of CEO charisma. Academy of Management Journal, 49(1), 161-174. Assessment to Solutions. (2016). Retrieved from https

  7. 40 CFR Table 3 of Subpart Aaaa of... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Emission Monitoring Systems (CEMS) 3 Table 3 of Subpart AAAA of Part 60 Protection of Environment... Definitions What definitions must I know? Pt. 60, Subpt. AAAA, Table 3 Table 3 of Subpart AAAA of Part 60... levels Use the following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1...

  8. TSP Symposium 2012 Proceedings

    DTIC Science & Technology

    2012-11-01

    ... and Statistical Model; 7.3 Analysis and Results; 7.4 Threats to Validity and Limitations; 7.5 Conclusions; 7.6 Acknowledgments; 7.7 ... Table 12. Overall Statistics of the Experiment. Table 13. Results of Pairwise ANOVA Analysis, Highlighting Statistically Significant Differences ... we calculated the percentage of defects injected. The distribution statistics are shown in Table 2. Table 2. Mean, Lower, Upper Confidence Interval

  9. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment... or Before August 30, 1999 Pt. 60, Subpt. BBBB, Table 6 Table 6 to Subpart BBBB of Part 60—Model Rule... levels Use the following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1...

  10. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment... or Before August 30, 1999 Pt. 60, Subpt. BBBB, Table 6 Table 6 to Subpart BBBB of Part 60—Model Rule... levels Use the following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1...

  11. Construct Validation of the Louisiana School Analysis Model (SAM) Instructional Staff Questionnaire

    ERIC Educational Resources Information Center

    Bray-Clark, Nikki; Bates, Reid

    2005-01-01

    The purpose of this study was to validate the Louisiana SAM Instructional Staff Questionnaire (SISQ), a key component of the Louisiana School Analysis Model. The model was designed as a comprehensive evaluation tool for schools. Principal axis factoring with oblique rotation was used to uncover the underlying structure of the SISQ. (Contains 1 table.)

  12. Permanent-File-Validation Utility Computer Program

    NASA Technical Reports Server (NTRS)

    Derry, Stephen D.

    1988-01-01

    Errors in files detected and corrected during operation. Permanent File Validation (PFVAL) utility computer program provides CDC CYBER NOS sites with mechanism to verify integrity of permanent file base. Locates and identifies permanent file errors in Mass Storage Table (MST) and Track Reservation Table (TRT), in permanent file catalog entries (PFC's) in permit sectors, and in disk sector linkage. All detected errors written to listing file and system and job day files. Program operates by reading system tables, catalog track, permit sectors, and disk linkage bytes to validate expected and actual file linkages. Used extensively to identify and locate errors in permanent files and enable online correction, reducing computer-system downtime.

  13. Measuring Microaggression and Organizational Climate Factors in Military Units

    DTIC Science & Technology

    2011-04-01

    i.e., items) to accurately assess what we intend for them to measure. To assess construct and convergent validity, the author assessed the statistical ... sample indicated both convergent and construct validity of the microaggression scale. Table 5 presents these statistics. ... models. As shown in Table 7, the measurement models had acceptable fit indices. That is, the chi-square statistics were at their minimum; although the ...

  14. What is the Eating at America's Table Study (EATS)?

    Cancer.gov

    EATS is a study that was designed to validate the Diet History Questionnaire, a new and improved food frequency questionnaire developed by NCI staff. The study was novel in that it examined not only the DHQ, but also two other widely used FFQs.

  15. Validation of a modified table to map the 1998 Abbreviated Injury Scale to the 2008 scale and the use of adjusted severities.

    PubMed

    Tohira, Hideo; Jacobs, Ian; Mountain, David; Gibson, Nick; Yeo, Allen; Ueno, Masato; Watanabe, Hiroaki

    2011-12-01

    The Abbreviated Injury Scale 2008 (AIS 2008) is the most recent injury coding system. A mapping table from a previous AIS 98 to AIS 2008 is available. However, AIS 98 codes that are unmappable to AIS 2008 codes exist in this table. Furthermore, some AIS 98 codes can be mapped to multiple candidate AIS 2008 codes with different severities. We aimed to modify the original table to adjust the severities and to validate these changes. We modified the original table by adding links from unmappable AIS 98 codes to AIS 2008 codes. We applied the original table and our modified table to AIS 98 codes for major trauma patients. We also assigned candidate codes with different severities the weighted averages of their severities as an adjusted severity. The proportion of cases whose injury severity scores (ISSs) were computable were compared. We also compared the agreement of the ISS and New ISS (NISS) between manually determined AIS 2008 codes (MAN) and mapped codes by using our table (MAP) with unadjusted or adjusted severities. All and 72.3% of cases had their ISSs computed by our modified table and the original table, respectively. The agreement between MAN and MAP with respect to the ISS and NISS was substantial (intraclass correlation coefficient = 0.939 for ISS and 0.943 for NISS). Using adjusted severities, the agreements of the ISS and NISS improved to 0.953 (p = 0.11) and 0.963 (p = 0.007), respectively. Our modified mapping table seems to allow more ISSs to be computed than the original table. Severity scores exhibited substantial agreement between MAN and MAP. The use of adjusted severities improved these agreements further.
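    The ISS used throughout this abstract is computed from AIS severities as the sum of squares of the three highest severities taken from different body regions. A minimal sketch; the region names and severity values are invented, and the special rule that any AIS 6 injury sets the ISS to 75 is omitted:

```python
def injury_severity_score(region_severities):
    """ISS: square and sum the three highest AIS severities, counting
    at most one (the worst) severity per body region."""
    worst_per_region = [max(s) for s in region_severities.values() if s]
    top_three = sorted(worst_per_region, reverse=True)[:3]
    return sum(s * s for s in top_three)

# Worst severities 5, 4 and 3 across regions: 25 + 16 + 9 = 50
print(injury_severity_score({"head": [3, 5], "chest": [4], "limbs": [2, 3]}))
```

    This dependence on individual severities is why the study's adjusted severities (weighted averages over candidate AIS 2008 codes) change the computed ISS and NISS.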

  16. Validating a topographically driven model of peatland water table: Implications for understanding land cover controls on water table.

    NASA Astrophysics Data System (ADS)

    Evans, Martin; Allott, Tim; Worrall, Fred; Rowson, James; Maskill, Rachael

    2014-05-01

    Water table is arguably the dominant control on biogeochemical cycling in peatland systems. Local water tables are controlled by peat surface water balance, and lateral transfer of water driven by slope can be a significant component of this balance. In particular, blanket peatlands typically have relatively high surface slope compared to other peatland types, so that there is the potential for water table to be significantly controlled by topographic context. UK blanket peatlands are also significantly eroded, so that there is the potential for additional topographic drainage of the peatland surface. This paper presents a topographically driven model of blanket peat water table. An initial model presented in Allott et al. (2009) has been refined and tested against further water table data collected across the Bleaklow and Kinderscout plateaux of the English Peak District. The water table model quantifies the impact of peat erosion on water table throughout this dramatically dissected landscape, demonstrating that almost 50% of the landscape has suffered significant water table drawdown. The model calibrates the impact of slope and degree of dissection on local water tables but does not incorporate any effects of surface cover on water table conditions. Consequently, significant outliers in the test data are potentially indicative of important impacts of surface cover on water table conditions. In the test data presented here, sites associated with regular moorland burning are significant outliers. The data currently available do not allow us to draw conclusions about the impact of land cover, but they indicate an important potential application of the validated model in controlling for topographic position in further testing of the impact of land cover on peatland water tables. Allott, T.E.H., Evans, M.G., Lindsay, J.B., Agnew, C.T., Freer, J.E., Jones, A. & Parnell, M. Water tables in Peak District blanket peatlands. Moors for the Future Report No. 17. Moors for the Future Partnership, Edale, 47pp.

  17. Discoloration of polyvinyl chloride (PVC) tape as a proxy for water-table depth in peatlands: validation and assessment of seasonal variability

    USGS Publications Warehouse

    Booth, Robert K.; Hotchkiss, Sara C.; Wilcox, Douglas A.

    2005-01-01

    Summary: 1. Discoloration of polyvinyl chloride (PVC) tape has been used in peatland ecological and hydrological studies as an inexpensive way to monitor changes in water-table depth and reducing conditions. 2. We investigated the relationship between depth of PVC tape discoloration and measured water-table depth at monthly time steps during the growing season within nine kettle peatlands of northern Wisconsin. Our specific objectives were to: (1) determine if PVC discoloration is an accurate method of inferring water-table depth in Sphagnum-dominated kettle peatlands of the region; (2) assess seasonal variability in the accuracy of the method; and (3) determine if systematic differences in accuracy occurred among microhabitats, PVC tape colour and peatlands. 3. Our results indicated that PVC tape discoloration can be used to describe gradients of water-table depth in kettle peatlands. However, accuracy differed among the peatlands studied, and was systematically biased in early spring and late summer/autumn. Regardless of the month when the tape was installed, the highest elevations of PVC tape discoloration showed the strongest correlation with midsummer (around July) water-table depth and average water-table depth during the growing season. 4. The PVC tape discoloration method should therefore be used cautiously when precise estimates of seasonal changes in the water table are needed.

  18. Validating Future Force Performance Measures (Army Class): Concluding Analyses

    DTIC Science & Technology

    2016-06-01

    Table 3.10: Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores. Table 4.7: Descriptive Statistics for Analysis Criteria. Soldier attrition and performance criteria: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness.

  19. Sensitivity of stream flow and water table depth to potential climatic variability in a coastal forested watershed

    Treesearch

    Zhaohua Dai; Carl Trettin; Changsheng Li; Devendra M. Amatya; Ge Sun; Harbin Li

    2010-01-01

    A physically based distributed hydrological model, MIKE SHE, was used to evaluate the effects of altered temperature and precipitation regimes on the streamflow and water table in a forested watershed on the southeastern Atlantic coastal plain. The model calibration and validation against both streamflow and water table depth showed that the MIKE SHE was applicable for...

  20. High Ripples Reduction in DTC of Induction Motor by Using a New Reduced Switching Table

    NASA Astrophysics Data System (ADS)

    Mokhtari, Bachir; Benkhoris, Mohamed F.

    2016-05-01

    Direct torque and flux control (DTC) of electrical motors is characterized by ripples in torque and flux. Among the many solutions proposed to reduce them, modified switching tables are very advantageous because they are easy to implement and require no additional cost compared with other solutions. This paper proposes a new reduced switching table (RST) to improve the DTC by reducing harmful torque and flux ripples. The new switching table is smaller than the conventional one (CST) and depends principally on the flux error. The solution is studied by simulation under Matlab/Simulink and validated experimentally on a testbed with a DSPACE1103 board. Results for a DTC with RST applied to a three-phase induction motor (IM) show a clear improvement and demonstrate the effectiveness of the proposed solution: torque ripple decreases by about 47%, and stator flux ripple by about 3%, compared with basic DTC.

  1. Next-Generation NATO Reference Mobility Model (NRMM) Development (Developpement de la nouvella generation du modele de mobilite de reference de l’OTAN (NRMM))

    DTIC Science & Technology

    2018-01-01

    Profile Database. Attachment 2: NRMM Data Input Requirements. Attachment 3: General Physics-Based Model Data Input Requirements. Figure E-11: Examples of Unique Surface Types. Figure E-12: Correlating Physical Testing with Simulation. Figure E-13: Simplified Tire… Table 10-8: Scoring Values. Table 10-9: Accuracy – Physics-Based. Table 10-10: Accuracy – Validation Through Measurement. Table 10-11

  2. A generic method for improving the spatial interoperability of medical and ecological databases.

    PubMed

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

    The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change of support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods, while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical and ecological databases in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than for most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses. 
We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to tackle the change of support problem.
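As a toy sketch of the two-step method above: the transition matrices reduce here to many-to-one unit mappings, and the mapping table is validated by comparing a covariate shared by both databases in each final spatial unit. All unit names and counts below are invented for illustration and are not the paper's actual data.

```python
# Step 1: transition matrices, simplified here to many-to-one mappings
# from each database's spatial units onto the final spatial units.
medical_to_final = {"M1": "F1", "M2": "F1", "M3": "F2"}
ecological_to_final = {"E1": "F1", "E2": "F2", "E3": "F2"}

# invented values of a covariate present in both databases (births)
births_medical = {"M1": 120, "M2": 80, "M3": 150}
births_ecological = {"E1": 205, "E2": 90, "E3": 55}

def aggregate(values, mapping):
    """Sum a covariate over the original units mapping to each final unit."""
    out = {}
    for unit, v in values.items():
        out[mapping[unit]] = out.get(mapping[unit], 0) + v
    return out

med = aggregate(births_medical, medical_to_final)
eco = aggregate(births_ecological, ecological_to_final)

# Step 2: validate the mapping table by the relative difference of the
# shared covariate in each final spatial unit, as in the paper.
for unit in sorted(med):
    rel_diff = abs(med[unit] - eco[unit]) / med[unit]
    print(unit, f"{rel_diff:.1%}")
```

A real application would add the spatial continuity criterion and resolution index checks on top of this covariate comparison.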

  3. How well do testate amoebae transfer functions relate to high-resolution water-table records?

    NASA Astrophysics Data System (ADS)

    Holden, Joseph; Swindles, Graeme; Raby, Cassandra; Blundell, Antony

    2014-05-01

    Testate amoebae (TA) community composition records from peat cores are often used to infer past water-table conditions at peatland sites. However, one problem is that validation of the water-table depths used in such work typically comes from a one-off measurement, or a few measurements, of water-table depth at the testate amoebae sample extraction point. Furthermore, the transfer function reconstruction produces a single water-table depth value, with sample-specific errors generated through a statistical resampling approach. Yet water tables in peatlands fluctuate: traditional TA water-table data may not adequately capture a mean value for a site, and may not account for the water-table dynamics (e.g. seasonal or annual variability) that could influence TA community composition. We analysed automatically logged (at least hourly, mainly 15-min) peatland water-table data from 72 dipwells located across northern Sweden, Wales and the Pennine region of England. None of the locations had been subject to recent management intervention. A suite of characteristics of water-table dynamics was determined for each point. At each point, surface samples were extracted and the TA community composition was determined. Our results show that estimated water-table depth based on TA community transfer functions poorly represents the real mean or median water tables of the study sites. The TA approach does, however, generally identify sites whose water tables are closer to the surface for a greater proportion of the year, as opposed to sites with deeper water tables for large proportions of the year. But it does not differentiate between sites with similar mean (or median) water-table depths yet quite different water-table variability (e.g. interquartile range). We suggest some ways of improving water-table metrics for use in Holocene peatland hydrology reconstructions.
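The kind of water-table dynamics metrics discussed above can be sketched for a single dipwell's logged series (depths in cm below the surface; the values here are invented, not the study's data):

```python
# Summary metrics for one dipwell's automatically logged water-table
# record; a one-off measurement would capture none of this variability.
from statistics import mean, median, quantiles

depths = [5, 7, 6, 12, 20, 18, 9, 4, 3, 15, 22, 8]  # hypothetical logs, cm

q1, _, q3 = quantiles(depths, n=4)
metrics = {
    "mean": mean(depths),
    "median": median(depths),
    "iqr": q3 - q1,  # variability that a single mean/median value hides
    "frac_shallower_than_10cm": sum(d <= 10 for d in depths) / len(depths),
}
print(metrics)
```

Two sites could share the same mean and median here while differing sharply in the interquartile range, which is exactly the distinction the traditional TA approach misses.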

  4. Clinical Usefulness of the Pendulum Test Using a NK Table to Measure the Spasticity of Patients with Brain Lesions

    PubMed Central

    Kim, Yong-Wook

    2013-01-01

    [Purpose] The purpose of the present study was to investigate the clinical usefulness (reliability and validity) of the pendulum test using a Noland-Kuckhoff (NK) table with an attached electrogoniometer to measure the spasticity of patients with brain lesions. [Subjects] The subjects were 31 patients with stroke or traumatic brain injury. [Methods] The intraclass correlation coefficient (ICC) was used to verify the test–retest reliability of spasticity measures obtained using the pendulum test. Pearson's product correlation coefficient was used to examine the validity of the pendulum test using the amplitude of the patellar tendon reflex (PTR) test, an objective and quantitative measure of spasticity. [Results] The test–retest reliability was high, reflecting a significant correlation between the test and the retest (ICCs = 0.95–0.97). A significant negative correlation was found between the amplitude of the PTR test and the four variables measured in the pendulum test (r = −0.77 to −0.85). [Conclusion] The pendulum test using a NK table is an objective measure of spasticity and can be used in the clinical setting in place of more expensive and complicated equipment. Further studies are needed to investigate the therapeutic effect of this method on spasticity. PMID:24259775

  5. Clinical usefulness of the pendulum test using a NK table to measure the spasticity of patients with brain lesions.

    PubMed

    Kim, Yong-Wook

    2013-10-01

    [Purpose] The purpose of the present study was to investigate the clinical usefulness (reliability and validity) of the pendulum test using a Noland-Kuckhoff (NK) table with an attached electrogoniometer to measure the spasticity of patients with brain lesions. [Subjects] The subjects were 31 patients with stroke or traumatic brain injury. [Methods] The intraclass correlation coefficient (ICC) was used to verify the test-retest reliability of spasticity measures obtained using the pendulum test. Pearson's product correlation coefficient was used to examine the validity of the pendulum test using the amplitude of the patellar tendon reflex (PTR) test, an objective and quantitative measure of spasticity. [Results] The test-retest reliability was high, reflecting a significant correlation between the test and the retest (ICCs = 0.95-0.97). A significant negative correlation was found between the amplitude of the PTR test and the four variables measured in the pendulum test (r = -0.77 to -0.85). [Conclusion] The pendulum test using a NK table is an objective measure of spasticity and can be used in the clinical setting in place of more expensive and complicated equipment. Further studies are needed to investigate the therapeutic effect of this method on spasticity.

  6. A Validation Study of the School Attitude Assessment Survey.

    ERIC Educational Resources Information Center

    McCoach, D. Betsy

    2002-01-01

    This article describes the development of the School Attitude Assessment Survey (SAAS), an instrument that measures self-concept, self-motivation and self-regulation, attitude toward school, and peer attitudes to predict the academic achievement of adolescents. (Contains 43 references and 5 tables.) (Author)

  7. Validation study of human figure drawing test in a Colombian school children population.

    PubMed

    Vélez van Meerbeke, Alberto; Sandoval-Garcia, Carolina; Ibáñez, Milciades; Talero-Gutiérrez, Claudia; Fiallo, Dolly; Halliday, Karen

    2011-05-01

    The aim of this article was to assess the validity of the emotional and developmental components of the Koppitz human figure drawing test. We evaluated 2,420 children's drawings available in a database from a previous cross-sectional study designed to determine the prevalence of neurological diseases in children aged 0 to 12 years in Bogotá schools. Drawings were scored using the criteria proposed by Koppitz and classified into 16 groups according to age, gender, and the presence or absence of learning or attention problems. The overall results were then compared with the normative study to assess whether the descriptive parameters of the two populations were significantly different. There were no significant differences associated with the presence or absence of learning and attention disorders, or with school attended, within the overall sample. An interrater reliability test was performed to ensure the homogeneity of scoring by the evaluator team. There were significant differences between this population and that of the original study. New scoring tables contextualized for our population, based on the frequency of appearance in this sample, are presented. We conclude that various ethnic, social, and cultural factors can influence the way children draw the human figure; it is thus important to establish local reference values to adequately distinguish between normality and abnormality. The new scoring tables proposed here should be followed up with a clinical study to corroborate their validity.

  8. LGM-30B, Stage II Dissected Motors Test Report,

    DTIC Science & Technology

    1980-07-01

    …Relaxation Test Data (Outer Propellant). Table 9: Stress Relaxation Test Data (Inner Propellant). Table 10: Cohesive Tear Energy Test Data (Outer…). Figures: Maximum Stress (Outer); Maximum Stress (Inner); Strain at Rupture (Inner); Modulus (Inner); Regression Plot, Low Rate Tensile; Maximum Stress (Outer… The inner and outer propellants are almost the same. H. TEAR ENERGY TEST: Data from this test period are contained in Tables 10 and 11. Sufficient valid data became

  9. Multi-institutional external validation of seminal vesicle invasion nomograms: head-to-head comparison of Gallina nomogram versus 2007 Partin tables.

    PubMed

    Zorn, Kevin C; Capitanio, Umberto; Jeldres, Claudio; Arjane, Philippe; Perrotte, Paul; Shariat, Shahrokh F; Lee, David I; Shalhav, Arieh L; Zagaja, Gregory P; Shikanov, Sergey A; Gofrit, Ofer N; Thong, Alan E; Albala, David M; Sun, Leon; Karakiewicz, Pierre I

    2009-04-01

    The Partin tables represent one of the most widely used prostate cancer staging tools for seminal vesicle invasion (SVI) prediction. Recently, Gallina et al. reported a novel staging tool for the prediction of SVI that further incorporated the use of the percentage of positive biopsy cores. We performed an external validation of the Gallina et al. nomogram and the 2007 Partin tables in a large, multi-institutional North American cohort of men treated with robotic-assisted radical prostatectomy. Clinical and pathologic data were prospectively gathered from 2,606 patients treated with robotic-assisted radical prostatectomy at one of four North American robotic referral centers between 2002 and 2007. Discrimination was quantified with the area under the receiver operating characteristics curve. The calibration compared the predicted and observed SVI rates throughout the entire range of predictions. At robotic-assisted radical prostatectomy, SVI was recorded in 4.2% of patients. The discriminant properties of the Gallina et al. nomogram resulted in 81% accuracy compared with 78% for the 2007 Partin tables. The Gallina et al. nomogram overestimated the true rate of SVI. Conversely, the Partin tables underestimated the true rate of SVI. The Gallina et al. nomogram offers greater accuracy (81%) than the 2007 Partin tables (78%). However, both tools are associated with calibration limitations that need to be acknowledged and considered before their implementation into clinical practice.

  10. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    ERIC Educational Resources Information Center

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity, and yet the methods used to calculate the original critical values were never reported. Methods for the original calculation of critical values are suggested, along with tables of exact binomial probabilities.
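A minimal sketch of the quantities involved: Lawshe's CVR for a panel, and an exact-binomial critical value of the kind Ayre and Scally tabulate. The panel size and alpha below are illustrative choices, not values taken from the article.

```python
# Lawshe's CVR and an exact one-tailed binomial critical value (p = 0.5,
# i.e. each panelist rating "essential" by chance half the time).
from math import comb

def cvr(n_essential, n_panel):
    """Lawshe's content validity ratio: (ne - N/2) / (N/2)."""
    return (n_essential - n_panel / 2) / (n_panel / 2)

def critical_ne(n_panel, alpha=0.05):
    """Smallest number of 'essential' ratings whose one-tailed exact
    binomial tail probability falls below alpha."""
    for ne in range(n_panel + 1):
        tail = sum(comb(n_panel, k) for k in range(ne, n_panel + 1)) / 2 ** n_panel
        if tail < alpha:
            return ne
    return None

n = 8                      # illustrative panel size
ne_crit = critical_ne(n)   # 7 of 8 panelists needed at alpha = 0.05
print(ne_crit, cvr(ne_crit, n))
```

For an eight-person panel this gives a critical CVR of 0.75, i.e. seven of eight panelists must rate the item essential.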

  11. Programmable stream prefetch with resource optimization

    DOEpatents

    Boyle, Peter; Christ, Norman; Gara, Alan; Mawhinney, Robert; Ohmacht, Martin; Sugavanam, Krishnan

    2013-01-08

    A stream prefetch engine performs data retrieval in a parallel computing system. The engine receives a load request from at least one processor. The engine evaluates whether the first memory address requested in the load request is present and valid in a table. If the first memory address is present and valid in the table, the engine checks whether valid data corresponding to that address exists in an array. If there is not yet valid data corresponding to the first memory address in the array, the engine increments the prefetching depth of the first stream to which the address belongs and fetches the cache line associated with the address from the at least one cache memory device. The engine then determines whether prefetching of additional data is needed for the first stream within its prefetching depth, and prefetches the additional data if needed.
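The decision flow described in the abstract can be sketched roughly as follows. This is an invented illustration of the logic, not the patented hardware design: the class name, the 64-byte line size, and the dictionary structures are all assumptions.

```python
# Invented sketch of the prefetch decision flow: table lookup, array
# validity check, depth increment, fetch, then in-depth prefetching.
class StreamPrefetcher:
    LINE = 64  # assumed cache-line size in bytes

    def __init__(self):
        self.table = {}  # address -> stream id (presence means "valid")
        self.array = {}  # address -> data considered valid/prefetched
        self.depth = {}  # stream id -> current prefetching depth

    def load(self, addr):
        events = []
        stream = self.table.get(addr)
        if stream is None:
            return events  # address not present/valid in the table
        if addr not in self.array:
            # no valid data yet: deepen the stream and fetch the line
            self.depth[stream] = self.depth.get(stream, 0) + 1
            self.array[addr] = f"line@{addr:#x}"
            events.append(("fetch", addr))
        # prefetch additional data for the stream within its depth
        for i in range(1, self.depth.get(stream, 0) + 1):
            nxt = addr + self.LINE * i
            if nxt not in self.array:
                self.array[nxt] = f"line@{nxt:#x}"
                events.append(("prefetch", nxt))
        return events

pf = StreamPrefetcher()
pf.table[0x1000] = "s0"
events = pf.load(0x1000)
print(events)
```

Each miss deepens the stream, so repeated sequential misses cause progressively more aggressive prefetching, which is the resource-optimization idea the title alludes to.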

  12. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, the company has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper describes the ISTI-developed methodology and how CASE tools are used in its support. Case studies are discussed.

  13. Cardiorespiratory fitness of a Brazilian regional sample distributed in different tables.

    PubMed

    Belli, Karlyse Claudino; Callegaro, Carine C; Calegaro, Carine; Richter, Cleusa Maria; Klafke, Jonatas Zeni; Stein, Ricardo; Viecili, Paulo Ricardo Nazario

    2012-09-01

    Most classification tables of cardiorespiratory fitness (CRF) used in clinical practice are international and have not been validated for the Brazilian population, which can result in important discrepancies when the classification is extrapolated to that population. Our aim was to assess the use of the major CRF tables in a Brazilian population sample from the Central High Plain of the state of Rio Grande do Sul (RS). This study assessed retrospective data from 2,930 individuals living in 36 cities of the Central High Plain of RS, considering the presence of risk factors for cardiovascular disease and estimated maximum oxygen consumption (VO2peak) values obtained through exercise testing with the Bruce protocol. To classify CRF, individuals were distributed by sex and age group into the Cooper, American Heart Association (AHA) and Universidade Federal de São Paulo (Unifesp) tables and classified according to their VO2peak. Women had lower VO2peak values than men (23.5 ± 8.5 vs. 31.7 ± 10.8 mL.kg-1.min-1, p < 0.001). Considering both sexes, VO2peak showed an inverse, moderate correlation with age (R = -0.48, p < 0.001). An important discrepancy in CRF classification levels was observed between the tables, ranging from 49% (Cooper vs. AHA) to 75% (Unifesp vs. AHA). Our findings indicate important discrepancies among the classification tables assessed. Future studies could assess whether international tables can be used for the Brazilian population and for populations of different regions of Brazil.

  14. A New Compression Method for FITS Tables

    NASA Technical Reports Server (NTRS)

    Pence, William; Seaman, Rob; White, Richard L.

    2010-01-01

    As the size and number of FITS binary tables generated by astronomical observatories increase, so does the need for a more efficient compression method to reduce the amount of disk space and network bandwidth required to archive and download the data tables. We have developed a new compression method for FITS binary tables that is modeled after the FITS tiled-image compression convention that has been in use for the past decade. Tests of this new method on a sample of FITS binary tables from a variety of current missions show that, on average, this new compression technique saves about 50% more disk space than simply compressing the whole FITS file with gzip. Other advantages of this method are that (1) the compressed FITS table is itself a valid FITS table, (2) the FITS headers remain uncompressed, allowing rapid read and write access to the keyword values, and (3) in the common case where the FITS file contains multiple tables, each table is compressed separately and may be accessed without having to uncompress the whole file.
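A toy illustration of why per-column compression helps (this shows only the general idea, not the actual FITS convention or its file format): transposing a binary table so that each homogeneous column is compressed on its own usually compresses better than gzipping the interleaved row-major bytes, because the compressor then models one data type at a time.

```python
# Toy demo: the same 19-byte records compressed row-major vs column-wise.
import struct
import zlib

rows = [(i, 2.5, "SRC%04d" % i) for i in range(1000)]

# row-major bytes, as in a binary table's main data block
row_major = b"".join(struct.pack(">id7s", i, x, s.encode()) for i, x, s in rows)

# column-major: each homogeneous column packed and compressed separately
cols = [
    b"".join(struct.pack(">i", i) for i, _, _ in rows),
    b"".join(struct.pack(">d", x) for _, x, _ in rows),
    b"".join(s.encode() for _, _, s in rows),
]

whole = len(zlib.compress(row_major))
by_col = sum(len(zlib.compress(c)) for c in cols)
print(whole, by_col)  # column-wise is typically the smaller total
```

The uncompressed size is identical either way; only the byte ordering, and hence the compressor's view of the data, changes.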

  15. Adding EUNIS and VAULT rocket data to the VSO with Modern Perl frameworks

    NASA Astrophysics Data System (ADS)

    Mansky, Edmund

    2017-08-01

    A new Perl code is described that uses the modern object-oriented Moose framework to add EUNIS and VAULT rocket data to the Virtual Solar Observatory website. The code permits easy fixing of FITS header fields in cases where required FITS fields are missing from the original data files. The code makes novel use of the Moose method modifiers "before" and "after" to build in dependencies, so that database creation of tables occurs before the loading of data, and validation of file-dependent tables occurs after the loading is completed. Also described is the computation and loading of the deferred FITS CHECKSUM field into the database following the loading and validation of the file-dependent tables. The loading of the EUNIS 2006 and 2007 flight data and of the VAULT 2.0 flight data is described in detail as illustrative examples.

  16. Multi-Institutional External Validation of Seminal Vesicle Invasion Nomograms: Head-to-Head Comparison of Gallina Nomogram Versus 2007 Partin Tables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zorn, Kevin C.; Capitanio, Umberto; Jeldres, Claudio

    2009-04-01

    Purpose: The Partin tables represent one of the most widely used prostate cancer staging tools for seminal vesicle invasion (SVI) prediction. Recently, Gallina et al. reported a novel staging tool for the prediction of SVI that further incorporated the use of the percentage of positive biopsy cores. We performed an external validation of the Gallina et al. nomogram and the 2007 Partin tables in a large, multi-institutional North American cohort of men treated with robotic-assisted radical prostatectomy. Methods and Materials: Clinical and pathologic data were prospectively gathered from 2,606 patients treated with robotic-assisted radical prostatectomy at one of four North American robotic referral centers between 2002 and 2007. Discrimination was quantified with the area under the receiver operating characteristics curve. The calibration compared the predicted and observed SVI rates throughout the entire range of predictions. Results: At robotic-assisted radical prostatectomy, SVI was recorded in 4.2% of patients. The discriminant properties of the Gallina et al. nomogram resulted in 81% accuracy compared with 78% for the 2007 Partin tables. The Gallina et al. nomogram overestimated the true rate of SVI. Conversely, the Partin tables underestimated the true rate of SVI. Conclusion: The Gallina et al. nomogram offers greater accuracy (81%) than the 2007 Partin tables (78%). However, both tools are associated with calibration limitations that need to be acknowledged and considered before their implementation into clinical practice.

  17. Cubic Foot Volume Tables for Slash Pine Plantations of the Middle Coastal Plain of Georgia and the Carolina Sandhills

    Treesearch

    C.E. McGee; F.A. Bennett

    1959-01-01

    Proper management of any timber species or type requires valid estimates of volume from time to time. Tables 1 and 2 were constructed to meet this need for the expanding area of slash pine plantations in the middle coastal plain of Georgia and the Carolina Sandhills.

  18. Tidal Data Collection Options Study.

    DTIC Science & Technology

    1985-11-01

    …area in which it was collected and, actually, only tide measurements, the number would be even higher. Table 1. Survey… sonobuoy and consumer entertainment markets (typically 10,000 or more units per buy)… small number of easily interconnected semiconductor integrated… 1985. Validation of the computer model used to provide detailed tide calculations. 7. Market Survey: Automatic Tide Monitoring… The design study

  19. The Substance Abuse Subtle Screening Inventory-3 and Stages of Change: A Screening Validity Study

    ERIC Educational Resources Information Center

    Laux, John M.; Piazza, Nick J.; Salyers, Kathleen; Roseman, Christopher P.

    2012-01-01

    The sensitivity of the Substance Abuse Subtle Screening Inventory-3 (SASSI-3) was examined among substance-dependent adults enrolled in a family drug court. The SASSI-3 had a high sensitivity rate with this population, even across varying levels of motivation to change. (Contains 2 tables.)

  20. Evaluating ATM Technology for Distance Education in Library and Information Science.

    ERIC Educational Resources Information Center

    Stanford, Serena W.

    1997-01-01

    Investigates the impact of asynchronous transfer mode (ATM) technology in an interactive environment providing distance education in library and information science at two San Jose State University (California) sites. The main purpose of the study was to develop a reliable and valid evaluation instrument. Contains 6 tables. (Author/AEF)

  1. Community-wide Validation of Geospace Model Ground Magnetic Field Perturbation Predictions to Support Model Transition to Operations

    NASA Technical Reports Server (NTRS)

    Pulkkinen, A.; Rastaetter, L.; Kuznetsova, M.; Singer, H.; Balch, C.; Weimer, D.; Toth, G.; Ridley, A.; Gombosi, T.; Wiltberger, M.; hide

    2013-01-01

    In this paper we continue the community-wide rigorous modern space weather model validation efforts carried out within the GEM, CEDAR and SHINE programs. In this particular effort, coordinated among the Community Coordinated Modeling Center (CCMC), the NOAA Space Weather Prediction Center (SWPC), modelers, and the science community, we focus on the models' capability to reproduce observed ground magnetic field fluctuations, which are closely related to the geomagnetically induced current phenomenon. One of the primary motivations of the work is to support NOAA SWPC in their selection of the next numerical model to be transitioned into operations. Six geomagnetic events and 12 geomagnetic observatories were selected for validation. While modeled and observed magnetic field time series are available for all 12 stations, the primary metrics analysis is based on six stations selected to represent high-latitude and mid-latitude locations. Event-based analysis and the corresponding contingency tables were built for each event and each station. The elements of the contingency table were then used to calculate the Probability of Detection (POD), Probability of False Detection (POFD) and Heidke Skill Score (HSS) for rigorous quantification of the models' performance. In this paper the summary results of the metrics analyses are reported in terms of POD, POFD and HSS. More detailed analyses can be carried out using the event-by-event contingency tables provided as an online appendix. An online interface built at the CCMC and described in the supporting information is also available for more detailed time series analyses.
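The metrics named above follow directly from a 2x2 contingency table of predicted versus observed events; the counts below are invented for illustration:

```python
# POD, POFD and HSS from a 2x2 contingency table (illustrative counts).
hits, false_alarms, misses, correct_negatives = 42, 8, 10, 140

pod = hits / (hits + misses)                      # fraction of events caught
pofd = false_alarms / (false_alarms + correct_negatives)

# Heidke Skill Score: skill relative to chance agreement
n = hits + false_alarms + misses + correct_negatives
expected = ((hits + misses) * (hits + false_alarms)
            + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
hss = (hits + correct_negatives - expected) / (n - expected)

print(f"POD={pod:.3f} POFD={pofd:.3f} HSS={hss:.3f}")
```

HSS compares the number of correct classifications with the number expected by chance given the marginal totals, so a perfect model scores 1 and a no-skill model scores 0.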

  2. A content validity study of signs, symptoms and diseases/health problems expressed in LIBRAS1

    PubMed Central

    Aragão, Jamilly da Silva; de França, Inacia Sátiro Xavier; Coura, Alexsandro Silva; de Sousa, Francisco Stélio; Batista, Joana D'arc Lyra; Magalhães, Isabella Medeiros de Oliveira

    2015-01-01

    Objectives: to validate the content of signs, symptoms and diseases/health problems expressed in LIBRAS for people with deafness. Method: methodological development study, which involved 36 people with deafness and three LIBRAS specialists. The study was conducted in three stages: investigation of the signs, symptoms and diseases/health problems, referred to by people with deafness, reported in a questionnaire; video recordings of how people with deafness express, through LIBRAS, the signs, symptoms and diseases/health problems; and validation of the contents of the recordings of the expressions by LIBRAS specialists. Data were processed in a spreadsheet and analyzed using univariate tables, with absolute frequencies and percentages. The validation results were analyzed using the Content Validity Index (CVI). Results: 33 expressions in LIBRAS, of signs, symptoms and diseases/health problems, were evaluated, and 28 expressions obtained a satisfactory CVI (1.00). Conclusions: the signs, symptoms and diseases/health problems expressed in LIBRAS presented validity, in the study region, for health professionals, especially nurses, for use in the clinical anamnesis of the nursing consultation for people with deafness. PMID:26625991

  3. A content validity study of signs, symptoms and diseases/health problems expressed in LIBRAS.

    PubMed

    Aragão, Jamilly da Silva; de França, Inacia Sátiro Xavier; Coura, Alexsandro Silva; de Sousa, Francisco Stélio; Batista, Joana D'arc Lyra; Magalhães, Isabella Medeiros de Oliveira

    2015-01-01

    To validate the content of signs, symptoms and diseases/health problems expressed in LIBRAS for people with deafness. Method: Methodological development study, which involved 36 people with deafness and three LIBRAS specialists. The study was conducted in three stages: investigation of the signs, symptoms and diseases/health problems, referred to by people with deafness, reported in a questionnaire; video recordings of how people with deafness express, through LIBRA, the signs, symptoms and diseases/health problems; and validation of the contents of the recordings of the expressions by LIBRAS specialists. Data were processed in a spreadsheet and analyzed using univariate tables, with absolute frequencies and percentages. The validation results were analyzed using the Content Validity Index (CVI). 33 expressions in LIBRAS, of signs, symptoms and diseases/health problems were evaluated, and 28 expressions obtained a satisfactory CVI (1.00). The signs, symptoms and diseases/health problems expressed in LIBRAS presented validity, in the study region, for health professionals, especially nurses, for use in the clinical anamnesis of the nursing consultation for people with deafness.
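The Content Validity Index used in the study reduces, per item, to the proportion of specialists judging the expression adequate. A minimal sketch follows; the ratings and the 1-4 relevance scale here are illustrative assumptions, not the study's data:

```python
# CVI per item = proportion of raters scoring it at or above a threshold.
def cvi(ratings, threshold=3):
    """Fraction of raters scoring an item >= threshold (1-4 scale assumed)."""
    return sum(r >= threshold for r in ratings) / len(ratings)

print(cvi([4, 4, 3]))  # all three specialists agree -> CVI = 1.0
print(cvi([4, 2, 3]))  # one dissent -> CVI falls below a 1.00 criterion
```

With a three-specialist panel and a CVI criterion of 1.00, as in the study, a single dissenting rating is enough to flag an expression for revision.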

  4. Bridging Ground Validation and Algorithms: Using Scattering and Integral Tables to Incorporate Observed DSD Correlations into Satellite Algorithms

    NASA Astrophysics Data System (ADS)

    Williams, C. R.

    2012-12-01

The NASA Global Precipitation Mission (GPM) raindrop size distribution (DSD) Working Group is composed of NASA PMM Science Team Members and is charged to "investigate the correlations between DSD parameters using Ground Validation (GV) data sets that support, or guide, the assumptions used in satellite retrieval algorithms." Correlations between DSD parameters can be used to constrain the unknowns and reduce the degrees of freedom in under-constrained satellite algorithms. Over the past two years, the GPM DSD Working Group has analyzed GV data and has found correlations between the mass-weighted mean raindrop diameter (Dm) and the mass distribution standard deviation (Sm) that follow a power-law relationship. This Dm-Sm power-law relationship appears to be robust and has been observed in surface disdrometer and vertically pointing radar observations. One benefit of a Dm-Sm power-law relationship is that a three-parameter DSD can be modeled with just two parameters: Dm and Nw, which sets the DSD amplitude. To incorporate observed DSD correlations into satellite algorithms, the GPM DSD Working Group is developing scattering and integral tables that can be used by satellite algorithms. Scattering tables describe the interaction of electromagnetic waves with individual particles to generate cross sections of backscattering, extinction, and scattering. Scattering tables are independent of the distribution of particles. Integral tables combine scattering table outputs with DSD parameters and DSD correlations to generate integrated normalized reflectivity, attenuation, scattering, emission, and asymmetry coefficients. Integral tables contain both frequency-dependent scattering properties and cloud microphysics. The GPM DSD Working Group has developed scattering tables for raindrops at both Dual-frequency Precipitation Radar (DPR) frequencies and at all GMI radiometer frequencies below 100 GHz. Scattering tables include Mie and T-matrix scattering with H- and V-polarization at the instrument view angles of nadir to 17 degrees (for DPR) and 48 and 53 degrees off nadir (for GMI). The GPM DSD Working Group is generating integral tables with GV-observed DSD correlations and is performing sensitivity and verification tests. One advantage of keeping scattering tables separate from integral tables is that research can progress on the electromagnetic scattering of particles independent of cloud microphysics research. Another advantage is that multiple scattering tables will be needed for frozen precipitation; scattering tables are being developed for individual frozen particles based on habit, density and operating frequency. A third advantage is that this framework provides an opportunity to communicate GV findings about DSD correlations into integral tables, and thus into satellite algorithms.
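The two-parameter reduction described above can be sketched in Python. The normalized-gamma DSD form and the relation Sm = Dm/sqrt(4 + mu) are standard; the power-law coefficients below are placeholders, since the abstract does not give the Working Group's fitted values:

```python
import math
import numpy as np

# Hypothetical power-law coefficients for Sm = a * Dm**b; the Working
# Group's actual fitted values are not given in this abstract.
A_SM, B_SM = 0.29, 1.1

def sigma_m(dm):
    """Mass-spectrum standard deviation (mm) from the assumed Dm-Sm power law."""
    return A_SM * dm ** B_SM

def normalized_gamma_dsd(d, dm, nw):
    """N(D) for a normalized gamma DSD reduced to two parameters (Dm, Nw).

    For a gamma DSD, Sm = Dm / sqrt(4 + mu), so the Dm-Sm constraint
    fixes the shape parameter: mu = (Dm / Sm)**2 - 4.
    """
    mu = (dm / sigma_m(dm)) ** 2 - 4.0
    f_mu = (6.0 / 4.0 ** 4) * (4.0 + mu) ** (mu + 4.0) / math.gamma(mu + 4.0)
    return nw * f_mu * (d / dm) ** mu * np.exp(-(4.0 + mu) * d / dm)
```

With the Dm-Sm constraint in place, a retrieval only has to solve for Dm and Nw, which is the degrees-of-freedom reduction the abstract describes.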

  5. Estimating drain flow from measured water table depth in layered soils under free and controlled drainage

    NASA Astrophysics Data System (ADS)

    Saadat, Samaneh; Bowling, Laura; Frankenberger, Jane; Kladivko, Eileen

    2018-01-01

Long records of continuous drain flow are important for quantifying annual and seasonal changes in subsurface drainage flow from drained agricultural land. Missing data due to equipment malfunction and other challenges have limited the conclusions that can be drawn about annual flow, and thus nutrient loads, from field studies, including assessments of the effect of controlled drainage. Water table depth data may be available during gaps in flow data, providing a basis for filling missing drain flow data; therefore, the overall goal of this study was to examine the potential to estimate drain flow from water table observations. The objectives were to evaluate how the shape of the relationship between drain flow and water table height above the drain varies depending on the soil hydraulic conductivity profile, to quantify how well the Hooghoudt equation represented the water table-drain flow relationship in five years of measured data at the Davis Purdue Agricultural Center (DPAC), and to determine the impact of controlled drainage on drain flow using the filled dataset. The shape of the drain flow-water table height relationship was found to depend on the selected hydraulic conductivity profile. Estimated drain flow using the Hooghoudt equation with measured water table height for both free draining and controlled periods compared well with observed flow, with Nash-Sutcliffe Efficiency values above 0.7 and 0.8 for calibration and validation periods, respectively. Using this method, together with linear regression for the remaining gaps, a long-term drain flow record for a controlled drainage experiment at the DPAC was used to evaluate the impacts of controlled drainage on drain flow. In the controlled drainage sites, annual flow was 14-49% lower than under free drainage.
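The Hooghoudt equation used above has a standard closed form relating steady-state drain flow to the midpoint water table height. A minimal sketch, with illustrative (not field-calibrated) parameter values:

```python
def hooghoudt_drain_flow(m, spacing, k_above, k_below, d_equiv):
    """Steady-state drain flow q (m/day) from the Hooghoudt equation:

        q = (8 * Kb * de * m + 4 * Ka * m**2) / L**2

    m        water table height above drain level at the midpoint (m)
    spacing  drain spacing L (m)
    k_above  hydraulic conductivity above drain depth, Ka (m/day)
    k_below  hydraulic conductivity below drain depth, Kb (m/day)
    d_equiv  Hooghoudt equivalent depth de (m)
    """
    return (8.0 * k_below * d_equiv * m + 4.0 * k_above * m ** 2) / spacing ** 2

# Illustrative parameters only; the DPAC calibration values are not
# given in the abstract.
q = hooghoudt_drain_flow(m=0.5, spacing=20.0, k_above=1.0, k_below=0.5, d_equiv=2.0)
```

Because q increases monotonically with m, a measured water table height maps to a unique flow estimate, which is what makes gap-filling from water table records feasible.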

  6. A generalized groundwater fluctuation model based on precipitation for estimating water table levels of deep unconfined aquifers

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Shik Han, Weon; Kim, Kue-Young; Suk, Heejun; Beom Jo, Si

    2018-07-01

A generalized water table fluctuation model based on precipitation was developed using a statistical conceptualization of unsaturated infiltration fluxes. A gamma distribution function was adopted as the transfer function due to its versatility in representing recharge rates with temporally dispersed infiltration fluxes, and a Laplace transformation was used to obtain an analytical solution. To demonstrate the general applicability of the model, previous water table fluctuation models were shown to emerge as special cases. For validation, a few hypothetical cases were developed, which confirmed the applicability of the model to a wide range of unsaturated zone conditions. For further validation, the model was applied to water table level estimation at three monitoring wells with considerably thick unsaturated zones on Jeju Island. The results show that the developed model represented the pattern of hydrographs from the two monitoring wells fairly well. The lag times from precipitation to recharge estimated from the developed system transfer function were found to agree with those from a conventional cross-correlation analysis. The developed model has the potential to be adopted for the hydraulic characterization of both saturated and unsaturated zones by being calibrated to actual data when extraneous and exogenous causes of water table fluctuation are limited. In addition, as it provides reference estimates, the model can be adopted as a tool for monitoring groundwater resources under hydraulically stressed conditions.

  7. Sexual Harassment Retaliation Climate DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    exploratory factor analysis, and bivariate correlations (sample 1) 2) To determine the factor structure of the remaining (final) questions via...statistics, reliability analysis, exploratory factor analysis, and bivariate correlations of the prospective Sexual Harassment Retaliation Climate...reported by the survey requester). For information regarding the composition of sample, refer to Table 1. Table 1. Sample 1 Demographics n

  8. Validating Future Force Performance Measures (Army Class): End of Training Longitudinal Validation

    DTIC Science & Technology

    2009-09-01

    Organization 66 Canal Center Plaza, Suite 700 Alexandria, Virginia 22314 8 . PERFORMING ORGANIZATION REPORT NUMBER 9. SPONSORING/MONITORING...4 Table B. 8 . Intercorrelations among RBI Scale Scores...12. Intercorrelations among WPA Dimension and Facet Scores ...................................... 8 x xi CONTENTS (continued) Page

  9. Estimating spatiotemporal variability and sustainability of shallow groundwater in a well-irrigated plain of the Haihe River basin using SWAT model

    NASA Astrophysics Data System (ADS)

    Zhang, Xueliang; Ren, Li; Kong, Xiangbin

    2016-10-01

Quantitatively estimating the spatiotemporal variability and sustainability of shallow groundwater with a distributed hydrological model could provide an important basis for proper groundwater management, especially in well-irrigated areas. In this study, the Soil and Water Assessment Tool (SWAT) model was modified and applied to a well-irrigated plain of the Haihe River basin. First, appropriate initial values of the parameters in the groundwater module were determined based on abundant hydrogeological investigations and assessment. Then, the model was satisfactorily calibrated and validated using shallow groundwater table data from 16 national wells monitored monthly from 1993 to 2010 and 148 wells investigated yearly from 2006 to 2012. To further demonstrate the model's rationality, the multi-objective validation was conducted by comparing the simulated groundwater balance components, actual evapotranspiration, and crop yields to multiple sources data. Finally, the established SWAT was used to estimate both shallow groundwater table fluctuation and shallow aquifer water storage change in time and space. Results showed that the average shallow groundwater table declined at a rate of 0.69-1.56 m a⁻¹, which depleted almost 350 × 10⁸ m³ of shallow aquifer water storage in the cropland during the period of 1993-2012. Because of the heterogeneity of the underlying surface and precipitation, these variations were spatiotemporally different. Generally, the shallow groundwater table declined 1.43-1.88 m during the winter wheat (Triticum aestivum L.) growing season, while it recovered 0.28-0.57 m during the summer maize (Zea mays L.) growing season except when precipitation was exceptionally scarce. According to the simulated depletion rate, the shallow aquifer in the study area may face a depletion crisis within the next 80 years. This study identified the regions where prohibitions or restrictions on shallow groundwater exploitation should be urgently carried out.
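As a back-of-envelope check of the reported storage loss, the implied average depletion rate works out as follows (treating 1993-2012 as 20 years is an assumption about how the period is counted):

```python
# Reported: roughly 350 x 10^8 m^3 of shallow-aquifer storage depleted
# over 1993-2012, taken here as a 20-year span.
depletion_m3 = 350e8
years = 20
annual_rate_m3 = depletion_m3 / years   # average depletion per year
```

That is about 1.75 × 10⁹ m³ per year, the rate from which the 80-year depletion horizon in the abstract would be projected.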

  10. Turning the table on advice programs for parents: Using placemats to enhance family interaction at restaurants

    PubMed Central

    Green, Richard B.; Hardison, William L.; Greene, Brandon F.

    1984-01-01

There are many opportunities in a family's daily routine to enrich the interactions among its members. One such opportunity arises at family restaurants. Surveys of restaurant personnel and customers suggested the possibility of enriching family interactions by redesigning indigenous materials such as table placemats. Accordingly, we developed Table-Talk placemats that provided conversational topics and illustrated games in which the entire family could participate. After some testing of these placemats in a preschool, a field experiment was conducted with families dining in restaurants. Table-Talk placemats occasioned more social and educational dialogue among family members than either traditional-placemat or no-material conditions. Social validation ratings provided by mental health counselors and the parents suggested that Table-Talk placemats occasioned healthy and enjoyable interactions among family members. PMID:16795680

  11. The Missing Middle in Validation Research

    ERIC Educational Resources Information Center

    Taylor, Erwin K.; Griess, Thomas

    1976-01-01

In most selection validation research, only the upper and lower tails of the criterion distribution are used, often yielding misleading or incorrect results. Provides formulas and tables that enable the researcher to account more accurately for the distribution of the criterion within the middle range of the population. (Author/RW)

  12. Development of an Instrument to Evaluate the Knowledge of Elementary Teachers about Venereal Disease.

    ERIC Educational Resources Information Center

    Schmidt, Norma G.

    The purpose of this study was to develop a valid, reliable test to measure the knowledge of elementary school teachers about venereal disease. Recommended scientific test construction procedures were carefully followed. These included the development of a content outline and a table of specification; submitting potential test items to a review…

  13. A Comparative Study of Soviet versus Western Helicopters. Part 1. General Comparison of Designs

    DTIC Science & Technology

    1983-03-01

    kid in hover and in forward flight may be considerably different. Consequently, the validity of using the hovering point in conjunction with the two...coefficients computed for two porn -weight values, using data from Fig. .. 17. ,:.:. assumed a being correct, and is shown in Table 5.5 with the corresponding

  14. Use of geospatial technology for delineating groundwater potential zones with an emphasis on water-table analysis in Dwarka River basin, Birbhum, India

    NASA Astrophysics Data System (ADS)

    Thapa, Raju; Gupta, Srimanta; Gupta, Arindam; Reddy, D. V.; Kaur, Harjeet

    2018-05-01

Dwarka River basin in Birbhum, West Bengal (India), is an agriculture-dominated area where groundwater plays a crucial role. The basin experiences seasonal water stress conditions with a scarcity of surface water. In the presented study, delineation of groundwater potential zones (GWPZs) is carried out using a geospatial multi-influencing factor technique. Geology, geomorphology, soil type, land use/land cover, rainfall, lineament and fault density, drainage density, slope, and elevation of the study area were considered for the delineation of GWPZs in the study area. About 9.3, 71.9 and 18.8% of the study area falls within good, moderate and poor groundwater potential zones, respectively. The potential groundwater yield data corroborate the outcome of the model, with maximum yield in the older floodplain and minimum yield in the hard-rock terrains in the western and south-western regions. Validation of the GWPZs using the yield of 148 wells shows very high accuracy of the model prediction, i.e., 89.1% on superimposition and 85.1 and 81.3% on success and prediction rates, respectively. Measurement of the seasonal water-table fluctuation with a multiplicative model of time series for predicting the short-term trend of the water table, followed by chi-square analysis between the predicted and observed water-table depth, indicates a trend of falling groundwater levels, with a 5% level of significance and a p-value of 0.233. The rainfall pattern for the last 3 years of the study shows a moderately positive correlation (R² = 0.308) with the average water-table depth in the study area.
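The multiplicative time-series model used for the short-term water-table trend can be sketched as a simple trend/seasonal split. This is a rough stand-in for the study's procedure, not a reconstruction of it; note that the edge bins of the moving-average trend are biased by zero padding:

```python
import numpy as np

def multiplicative_decompose(series, period=12):
    """Trend and seasonal factors under series ~= trend * seasonal.
    A sketch only: edge values of the moving-average trend are biased
    by np.convolve's zero padding in 'same' mode."""
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")     # centered moving mean
    detrended = np.asarray(series, dtype=float) / trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    return trend, seasonal / seasonal.mean()             # factors average to 1
```

The fitted trend component is what would then be extrapolated and compared against observed depths, e.g. with a chi-square statistic.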

  15. Sensory classification of table olives using an electronic tongue: Analysis of aqueous pastes and brines.

    PubMed

    Marx, Ítala; Rodrigues, Nuno; Dias, Luís G; Veloso, Ana C A; Pereira, José A; Drunkler, Deisy A; Peres, António M

    2017-01-01

Table olives are highly appreciated and consumed worldwide. Several criteria are used for trade category classification, one of the most important being the sensory assessment of negative defects present in the olives and brines. The trade category quality classification must follow the International Olive Council directives, requiring the organoleptic assessment of defects by a trained sensory panel. However, the training process is a hard, complex and sometimes subjective task, and the low number of samples that can be evaluated per day is a major drawback considering the real needs of the olive industry. In this context, the development of electronic tongues as taste sensors for sensory evaluation of defects is of utmost relevance. So, an electronic tongue was used for table olives classification according to the presence and intensity of negative defects. Linear discriminant models were established based on sub-sets of sensor signals selected by a simulated annealing algorithm. The predictive potential of the novel approach was first demonstrated for standard solutions of chemical compounds that mimic butyric, putrid and zapateria defects (≥93% in cross-validation procedures). Then its applicability was verified using reference table olive/brine solution samples identified with a single intense negative attribute, namely butyric, musty, putrid, zapateria or winey-vinegary defects (≥93% in cross-validation procedures). Finally, the E-tongue coupled with the same chemometric approach was applied to classify table olive samples according to the trade commercial categories (extra, 1st choice, 2nd choice and unsuitable for consumption) and an additional quality category (extra free of defects), established based on sensory analysis data. Despite the heterogeneity of the samples studied and the number of different sensory defects perceived, the predictive linear discriminant model established showed sensitivities greater than 86%. Overall, the performance achieved showed that the electrochemical device could be used as a taste sensor for successful organoleptic trade classification of table olives, allowing a preliminary quality assessment, which could facilitate, in the future, the complex task of sensory panelists. Copyright © 2016 Elsevier B.V. All rights reserved.
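The simulated-annealing selection of sensor sub-sets can be sketched as below. This is not the authors' implementation: the separability score is a crude stand-in for the cross-validated linear discriminant performance they actually optimize, and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def separability(X, y, subset):
    """Between-class over within-class scatter on the chosen sensor columns.
    A crude proxy for cross-validated LDA accuracy."""
    Xs = X[:, subset]
    overall = Xs.mean(axis=0)
    s_between = s_within = 0.0
    for c in np.unique(y):
        Xc = Xs[y == c]
        s_between += len(Xc) * np.sum((Xc.mean(axis=0) - overall) ** 2)
        s_within += np.sum((Xc - Xc.mean(axis=0)) ** 2)
    return s_between / (s_within + 1e-9)

def anneal_select(X, y, k, steps=500, t0=1.0):
    """Simulated annealing over k-sensor subsets of X's columns."""
    n_sensors = X.shape[1]
    cur = rng.choice(n_sensors, size=k, replace=False)
    cur_score = separability(X, y, cur)
    best, best_score = cur.copy(), cur_score
    for i in range(steps):
        temp = t0 * (1.0 - i / steps) + 1e-3             # cooling schedule
        cand = cur.copy()
        cand[rng.integers(k)] = rng.integers(n_sensors)  # swap one sensor
        if len(set(cand.tolist())) < k:
            continue                                     # skip duplicate picks
        score = separability(X, y, cand)
        if score > cur_score or rng.random() < np.exp((score - cur_score) / temp):
            cur, cur_score = cand, score
            if score > best_score:
                best, best_score = cand.copy(), score
    return best, best_score
```

Early on, the high temperature lets the search accept worse subsets and escape local optima; as the temperature cools, it settles on a high-scoring sensor combination.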

  16. Student Reasoning from Data Tables: Data Interpretation in Light of Student Ability and Prior Belief

    NASA Astrophysics Data System (ADS)

    Bogdan, Abigail Marie

Here I present my work studying introductory physics students' proficiency with the control of variables strategy for evaluating simple data tables. In this research, a primary goal was to identify and describe the reasoning strategies that students use preferentially when evaluating simple data tables where the control of variables strategy is the normative evaluation strategy. In addition, I aimed to identify and describe the factors that affect students' reasoning strategies when analyzing these simple data tables. In a series of experiments, I tested 1,360 introductory physics students, giving them simple tables of experimental data to analyze. Generally, each of the experiments that I conducted had two conditions. In both conditions, the data filling the tables were identical; however, in the first condition, the data table was presented in a physical context and students were given a short pre-test to measure their beliefs about the context. In the second condition, the table was given in a more generic context. This was repeated with multiple data tables and physical contexts. In addition to the data table task, students were given several measures of cognitive ability. By using students' answers on the pre-test about physical context, I was able to measure whether or not each student's prior beliefs were consistent with the relationships shown in the data tables. Across all the experiments conducted here, I found that students whose prior beliefs were consistent with the data were over three times more likely to draw a valid inference from the table than students whose prior beliefs were inconsistent with the data. By further analyzing students' responses, I found evidence that this difference in performance could be accounted for by the presence of a belief bias. Students tended to cite data in suboptimal ways, frequently treating their own theories as a source of evidence to be supplemented by or illustrated with examples from the data. Because of this tendency to hunt piecemeal through the tables for supporting examples, contradictory data were often simply overlooked. However, even when noticed, data that contradicted their theories were often ignored, misinterpreted to conform, or discounted in some way.

  17. Developing a Long Short-Term Memory (LSTM) based model for predicting water table depth in agricultural areas

    NASA Astrophysics Data System (ADS)

    Zhang, Jianfeng; Zhu, Yan; Zhang, Xiaoping; Ye, Ming; Yang, Jinzhong

    2018-06-01

Predicting water table depth over the long term in agricultural areas presents great challenges because these areas have complex and heterogeneous hydrogeological characteristics, boundary conditions, and human activities, with nonlinear interactions among these factors. Therefore, a new time series model based on Long Short-Term Memory (LSTM) was developed in this study as an alternative to computationally expensive physical models. The proposed model is composed of an LSTM layer with a fully connected layer on top of it, with a dropout method applied in the first LSTM layer. In this study, the proposed model was applied and evaluated in five sub-areas of the Hetao Irrigation District in arid northwestern China using 14 years of data (2000-2013). The proposed model uses monthly water diversion, evaporation, precipitation, temperature, and time as input data to predict water table depth. A simple but effective standardization method was employed to pre-process the data so that they were on the same scale. The 14 years of data were separated into a training set (2000-2011) and a validation set (2012-2013). As expected, the proposed model achieves higher R² scores (0.789-0.952) in water table depth prediction than a traditional feed-forward neural network (FFNN), which only reaches relatively low R² scores (0.004-0.495), showing that the proposed model can preserve and learn from previous information well. Furthermore, the validity of the dropout method and of the proposed model's architecture are discussed. The experimental results show that the dropout method can prevent overfitting significantly. In addition, comparisons between the R² scores of the proposed model and a Double-LSTM model (R² scores ranging from 0.170 to 0.864) further show that the proposed model's architecture is reasonable and contributes to a strong learning ability on time series data. Thus, the proposed model can serve as an alternative approach for predicting water table depth, especially in areas where hydrogeological data are difficult to obtain.
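The architecture described above, an LSTM layer feeding a fully connected output, can be illustrated with a minimal numpy forward pass. This untrained sketch with random weights is not the authors' model; training and the dropout step (which acts only during training) are omitted:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal forward pass: one LSTM layer plus a linear output layer,
    mirroring the described architecture. Weights are random placeholders."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((4 * n_hidden, n_in + n_hidden)) * 0.1
        self.b = np.zeros(4 * n_hidden)
        self.w_out = rng.standard_normal(n_hidden) * 0.1
        self.n_hidden = n_hidden

    def forward(self, seq):
        """seq: array of shape (T, n_in) -- e.g. monthly water diversion,
        evaporation, precipitation, temperature, and time features."""
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        for x in seq:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, o, g = np.split(z, 4)                  # gate pre-activations
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g) # cell state carries memory
            h = sigmoid(o) * np.tanh(c)
        return float(self.w_out @ h)                     # predicted depth
```

The cell state c is what lets the network "preserve and learn from previous information" across months, the property the abstract credits for the gap over the FFNN.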

  18. Damage Assessment of a Full-Scale Six-Story wood-frame Building Following Triaxial shake Table Tests

    Treesearch

    John W. van de Lindt; Rakesh Gupta; Shiling Pei; Kazuki Tachibana; Yasuhiro Araki; Douglas Rammer; Hiroshi Isoda

    2012-01-01

    In the summer of 2009, a full-scale midrise wood-frame building was tested under a series of simulated earthquakes on the world's largest shake table in Miki City, Japan. The objective of this series of tests was to validate a performance-based seismic design approach by qualitatively and quantitatively examining the building's seismic performance in terms of...

  19. Improved Hydrology over Peatlands in a Global Land Modeling System

    NASA Technical Reports Server (NTRS)

    Bechtold, M.; Delannoy, G.; Reichle, R.; Koster, R.; Mahanama, S.; Roose, Dirk

    2018-01-01

    Peatlands of the Northern Hemisphere represent an important carbon pool that mainly accumulated since the last ice age under permanently wet conditions in specific geological and climatic settings. The carbon balance of peatlands is closely coupled to water table dynamics. Consequently, the future carbon balance over peatlands is strongly dependent on how hydrology in peatlands will react to changing boundary conditions, e.g. due to climate change or regional water level drawdown of connected aquifers or streams. Global land surface modeling over organic-rich regions can provide valuable global-scale insights on where and how peatlands are in transition due to changing boundary conditions. However, the current global land surface models are not able to reproduce typical hydrological dynamics in peatlands well. We implemented specific structural and parametric changes to account for key hydrological characteristics of peatlands into NASA's GEOS-5 Catchment Land Surface Model (CLSM, Koster et al. 2000). The main modifications pertain to the modeling of partial inundation, and the definition of peatland-specific runoff and evapotranspiration schemes. We ran a set of simulations on a high performance cluster using different CLSM configurations and validated the results with a newly compiled global in-situ dataset of water table depths in peatlands. The results demonstrate that an update of soil hydraulic properties for peat soils alone does not improve the performance of CLSM over peatlands. However, structural model changes for peatlands are able to improve the skill metrics for water table depth. The validation results for the water table depth indicate a reduction of the bias from 2.5 to 0.2 m, and an improvement of the temporal correlation coefficient from 0.5 to 0.65, and from 0.4 to 0.55 for the anomalies. Our validation data set includes both bogs (rain-fed) and fens (ground and/or surface water influence) and reveals that the metrics improved less for fens. 
In addition, a comparison of evapotranspiration and soil moisture estimates over peatlands will be presented, albeit only with limited ground-based validation data. We will discuss strengths and weaknesses of the new model by focusing on time series of specific validation sites.
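The skill metrics quoted above (bias, temporal correlation, and anomaly correlation) can be computed as follows. The monthly-climatology definition of anomalies is an assumption, since the abstract does not specify how the anomalies were formed:

```python
import numpy as np

def bias(sim, obs):
    """Mean error of simulated minus observed water table depth."""
    return float(np.mean(np.asarray(sim) - np.asarray(obs)))

def temporal_corr(sim, obs):
    """Pearson correlation of the two time series."""
    return float(np.corrcoef(sim, obs)[0, 1])

def anomaly_corr(sim, obs, period=12):
    """Correlation after removing each series' mean seasonal cycle.
    The per-month climatology used here is an assumed definition."""
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)

    def anomalies(x):
        clim = np.array([x[i::period].mean() for i in range(period)])
        return x - clim[np.arange(len(x)) % period]

    return temporal_corr(anomalies(sim), anomalies(obs))
```

Reporting both raw and anomaly correlations, as the abstract does, separates skill at reproducing the seasonal cycle from skill at reproducing year-to-year departures from it.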

  20. Interpolations of groundwater table elevation in dissected uplands.

    PubMed

    Chung, Jae-won; Rogers, J David

    2012-01-01

The variable elevation of the groundwater table in the St. Louis area was estimated using multiple linear regression (MLR), ordinary kriging, and cokriging as part of a regional program seeking to assess liquefaction potential. Surface water features were used to determine the minimum water table for MLR and to supplement the principal variables for ordinary kriging and cokriging. By evaluating the known depth to water and the minimum water table elevation, the MLR analysis approximates the groundwater elevation for a contiguous hydrologic system. Ordinary kriging and cokriging estimate values in unsampled areas by calculating the spatial relationships between the unsampled and sampled locations. In this study, ordinary kriging did not incorporate topographic variations as an independent variable, while cokriging included topography as a supporting covariable. Cross validation suggests that cokriging provides a more reliable estimate at known data points with less uncertainty than the other methods. Profiles extending through the dissected uplands terrain suggest that: (1) the groundwater table generated by MLR mimics the ground surface and yields an exaggerated interpolation of groundwater elevation; (2) the groundwater table estimated by ordinary kriging tends to ignore local topography and exhibits oversmoothing of the actual undulations in the water table; and (3) cokriging appears to give a realistic water surface, which rises and falls in proportion to the overlying topography. The authors concluded that cokriging provided the most realistic estimate of the groundwater surface, which is the key variable in assessing soil liquefaction potential in unconsolidated sediments. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
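The MLR step can be sketched with ordinary least squares. The two predictors below (ground-surface elevation and the minimum water table from surface-water features) follow the abstract, but the study's exact regressors may differ, so treat this as an illustrative form:

```python
import numpy as np

def fit_mlr(ground_elev, min_wt_elev, obs_wt_elev):
    """Least-squares fit of observed water-table elevation against
    ground-surface elevation and the minimum water table implied by
    nearby surface-water features."""
    A = np.column_stack([np.ones(len(ground_elev)), ground_elev, min_wt_elev])
    coef, *_ = np.linalg.lstsq(A, obs_wt_elev, rcond=None)
    return coef

def predict_mlr(coef, ground_elev, min_wt_elev):
    """Apply the fitted coefficients at unsampled locations."""
    A = np.column_stack([np.ones(len(ground_elev)), ground_elev, min_wt_elev])
    return A @ coef
```

Because the prediction is a fixed linear function of ground elevation, this surface necessarily "mimics the ground surface," which is exactly the exaggeration the profile comparison in the abstract reports.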

  1. Integration and Validation of Avian Radars (IVAR)

    DTIC Science & Technology

    2011-07-01

    IVAR study locations. ‘X’ indicates the specified type of demonstration was conducted at that location; ‘B’ indicates a potential Back-up locations... nation two radar with parabolic dishes tilted nsor (AR-1; specifically The initial deployment of avian radar systems at a civil airport was completed...ARTI. 45 Table 4-1. Summary of the IVAR study locations. ‘X’ indicates the specified type of demonstration was conducted at that

  2. The Utility of the SASSI-3 in Early Detection of Substance Use Disorders in Not Guilty by Reason of Insanity Acquittees: An Exploratory Study

    ERIC Educational Resources Information Center

    Wright, Ervin E., II; Piazza, Nick J.; Laux, John M.

    2008-01-01

    Previous studies have shown the Substance Abuse Subtle Screening Inventory-3 (G. Miller, 1999) to be valid in classifying substance use disorders in forensic and mentally ill populations. The authors found that it also correctly classified substance use disorders in the understudied not guilty by reason of insanity population. (Contains 3 tables.)

  3. Experimental Validation of the Butyl-Rubber Finite Element (FE) Material Model for the Blast-Mitigating Floor Mat

    DTIC Science & Technology

    2015-08-01

    Analysis ( FEA ) results of each FE-material model, and the errors in each material model are discussed on various metrics. 15. SUBJECT TERMS ESEP... FEAs ...................................................................... 9 Fig. 8 Velocity histories on the loading table in FEAs for 4-millisecond...10 Fig. 9 Velocity histories on the loading table in FEAs for 8-msec-pulse loading ................... 10 Fig. 10 Velocity histories on

  4. A Validity Review of the Color Company Competition at the United States Naval Academy

    DTIC Science & Technology

    2006-06-01

    Joy of Wading: Leadership and Team Working in Swampy Conditions. The Mental Health Review, 9 (3), 35-41. Polk, C. J. (2003). Effective ...morale question indicated negative effects . Question 3 (The midshipmen chain of command is working hard to make my company the best in the Brigade...49 Table 6. Color Company Designation Effects on Academic Performance .......................................52 Table 7. Brigade Climate Survey

  5. Sexual Assault Prevention and Response Climate DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    DEOCS, (7) examining variance and descriptive statistics (8) examining the relationship among items/areas to reduce multicollinearity, and (9...selecting items that demonstrate the strongest scale properties. Included is a review of the 4.0 description and items, followed by the proposed...Tables 1 – 7 for the description of each measure and corresponding items. Table 1. DEOCS 4.0 Perceptions of Safety Measure Description

  6. Validation of the Information/Communications Technology Literacy Test

    DTIC Science & Technology

    2016-10-01

    nested set. Table 11 presents the results of incremental validity analyses for job knowledge/performance criteria by MOS. Figure 7 presents much...Systems Operator-Analyst (25B) and Nodal Network Systems Operator-Maintainer (25N) MOS. This report documents technical procedures and results of the...research effort. Results suggest that the ICTL test has potential as a valid and highly efficient predictor of valued outcomes in Signal school MOS. Not

  7. Apparent and internal validity of a Monte Carlo-Markov model for cardiovascular disease in a cohort follow-up study.

    PubMed

    Nijhuis, Rogier L; Stijnen, Theo; Peeters, Anna; Witteman, Jacqueline C M; Hofman, Albert; Hunink, M G Myriam

    2006-01-01

To determine the apparent and internal validity of the Rotterdam Ischemic heart disease & Stroke Computer (RISC) model, a Monte Carlo-Markov model designed to evaluate the impact of cardiovascular disease (CVD) risk factors and their modification on life expectancy (LE) and cardiovascular disease-free LE (DFLE) in a general population (hereinafter referred to together as (DF)LE). The model is based on data from the Rotterdam Study, a cohort follow-up study of 6871 subjects aged 55 years and older who visited the research center for risk factor assessment at baseline (1990-1993) and completed a follow-up visit 7 years later (original cohort). The transition probabilities and risk factor trends used in the RISC model were based on data from 3501 subjects (the study cohort). To validate the RISC model, the number of simulated CVD events during 7 years' follow-up was compared with the observed number of events in the study cohort and the original cohort, respectively, and simulated (DF)LEs were compared with the (DF)LEs calculated from multistate life tables. Both in the study cohort and in the original cohort, the simulated distribution of CVD events was consistent with the observed number of events (CVD deaths: 7.1% v. 6.6% and 7.4% v. 7.6%, respectively; non-CVD deaths: 11.2% v. 11.5% and 12.9% v. 13.0%, respectively). The distribution of (DF)LEs estimated with the RISC model consistently encompassed the (DF)LEs calculated with multistate life tables. The simulated events and (DF)LE estimates from the RISC model are consistent with observed data from a cohort follow-up study.
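The Monte Carlo-Markov mechanics behind a model like RISC can be sketched with a toy four-state cohort simulation. The states and annual transition probabilities below are hypothetical, not the RISC estimates from the Rotterdam Study:

```python
import numpy as np

STATES = ("well", "cvd", "cvd_death", "other_death")
# Hypothetical annual transition probabilities; each row must sum to 1.
P = np.array([
    [0.90, 0.05, 0.01, 0.04],   # from well
    [0.00, 0.85, 0.10, 0.05],   # from cvd
    [0.00, 0.00, 1.00, 0.00],   # cvd_death is absorbing
    [0.00, 0.00, 0.00, 1.00],   # other_death is absorbing
])

def mean_dfle(n_subjects=1000, years=7, seed=0):
    """Monte Carlo estimate of disease-free life expectancy over `years`,
    averaging the number of years each simulated subject spends 'well'."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n_subjects, dtype=int)          # everyone starts 'well'
    dfle = np.zeros(n_subjects)
    for _ in range(years):
        dfle += state == 0                           # count disease-free years
        state = np.array([rng.choice(4, p=P[s]) for s in state])
    return dfle.mean()
```

Validation of the kind the abstract describes amounts to comparing the simulated state counts after 7 cycles against the observed cohort events, and the simulated (DF)LE against multistate life-table values.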

  8. Experimental/analytical approaches to modeling, calibrating and optimizing shaking table dynamics for structural dynamic applications

    NASA Astrophysics Data System (ADS)

    Trombetti, Tomaso

    This thesis presents an Experimental/Analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by a 407 MTS controller which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques. 
The power spectral density of the system input and the cross power spectral density of the table input and output were estimated using Bartlett's spectral estimation method. The experimentally estimated table acceleration transfer functions obtained for different working conditions are correlated with their analytical counterparts. As a result of this comprehensive correlation study, a thorough understanding of the shaking table dynamics and its sensitivities to control and payload parameters is obtained. Moreover, the correlation study leads to a calibrated analytical model of the shaking table with high predictive ability. It is concluded that, in its present condition, the Rice shaking table is able to reproduce, with a high degree of accuracy, model earthquake acceleration time histories in the frequency bandwidth from 0 to 75 Hz. Furthermore, the exhaustive analysis performed indicates that the table transfer function is not significantly affected by the presence of a large (in terms of weight) payload with a fundamental frequency up to 20 Hz. Payloads having a higher fundamental frequency do significantly affect the shaking table performance and require a modification of the table control gain settings, which can be easily obtained using the predictive analytical model of the shaking table. The complete description of a structural dynamic experiment performed using the Rice shaking table facility is also reported herein. The objective of this experiment was twofold: (1) to verify the testing capability of the shaking table and (2) to experimentally validate a simplified theory developed by the author, which predicts the maximum rotational response developed by seismically isolated building structures characterized by non-coincident centers of mass and rigidity when subjected to strong earthquake ground motions.
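    The identification procedure described above can be sketched generically: estimate the auto- and cross power spectral densities by Bartlett's method (averaging periodograms over non-overlapping segments) and form the non-parametric estimate H(f) = Sxy(f)/Sxx(f). The segment length and signals below are placeholders, not the actual Rice shaking table data:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (adequate for short segments)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def bartlett_psd(x, y, seg_len):
    """Bartlett's method: average cross-periodograms over non-overlapping
    segments. With y == x this reduces to the (auto) power spectral density."""
    n_seg = len(x) // seg_len
    s = [0j] * seg_len
    for i in range(n_seg):
        xk = dft(x[i * seg_len:(i + 1) * seg_len])
        yk = dft(y[i * seg_len:(i + 1) * seg_len])
        for k in range(seg_len):
            s[k] += xk[k].conjugate() * yk[k] / seg_len
    return [v / n_seg for v in s]

def transfer_function(x, y, seg_len):
    """Estimate H(f) = Sxy(f) / Sxx(f) between the target (input) and the
    measured table acceleration (output). Assumes the input has power in
    every frequency bin, e.g. a broadband test signal."""
    sxx = bartlett_psd(x, x, seg_len)
    sxy = bartlett_psd(x, y, seg_len)
    return [sxy[k] / sxx[k] for k in range(seg_len)]
```

    In practice one would use an FFT and windowed, overlapping segments (Welch's method); the averaging over segments is what reduces the variance of the estimate.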

  9. The Effects of Acute Stress on Cognitive Performance. A Pilot Study

    DTIC Science & Technology

    2010-12-01

    Correlation Matrix – Stress Response Measures. ................................ 13  Table 4. Non-significant HRV Results...validity and reliability. The State-Trait Anxiety Index (or STAI; Spielberger and Sydeman, 1994) is a popular self-report instrument of this type...and provides distinct scores for state anxiety (a property of the situation) and trait anxiety (a property of the individual) using a 4-point rating

  10. Can Perceptuo-Motor Skills Assessment Outcomes in Young Table Tennis Players (7-11 years) Predict Future Competition Participation and Performance? An Observational Prospective Study.

    PubMed

    Faber, Irene R; Elferink-Gemser, Marije T; Faber, Niels R; Oosterveld, Frits G J; Nijhuis-Van der Sanden, Maria W G

    2016-01-01

    Forecasting future performance in youth table tennis players based on current performance is complex due to, among other things, differences between youth players in growth, development, maturity, context and table tennis experience. Talent development programmes might benefit from an assessment of underlying perceptuo-motor skills for table tennis, which is hypothesized to determine the players' potential concerning the perceptuo-motor domain. The Dutch perceptuo-motor skills assessment intends to measure the perceptuo-motor potential for table tennis in youth players by assessing the underlying skills crucial for developing technical and tactical qualities. Untrained perceptuo-motor tasks are used as these are suggested to represent a player's future potential better than specific sport skills themselves as the latter depend on exposure to the sport itself. This study evaluated the value of the perceptuo-motor skills assessment for a talent developmental programme by evaluating its predictive validity for competition participation and performance in 48 young table tennis players (7-11 years). Players were tested on their perceptuo-motor skills once during a regional talent day, and the subsequent competition results were recorded half-yearly over a period of 2.5 years. Logistic regression analysis showed that test scores did not predict future competition participation (p >0.05). Yet, the Generalized Estimating Equations analysis, including the test items 'aiming at target', 'throwing a ball', and 'eye-hand coordination' in the best fitting model, revealed that the outcomes of the perceptuo-motor skills assessment were significant predictors for future competition results (R2 = 51%). Since the test age influences the perceptuo-motor skills assessment's outcome, another multivariable model was proposed including test age as a covariate (R2 = 53%). 
This evaluation demonstrates promising prospects for the perceptuo-motor skills assessment to be included in a talent development programme. Future studies are needed to clarify the predictive value in a larger sample of youth competition players over a longer period of time.

  11. Validation studies of the DOE-2 Building Energy Simulation Program. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, R.; Winkelmann, F.

    1998-06-01

    This report documents many of the validation studies (Table 1) of the DOE-2 building energy analysis simulation program that have taken place since 1981. Results for several versions of the program are presented, with the most recent study conducted in 1996 on version DOE-2.1E and the earliest study conducted in 1981 on version DOE-1.3. This work is part of an effort related to continued development of DOE-2, particularly in its use as a simulation engine for new specialized versions of the program such as the recently released RESFEN 3.1. RESFEN 3.1 is a program specifically dealing with analyzing the energy performance of windows in residential buildings. The intent in providing the results of these validation studies is to give potential users of the program a high degree of confidence in the calculated results. Validation studies in which calculated simulation data are compared to measured data have been conducted throughout the development of the DOE-2 program. Discrepancies discovered during the course of such work have resulted in improvements in the simulation algorithms. Table 2 provides a listing of additions and modifications that have been made to various versions of the program since version DOE-2.1A. One of the most significant recent changes in the program occurred with version DOE-2.1E: an improved algorithm for calculating the outside surface film coefficient was implemented. In addition, integration of the WINDOW 4 program was accomplished, resulting in improved ability to analyze window energy performance. Validation and verification of a program as sophisticated as DOE-2 must necessarily be limited because of the approximations inherent in the program. For example, the most accurate model of the heat transfer processes in a building would include a three-dimensional analysis. 
To justify such detailed algorithmic procedures would correspondingly require detailed information describing the building and/or HVAC system and energy plant parameters. Until building simulation programs can get this data directly from CAD programs, such detail would negate the usefulness of the program for the practicing engineers and architects who currently use it. In addition, the validation studies discussed herein indicate that such detail is really unnecessary. The comparison of calculated and measured quantities has resulted in a level of confidence sufficient for continued use of the DOE-2 program. However, additional validation is warranted, particularly at the component level, to further improve the program.

  12. Path Planning for Reduced Identifiability of Unmanned Surface Vehicles Conducting Intelligence, Surveillance, and Reconnaissance

    DTIC Science & Technology

    2017-05-22

    angular velocity values Figure 33: Feasibility test Figure 34: Bellman’s Principle Figure 35: Bellman’s Principle validation Minimum Figure 36...Distribution of at test point for simulated ISR traffic Figure 48: PDFs of observed and ISR traffic Table 2: Adversary security states at test point #10...Figure 49: Hypothesis testing at test point #10 Figure 50: Distribution of for observed traffic Figure 51: Distribution of for ISR traffic Table 3

  13. Corrigendum to "Geant4 validation of neutron production on thick targets bombarded with 120 GeV protons" [Nucl. Instr. Meth. B 358 (2015) 245-250]

    NASA Astrophysics Data System (ADS)

    Sabra, Mohammad S.

    2016-09-01

    In the paper by Mohammad S. Sabra, due to a mix-up, incorrect calculations of NEPR ratios, normalized to 20 cm-thick copper, for 40 cm and 60 cm-thick copper at 30° for QGSP-BIC, QGSP-BERT, QGSP-INCLXX, and SHIELDING were published in Table 2. The correct values are listed in the revised Table 2 below.

  14. External Validation of the Updated Partin Tables in a Cohort of French and Italian Men

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhojani, Naeem; Department of Urology, University of Montreal, Montreal, PQ; Salomon, Laurent

    2009-02-01

    Purpose: To test the discrimination and calibration properties of the newly developed 2007 Partin Tables in two European cohorts with localized prostate cancer. Methods: Data on clinical and pathologic characteristics were obtained for 1,064 men treated with radical prostatectomy at the Creteil University Health Center in France (n = 839) and at the Milan University Vita-Salute in Italy (n = 225). Overall discrimination was assessed with receiver operating characteristic curve analysis, which quantified the accuracy of stage predictions for each center. Calibration plots graphically explored the relationship between predicted and observed rates of extracapsular extension (ECE), seminal vesicle invasion (SVI), and lymph node invasion (LNI). Results: The rates of ECE, SVI, and LNI were 28%, 14%, and 2% in the Creteil cohort vs. 11%, 5%, and 5% in the Milan cohort. In the Creteil cohort, the accuracy of ECE, SVI, and LNI prediction was 61%, 71%, and 82%, vs. 66%, 92%, and 75% in the Milan cohort. Important departures were recorded between the Partin Tables' predicted and observed rates of ECE, SVI, and LNI within both cohorts. Conclusions: The 2007 Partin Tables demonstrated worse performance in European men than they originally did in North American men. This indicates that predictive models need to be externally validated before their implementation into clinical practice.
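    The discrimination measure used above (accuracy of stage predictions from ROC analysis) is the area under the ROC curve, which can be computed directly from predicted probabilities via its Mann-Whitney interpretation. A minimal sketch with illustrative scores, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case (e.g. a patient with
    ECE at prostatectomy) receives a higher predicted probability than a
    randomly chosen negative case. Ties count as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical Partin-style predicted probabilities of ECE, split by the
# observed pathology outcome:
auc = roc_auc([0.62, 0.45, 0.81], [0.20, 0.33, 0.50])
```

    An AUC of 0.5 means no discrimination and 1.0 perfect discrimination; the 61-82% figures above are values of this statistic. Calibration is a separate property, assessed by comparing predicted with observed event rates in probability bins.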

  15. A New Approach to Simulate Groundwater Table Dynamics and Its Validation in China

    NASA Astrophysics Data System (ADS)

    Lv, M.; Lu, H.; Dan, L.; Yang, K.

    2017-12-01

    Groundwater plays a very important role in hydrology-climate-human activity interactions, but groundwater table dynamics are currently not well simulated in global-scale land surface models. Meanwhile, almost all groundwater schemes adopt a specific yield method to estimate the groundwater table, in which determining the proper specific yield value remains a big challenge. In this study, we developed a Soil Moisture Correlation (SMC) method to simulate groundwater table dynamics. We coupled SMC with a hydrological model (named NEW) and compared it with the original model, in which a specific yield method is used (named CTL). Both NEW and CTL were tested in the Tangnaihai Subbasin of the Yellow River and the Jialingjiang Subbasin along the Yangtze River, where groundwater is less impacted by human activities. The discharges simulated by NEW and CTL were compared against gauge observations. The comparison reveals that, after calibration, both models are able to reproduce the discharge well. However, no parameter needs to be calibrated for SMC, which indicates that the SMC method is more efficient and easier to use than the specific yield method. Since there is no direct groundwater table observation in these two basins, the simulated groundwater tables were compared with a global data set provided by Fan et al. (2013). Both NEW and CTL estimate lower depths than Fan et al. Moreover, when comparing the variation of terrestrial water storage (TWS) derived from NEW with that observed by GRACE, good agreement was confirmed. This demonstrates that the SMC method is able to reproduce groundwater level dynamics reliably.

  16. WEC-SIM Validation Testing Plan FY14 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley Michelle

    2016-02-01

    The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and developed working ideas on testing details such as scaling, device design, and test conditions.

  17. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro-mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution for monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five-degree-of-freedom scaled test structure mounted upon a shaking table is employed for system validation.

  18. Fluorescent Lamp Replacement Study

    DTIC Science & Technology

    2017-07-01

    DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 The public reporting burden for this collection of information is estimated to average 1 hour per...information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ORGANIZATION. 1. REPORT DATE (DD-MM...Unclassified 19b. TELEPHONE NUMBER (Include area code) Standard Form 298 (Rev. 8/98) Prescribed by ANSI Std. Z39.18 TABLE OF CONTENTS

  19. Development of student performance assessment based on scientific approach for a basic physics practicum in simple harmonic motion materials

    NASA Astrophysics Data System (ADS)

    Serevina, V.; Muliyati, D.

    2018-05-01

    This research aims to develop a student performance assessment instrument, based on the scientific approach, that is valid and reliable in assessing the performance of students in a basic physics practicum on Simple Harmonic Motion (SHM). This study uses the ADDIE model, consisting of the stages Analyze, Design, Development, Implementation, and Evaluation. The student performance assessment developed can be used to measure students' skills in observing, asking, conducting experiments, associating, and communicating experimental results, which are the '5M' stages in a scientific approach. Each assessment item in the instrument was validated by an instrument expert, with all items judged eligible for use (100% eligibility). The instrument was then tested for the quality of its construction, material, and language by a panel of lecturers, with the results: construction aspect 85% (very good), material aspect 87.5% (very good), and language aspect 83% (very good). The small-group trial yielded an instrument reliability of 0.878, in the high category, where the r-table value is 0.707. The large-group trial yielded an instrument reliability of 0.889, also in the high category, where the r-table value is 0.320. The instrument was declared valid and reliable at the 5% significance level. Based on these results, it can be concluded that the student performance assessment instrument based on the scientific approach is valid and reliable for assessing student skills in SHM experimental activities.
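    The reliability decision described above — compare a computed coefficient against the critical r-table value for the given sample size and significance level — can be sketched generically. A test-retest Pearson correlation is used here as a stand-in, since the abstract does not state which reliability coefficient was computed:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists
    (e.g. two raters' scores, or test and retest scores)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def is_reliable(r, r_table):
    """Declare the instrument reliable when the computed coefficient
    exceeds the critical value (r-table) for the chosen significance
    level and sample size."""
    return r > r_table
```

    With the figures reported above, `is_reliable(0.878, 0.707)` and `is_reliable(0.889, 0.320)` both hold, which is exactly the "reliable at the 5% significance level" conclusion.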

  20. Interference tables: a useful model for interference analysis in asynchronous multicarrier transmission

    NASA Astrophysics Data System (ADS)

    Medjahdi, Yahia; Terré, Michel; Ruyet, Didier Le; Roviras, Daniel

    2014-12-01

    In this paper, we investigate the impact of timing asynchronism on the performance of multicarrier techniques in a spectrum coexistence context. Two multicarrier schemes are considered: cyclic prefix-based orthogonal frequency division multiplexing (CP-OFDM) with a rectangular pulse shape, and filter bank-based multicarrier (FBMC) with the physical layer for dynamic spectrum access and cognitive radio (PHYDYAS) and isotropic orthogonal transform algorithm (IOTA) waveforms. First, we present the general concept of the so-called power spectral density (PSD)-based interference tables, which are commonly used for multicarrier interference characterization in a spectrum sharing context. After highlighting the limits of this approach, we propose a new family of interference tables called 'instantaneous interference tables'. The proposed tables give the interference power caused by a given interfering subcarrier on a victim one, not only as a function of the spectral distance separating the two subcarriers but also with respect to the timing misalignment between the subcarrier holders. In contrast to the PSD-based interference tables, the accuracy of the proposed tables has been validated through different simulation results. Furthermore, due to the better frequency localization of both the PHYDYAS and IOTA waveforms, the FBMC technique is demonstrated to be more robust to timing asynchronism than OFDM. Such a result makes FBMC a potential candidate for the physical layer of future cognitive radio systems.

  1. Double-crested Cormorant Management Plan to Reduce Predation of Juvenile Salmonids in the Columbia River Estuary

    DTIC Science & Technology

    2015-01-01

    Table ES-3): TABLE ES-3. Affected Environment. Affect ed Resource Summary Vegetation A mix of native a nd non-native plant species is fou nd on t...from British Columbia to California and east to the Continental Divide. Although the western population of double-crested cormorants composes a small ...no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB control

  2. Validation and Interrogation of Differentially Expressed and Alternatively Spliced Genes in African American Prostate Cancer

    DTIC Science & Technology

    2016-10-01

    These analyses have led to two submitted manuscripts. The first manuscript, “Variants of stemness -related genes predicted to regulate RNA splicing...and Table 1-3 at the end of this progress report. The second manuscript, “Single nucleotide polymorphisms of stemness pathway genes predicted to...cancer and support a contribution of the stemness pathway to prostate cancer patient outcome. Please see Figure 5-7 and Table 4-6 at the end of this

  3. SU-F-T-365: Clinical Commissioning of the Monaco Treatment Planning System for the Novalis Tx to Deliver VMAT, SRS and SBRT Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adnani, N

    Purpose: To commission the Monaco Treatment Planning System for the Novalis Tx machine. Methods: The commissioning of the Monte Carlo (MC), Collapsed Cone (CC) and electron Monte Carlo (eMC) beam models was performed through a series of measurements and calculations in medium and in water. In-medium measurements relied on the Octavius 4D QA system, with the 1000 SRS detector array for field sizes less than 4 cm × 4 cm and the 1500 detector array for larger field sizes. Heterogeneity corrections were validated using a custom-built phantom. Prior to clinical implementation, end-to-end testing of prostate and H&N VMAT plans was performed. Results: Using a 0.5% uncertainty and 2 mm grid size, Tables I and II summarize the MC validation at 6 MV and 18 MV in both medium and water. Tables III and IV show similar comparisons for CC. Using the custom heterogeneity phantom setup of Figure 1 and the IGRT guidance summarized in Figure 2, Table V lists the percent pass rate for a 2%, 2 mm gamma criterion at 6 and 18 MV for both MC and CC. The relationship between the MC calculation settings (uncertainty and grid size) and the gamma passing rate for a prostate and an H&N case is shown in Table VI. Table VII lists the results of the eMC calculations compared to measured data for clinically available applicators, and Table VIII for small field cutouts. Conclusion: MU calculations using MC are highly sensitive to uncertainty and grid size settings; the difference can be of the order of several percent. MC is superior to CC for small fields and when using heterogeneity corrections, regardless of field size, making it more suitable for SRS, SBRT and VMAT deliveries. eMC showed good agreement with measurements down to a 2 cm × 2 cm field size.
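    The "2%, 2 mm gamma criterion" above combines a dose-difference tolerance with a distance-to-agreement tolerance into a single pass/fail metric per point. A minimal one-dimensional, global-normalization sketch (clinical QA software operates on 2-D/3-D dose grids with interpolation and dose thresholds):

```python
def gamma_pass_rate(ref, meas, spacing_mm, dose_crit, dist_mm):
    """1-D global gamma analysis sketch: for every measured point, search
    all reference points for the minimum combined dose-difference /
    distance-to-agreement metric; a point passes when gamma <= 1.
    `ref` and `meas` are dose profiles sampled every `spacing_mm`;
    `dose_crit` is a fraction of the global maximum dose (0.02 for 2%),
    `dist_mm` the distance-to-agreement tolerance (2.0 for 2 mm)."""
    d_max = max(ref)
    passed = 0
    for i, dm in enumerate(meas):
        gamma_sq = min(
            ((dr - dm) / (dose_crit * d_max)) ** 2
            + ((j - i) * spacing_mm / dist_mm) ** 2
            for j, dr in enumerate(ref)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return 100.0 * passed / len(meas)
```

    A point can pass either because its dose is within 2% of a nearby reference dose or because a matching dose exists within 2 mm; the reported pass rate is the percentage of points with gamma at or below 1.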

  4. NRL Hyperspectral Imagery Trafficability Tool (HITT): Software andSpectral-Geotechnical Look-up Tables for Estimation and Mapping of Soil Bearing Strength from Hyperspectral Imagery

    DTIC Science & Technology

    2012-09-28

    spectral-geotechnical libraries and models developed during remote sensing and calibration/ validation campaigns conducted by NRL and collaborating...geotechnical libraries and models developed during remote sensing and calibration/ validation campaigns conducted by NRL and collaborating institutions in four...2010; Bachmann, Fry, et al, 2012a). The NRL HITT tool is a model for how we develop and validate software, and the future development of tools by

  5. Instantaneous and controllable integer ambiguity resolution: review and an alternative approach

    NASA Astrophysics Data System (ADS)

    Zhang, Jingyu; Wu, Meiping; Li, Tao; Zhang, Kaidong

    2015-11-01

    In the high-precision application of Global Navigation Satellite Systems (GNSS), integer ambiguity resolution is the key step in realizing precise positioning and attitude determination. As a necessary part of quality control, integer aperture (IA) ambiguity resolution provides the theoretical and practical foundation for ambiguity validation. It is mainly realized by acceptance testing. Due to the correlation between ambiguities, it is impossible to control the failure rate according to an analytical formula; hence, the fixed failure rate approach is implemented by Monte Carlo sampling. However, due to the characteristics of Monte Carlo sampling and look-up tables, a large amount of time is consumed if sufficient GNSS scenarios are included in the creation of the look-up table. This restricts the fixed failure rate approach to being a post-processing approach when a look-up table is not available. Furthermore, if not enough GNSS scenarios are considered, the table may only be valid for a specific scenario or application. Besides this, the method of creating a look-up table or look-up function still needs to be designed for each specific acceptance test. To overcome these problems in the determination of critical values, this contribution proposes an instantaneous and CONtrollable (iCON) IA ambiguity resolution approach for the first time. The iCON approach has the following advantages: (a) the critical value of the acceptance test is independently determined based on the required failure rate and the GNSS model, without resorting to external information such as a look-up table; (b) it can be realized instantaneously for most IA estimators that have analytical probability formulas (the stronger the GNSS model, the less time is consumed); (c) it provides a new viewpoint for improving research on IA estimation. To verify these conclusions, multi-frequency and multi-GNSS simulation experiments were implemented. 
The results show that IA estimators based on the iCON approach can realize controllable ambiguity resolution. Besides this, compared with the ratio test IA based on a look-up table, the difference test IA and IA least squares based on the iCON approach have, most of the time, higher success rates and better controllability of failure rates.

  6. Customized Clinical Practice Guidelines for Management of Adult Cataract in Iran

    PubMed Central

    Rajavi, Zhaleh; Javadi, Mohammad Ali; Daftarian, Narsis; Safi, Sare; Nejat, Farhad; Shirvani, Armin; Ahmadieh, Hamid; Shahraz, Saeid; Ziaei, Hossein; Moein, Hamidreza; Motlagh, Behzad Fallahi; Feizi, Sepehr; Foroutan, Alireza; Hashemi, Hassan; Hashemian, Seyed Javad; Jabbarvand, Mahmoud; Jafarinasab, Mohammad Reza; Karimian, Farid; Mohammad-Rabei, Hossein; Mohammadpour, Mehrdad; Nassiri, Nader; Panahi-Bazaz, Mahmoodreza; Rohani, Mohammad Reza; Sedaghat, Mohammad Reza; Sheibani, Kourosh

    2015-01-01

    Purpose: To customize clinical practice guidelines (CPGs) for cataract management in the Iranian population. Methods: First, four CPGs (American Academy of Ophthalmology 2006 and 2011, Royal College of Ophthalmologists 2010, and Canadian Ophthalmological Society 2008) were selected from the CPGs available in the literature for cataract management. All recommendations of these guidelines, together with their references, were studied. Each recommendation was summarized in four tables. The first table showed the recommendation itself, in clinical question components format, along with its level of evidence. The second table contained structured abstracts of supporting articles related to the clinical question, with their levels of evidence. The third table included the customized recommendation of the internal group, respecting its clinical advantage, cost, and complications. In the fourth table, the internal group scored their recommendations from 1 to 9 based on the customizing capability of the recommendation (applicability, acceptability, external validity). Finally, customized recommendations were sent one month prior to a consensus session to faculty members of all universities across the country, asking for their comments on the recommendations. Results: The agreed recommendations were accepted as conclusive, while those without agreement were discussed at the consensus session. Finally, all customized recommendations were codified as 80 recommendations, along with their sources and levels of evidence, for the Iranian population. Conclusion: Customization of CPGs for management of adult cataract for the Iranian population seems to be useful for standardization of referral, diagnosis and treatment of patients. PMID:27051491

  7. Groundwater influence on soil moisture memory and land-atmosphere interactions over the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Martinez-de la Torre, Alberto; Miguez-Macho, Gonzalo

    2017-04-01

    We investigate the memory introduced into soil moisture fields by the long timescales of groundwater variation in the semi-arid regions of the Iberian Peninsula with the LEAFHYDRO soil-vegetation-hydrology model, which includes a dynamic water table fully coupled to soil moisture and river flow via two-way fluxes. We select a 10-year period (1989-1998) with transitions from long-lasting wet to dry and back to wet conditions, and we carry out simulations at 2.5 km spatial resolution forced by ERA-Interim and a high-resolution precipitation analysis over Spain and Portugal. The model produces a realistic water table, which we validate with hundreds of water table depth observation time series (ranging from 4 to 10 years) over the Iberian Peninsula. Modeled river flow is also compared to observations. Over shallow water table regions, the results highlight the groundwater buffering effect on soil moisture fields over dry spells and long-term droughts, as well as the slow recovery of pre-drought soil wetness once climatic conditions turn wetter. Groundwater sustains river flow during dry summer periods. The longer-lasting wet conditions in the soil when groundwater is considered increase summer evapotranspiration, which is mostly water-limited. Our results suggest that groundwater interaction with soil moisture should be considered for seasonal climate forecasting and climate studies in general over water-limited regions where shallow water tables are significantly present and connected to land surface hydrology.

  8. Validation of asthma recording in electronic health records: a systematic review

    PubMed Central

    Nissen, Francis; Quint, Jennifer K; Wilkinson, Samantha; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J

    2017-01-01

    Objective To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies. Background Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential to use these databases for credible epidemiological asthma research. Methods We searched EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables. Results Thirteen studies met the inclusion criteria. Most studies demonstrated a high validity using at least one case definition (PPV >80%). Ten studies used a manual validation as the reference standard; each had at least one case definition with a PPV of at least 63%, up to 100%. We also found two studies using a second independent database to validate asthma diagnoses. The PPVs of the best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%. Conclusion Attaining high PPVs (>80%) is possible using each of the discussed validation methods. Identifying asthma cases in electronic health records is possible with high sensitivity, specificity or PPV, by combining multiple data sources, or by focusing on specific test measures. 
Studies testing a range of case definitions show wide variation in the validity of each definition, suggesting this may be important for obtaining asthma definitions with optimal validity. PMID:29238227
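    The validation statistics summarized in the review (sensitivity, specificity, PPV, NPV) all derive from the same 2×2 confusion matrix of database case definition against reference standard. A minimal sketch of those definitions (the counts are hypothetical, for illustration only):

    ```python
    def validation_stats(tp, fp, fn, tn):
        """Compute the four validation statistics from the cells of a
        2x2 confusion matrix (database case definition vs. reference
        standard): tp/fp/fn/tn = true/false positives/negatives."""
        return {
            "sensitivity": tp / (tp + fn),  # true cases the definition detects
            "specificity": tn / (tn + fp),  # non-cases it correctly excludes
            "ppv": tp / (tp + fp),          # chance a flagged record is a true case
            "npv": tn / (tn + fn),          # chance an unflagged record is a non-case
        }

    # Hypothetical counts, for illustration only:
    stats = validation_stats(tp=90, fp=10, fn=20, tn=880)
    ```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on how common asthma is in the database population, which is one reason the reviewed studies report such a wide range of PPVs across case definitions.
    
    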

  9. Estimation of water table level and nitrate pollution based on geostatistical and multiple mass transport models

    NASA Astrophysics Data System (ADS)

    Matiatos, Ioannis; Varouhakis, Emmanouil A.; Papadopoulou, Maria P.

    2015-04-01

    As the sustainable use of groundwater resources is a great challenge for many countries in the world, groundwater modeling has become a very useful and well established tool for studying groundwater management problems. Based on various methods used to numerically solve algebraic equations representing groundwater flow and contaminant mass transport, numerical models are mainly divided into Finite Difference-based and Finite Element-based models. The present study aims at evaluating the performance of a finite difference-based (MODFLOW-MT3DMS), a finite element-based (FEFLOW) and a hybrid finite element and finite difference (Princeton Transport Code-PTC) groundwater numerical models simulating groundwater flow and nitrate mass transport in the alluvial aquifer of Trizina region in NE Peloponnese, Greece. The calibration of groundwater flow in all models was performed using groundwater hydraulic head data from seven stress periods and the validation was based on a series of hydraulic head data for two stress periods in sufficient numbers of observation locations. The same periods were used for the calibration of nitrate mass transport. The calibration and validation of the three models revealed that the simulated values of hydraulic heads and nitrate mass concentrations coincide well with the observed ones. The models' performance was assessed by performing a statistical analysis of these different types of numerical algorithms. A number of metrics, such as Mean Absolute Error (MAE), Root Mean Square Error (RMSE), Bias, Nash Sutcliffe Model Efficiency (NSE) and Reliability Index (RI) were used allowing the direct comparison of models' performance. Spatiotemporal Kriging (STRK) was also applied using separable and non-separable spatiotemporal variograms to predict water table level and nitrate concentration at each sampling station for two selected hydrological stress periods. The predictions were validated using the respective measured values. 
    Maps of water table level and nitrate concentrations were produced and compared with those obtained from the groundwater and mass transport numerical models. Preliminary results showed that the spatiotemporal geostatistical method was similar in efficiency to the numerical models; however, its data requirements were significantly lower. The advantages and disadvantages of each method's performance were analysed and discussed, indicating the characteristics of the different approaches.
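    The comparison metrics named in the abstract (MAE, RMSE, Bias, NSE) are all computed from paired observed/simulated series. A minimal sketch of the standard formulas, not the authors' code:

    ```python
    import math

    def model_metrics(obs, sim):
        """MAE, RMSE, Bias and Nash-Sutcliffe Efficiency (NSE) for paired
        observed (obs) and simulated (sim) values, e.g. hydraulic heads."""
        n = len(obs)
        errors = [s - o for o, s in zip(obs, sim)]
        mae = sum(abs(e) for e in errors) / n
        rmse = math.sqrt(sum(e * e for e in errors) / n)
        bias = sum(errors) / n
        mean_obs = sum(obs) / n
        ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
        ss_tot = sum((o - mean_obs) ** 2 for o in obs)
        nse = 1.0 - ss_res / ss_tot  # 1 = perfect; < 0 = worse than the mean of obs
        return mae, rmse, bias, nse

    # Hypothetical heads, for illustration only:
    mae, rmse, bias, nse = model_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
    ```

    NSE is dimensionless and bounded above by 1, which is what allows the direct comparison of models with different error magnitudes.
    
    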

  10. Selection of Yeasts as Starter Cultures for Table Olives: A Step-by-Step Procedure

    PubMed Central

    Bevilacqua, Antonio; Corbo, Maria Rosaria; Sinigaglia, Milena

    2012-01-01

    The selection of yeasts intended as starters for table olives is a complex process, including a characterization step at laboratory level and a validation at lab level and factory-scale. The characterization at lab level deals with the assessment of some technological traits (growth under different temperatures and at alkaline pHs, effect of salt, and for probiotic strains the resistance to preservatives), enzymatic activities, and some new functional properties (probiotic traits, production of vitamin B-complex, biological debittering). The paper reports on these traits, focusing both on their theoretical implications and lab protocols; moreover, there are some details on predictive microbiology for yeasts of table olives and on the use of multivariate approaches to select suitable starters. PMID:22666220

  11. Tier One Performance Screen Initial Operational Test and Evaluation: 2011 Annual Report

    DTIC Science & Technology

    2013-01-01

    OPERATIONAL TEST AND EVALUATION: 2011 ANNUAL REPORT EXECUTIVE SUMMARY Research Requirement: In addition to educational, physical, and...34 Table 5.4. Incremental Validity Estimates for the TAPAS and TOPS Composite Scales over the AFQT for Predicting IMT Physical Fitness Criteria by...Validity Estimates for the TAPAS and TOPS Composite Scales over the AFQT for Predicting In-Unit Physical Fitness Criteria by Education Tier

  12. A Performance Management Framework for Civil Engineering

    DTIC Science & Technology

    1990-09-01

    cultural change. A non-equivalent control group design was chosen to augment the case analysis. Figure 3.18 shows the form of the quasi-experiment. The...The non-equivalent control group design controls the following obstacles to internal validity: history, maturation, testing, and instrumentation. The...and Stanley, 1963:48,50) Table 7. Validity of Quasi-Experiment The non-equivalent control group experimental design controls the following obstacles to

  13. Presentation of a High Resolution Time Lapse 3D Groundwater Model of Metsähovi for Calculating the Gravity Effect of Groundwater in Local Scale

    NASA Astrophysics Data System (ADS)

    Hokkanen, T. M.; Hartikainen, A.; Raja-Halli, A.; Virtanen, H.; Makinen, J.

    2015-12-01

    INTRODUCTION The aim of this study is to construct a fine-resolution time lapse groundwater (GW) model of Metsähovi (MH). GW, geological, and soil moisture (SM) data were collected for several years to achieve this goal. Knowledge of the behaviour of the GW at local scale is essential for the superconducting gravimeter (SG) investigations performed in MH. DESCRIPTION OF THE DATA Almost 50 sensors have recorded SM data for some 6 years at 1 to 5 minute sampling intervals. The GW table has been monitored, both in bedrock and in soil, in several stages with altogether 15 piezometers. Two geological sampling campaigns were conducted to characterize the hydrological properties of the soil in the 200×200 m2 study area around the SG station in MH. PRINCIPLE OF THE TIME LAPSE 3D HYDROGEOLOGICAL MODEL The model of the study site consists of the ground and bedrock surfaces gridded at 2×2 m2 resolution. The height of the GW table was interpolated to a 2×2×0.1 m3 grid between the GW and SM monitoring points. Close to the outline of the study site and in areas lacking sensors, the GW table was defined by extrapolation, taking the geological information of the area into account. The bedrock porosity is 2%, and the soil porosity, determined from geological information and SM recordings, ranges from 5 to 35%. Only fully saturated media are considered in the time lapse model; the unsaturated zone is excluded. BENEFITS With the new model the fluctuation of the GW table can be followed at time lapses ranging from 1 minute to 1 month. The gravity effect caused by the variation of the GW table can be calculated more accurately than before in MH. Moreover, the new model can be validated and refined by measured gravity, i.e. the hydrological model can be improved by SG recordings (Figure 1).
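    For a sense of the scale of the signal involved: the gravity effect of a water-table change is often estimated, to first order, with the infinite Bouguer slab formula Δg = 2πGρφΔh. This is a generic textbook approximation, not the gridded model described above; the porosity value below is illustrative (the upper end of the 5-35% soil range):

    ```python
    import math

    G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
    RHO_WATER = 1000.0  # density of water, kg m^-3

    def slab_gravity_effect_ugal(dh_m, porosity):
        """First-order (infinite Bouguer slab) gravity effect, in microgal,
        of a water-table rise of dh_m metres in a medium of the given
        porosity: dg = 2*pi*G*rho*phi*dh."""
        dg_ms2 = 2.0 * math.pi * G * RHO_WATER * porosity * dh_m
        return dg_ms2 * 1e8  # 1 m/s^2 = 1e8 microgal

    # e.g. a 1 m water-table rise in soil with 35% porosity
    effect = slab_gravity_effect_ugal(1.0, 0.35)
    ```

    A 1 m rise at 35% porosity yields roughly 15 microgal, well within the resolution of a superconducting gravimeter, which is why the local GW model matters for the SG record.
    
    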

  14. Effect of climate change on the irrigation and discharge scheme for winter wheat in Huaibei Plain, China

    NASA Astrophysics Data System (ADS)

    Zhu, Y.; Ren, L.; Lü, H.

    2017-12-01

    On the Huaibei Plain of Anhui Province, China, winter wheat (WW) is the most prominent crop. The study area has a transitional climate with a shallow water table. Climate variability in the region is already complex, and global warming makes it more so. The winter wheat growing period, from October to June, falls largely in the rainless season, so WW growth always depends in part on irrigation water. Under such complex climate change, rainfall varies between growing seasons and water table elevations vary as well; the water table therefore supplies a variable moisture exchange between soil water and groundwater, which affects the irrigation and discharge scheme for plant growth and yield. On the Huaibei Plain, environmental pollution is serious because of the agricultural use of chemical fertilizers, pesticides, herbicides, etc. To protect river water and groundwater from pollution, the irrigation and discharge scheme should be estimated accurately. Determining the irrigation and discharge scheme for winter wheat under climate change is therefore important for plant growth management decision-making. Based on field observations and local weather data for 2004-2005 and 2005-2006, the numerical model HYDRUS-1D was calibrated and validated by comparing simulated and measured root-zone soil water contents. The validated model was used to estimate the irrigation and discharge scheme for 2010-2090 under the scenarios described by HadCM3 (1970 to 2000 climate states are taken as baselines), with winter wheat growth in an optimum state indicated by growth height and LAI.

  15. Sex determination using discriminant function analysis in Indigenous (Kurubas) children and adolescents of Coorg, Karnataka, India: A lateral cephalometric study.

    PubMed

    Devang Divakar, Darshan; John, Jacob; Al Kheraif, Abdulaziz Abdullah; Mavinapalla, Seema; Ramakrishnaiah, Ravikumar; Vellappally, Sajith; Hashem, Mohamed Ibrahim; Dalati, M H N; Durgesh, B H; Safadi, Rima A; Anil, Sukumaran

    2016-11-01

    Aim: To test the validity of sex discrimination using lateral cephalometric radiographs and discriminant function analysis in Indigenous (Kuruba) children and adolescents of Coorg, Karnataka, India. Methods and materials: Six hundred and sixteen lateral cephalograms of 380 males and 236 females, aged 6.5 to 18 years, with normal occlusion, from the Indigenous population of Coorg, Karnataka, India known as Kurubas, were included in the study. Lateral cephalograms were obtained in a standard position with the teeth in centric occlusion and the lips relaxed. Each radiograph was traced and cephalometric landmarks were measured using a digital calliper. Twenty-four cephalometric measurements were calculated. Results: Males exhibited significantly greater mean angular and linear cephalometric measurements than females (p < 0.05) (Table 5). Significant differences (p < 0.05) were also observed in all the variables according to age (Table 6). Of the 24 variables, only ULTc predicted sex. The reliability of the derived discriminant function was assessed among the study subjects; 100% of males and females were classified correctly. Conclusion: The outcome of this study validates the existence of sexual dimorphism in the skeleton as early as 6.5 years of age. Further research is needed to determine other landmarks that can help in sex determination, and norms for the Indigenous (Kuruba) population and other Indigenous populations of Coorg, Karnataka, India.
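    When a single variable (here, ULTc) discriminates between two groups, the discriminant function reduces, under equal priors and (assumed) equal group variances, to a midpoint cutoff between the group means. A minimal sketch with hypothetical scores, not the study's data:

    ```python
    def discriminant_cutoff(male_scores, female_scores):
        """Univariate two-group discriminant: with equal priors and
        (assumed) equal variances, the decision boundary is simply the
        midpoint of the two group means."""
        mean_m = sum(male_scores) / len(male_scores)
        mean_f = sum(female_scores) / len(female_scores)
        return (mean_m + mean_f) / 2.0

    def classify(value, cutoff, male_above=True):
        """Assign a new measurement to a group relative to the cutoff."""
        if male_above:
            return "male" if value > cutoff else "female"
        return "female" if value > cutoff else "male"

    # Hypothetical measurements, for illustration only:
    cut = discriminant_cutoff([10.0, 12.0], [6.0, 8.0])
    ```

    The study's full analysis used 24 variables; this sketch only shows why one well-separated variable can suffice for classification.
    
    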

  16. SU-D-209-04: Raise Your Table: An Effective Way to Reduce Radiation Dose for Fluoroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huo, D; Hoerner, M; Toskich, B

    2016-06-15

    Purpose: Patient table height plays an important role in estimating patient skin dose for interventional radiology (IR) procedures, because the patient's skin location depends on the height of the table. Variation in table height can lead to as much as a 150% difference in skin dose for patient exams with similar air kerma meter readings. In our facility, IR procedural workflow was recently changed to require the IR physicians to confirm the patient table height before the procedure. Patient table height data were collected before and after this workflow change to validate the implementation of this practice. Methods: Table height information was analyzed for all procedures performed in three IR rooms, which were impacted by the workflow change, covering three months before and after the change (Aug 2015 to Jan 2016). In total, 442, 425, and 390 procedures were performed in these three rooms over this time period. There were no personnel or procedure assignment changes during the six-month period. Statistical analysis was performed on the average table height changes before and after the workflow change. Results: For the three IR rooms investigated, after the workflow change, the average table heights increased by 1.43 cm (p=0.004084), 0.66 cm (p=0.187089), and 1.59 cm (p=0.002193), providing corresponding estimated skin dose savings of 6.76%, 2.94% and 7.62%, respectively. After the workflow change, the average table height increased by 0.95 cm, 0.63 cm, 0.55 cm, 1.07 cm, 1.12 cm, and 3.36 cm for the six physicians who routinely work in these three rooms. Conclusion: Consistent improvement in table height settings has been observed for all IR rooms and all physicians following a simple workflow change. This change has led to significant patient dose savings by making physicians aware of the pre-procedure table position.

  17. Psychometric properties of a revised version of the Assisting Hand Assessment (Kids-AHA 5.0).

    PubMed

    Holmefur, Marie M; Krumlinde-Sundholm, Lena

    2016-06-01

    The aim of this study was to scrutinize the Assisting Hand Assessment (AHA) version 4.4 for possible improvements and to evaluate the psychometric properties regarding internal scale validity and aspects of reliability of a revised version of the AHA. In collaboration with experts, scoring criteria were changed for four items, and one fully new item was constructed. Twenty-two original, one new, and four revised items were scored for 164 assessments of children with unilateral cerebral palsy aged 18 months to 12 years. Rasch measurement analysis was used to evaluate internal scale validity by exploring rating-scale functioning, item and person goodness-of-fit, and principal component analysis. Targeting and scale reliability were also evaluated. After removal of misfitting items, a 20-item scale showed satisfactory goodness-of-fit. Unidimensionality was confirmed by principal component analysis. The rating scale functioned well for the 20 items, and the item difficulty was well suited to the ability level of the sample. The person reliability coefficient was 0.98, indicating high separation ability of the scale. A conversion table of AHA scores between the previous version (4.4) and the new version (5.0) was constructed. The new, 20-item version of the Kids-AHA (version 5.0), demonstrated excellent internal scale validity, suggesting improved responsiveness to changes and shortened scoring time. For comparison of scores from version 4.4 to 5.0, a transformation table is presented. © 2015 Mac Keith Press.

  18. Bitburg AB, Bitburg, Germany. Revised Uniform Summary of Surface Weather Observations (RUSSWO). Parts A-F

    DTIC Science & Technology

    1979-02-20

    for each month and annual (all months) and the total valid observation count. An asterisk (*) is printed in any year-month block when the extreme...annual (all months). An asterisk (*) is printed in each data block if one or more days are missing for the month. No occurrences for a month are indicated...in the same manner as in the extreme tables above. If a trace becomes the extreme or monthly total in any of these tables it is printed as...Continued on

  19. Validation Workshop of the DRDC Concept Map Knowledge Model: Issues in Intelligence Analysis

    DTIC Science & Technology

    2010-06-29

    group noted problems with grammar, and a more standard approach to the grammar of the linking term (e.g. use only active tense) would certainly have...Knowledge Model is distinct from a Concept Map. A Concept Map is a single map, probably presented in one view, while a Knowledge Model is a set of...Agenda The workshop followed the agenda presented in Table 2-3. Table 2-3: Workshop Agenda Time Title 13:00 – 13:15 Registration 13:15 – 13:45

  20. Elastic, Cottage Cheese, and Gasoline: Visualizing Division of Fractions

    ERIC Educational Resources Information Center

    Peck, Sallie; Wood, Japheth

    2008-01-01

    Teachers must be prepared to recognize valid alternative representations of arithmetic problems. Challenging examples involving mixed fractions and division are presented along with teacher's discussion from a professional development workshop. (Contains 6 figures and 1 table.)

  1. Can Perceptuo-Motor Skills Assessment Outcomes in Young Table Tennis Players (7–11 years) Predict Future Competition Participation and Performance? An Observational Prospective Study

    PubMed Central

    2016-01-01

    Forecasting future performance in youth table tennis players based on current performance is complex due to, among other things, differences between youth players in growth, development, maturity, context and table tennis experience. Talent development programmes might benefit from an assessment of underlying perceptuo-motor skills for table tennis, which is hypothesized to determine the players’ potential concerning the perceptuo-motor domain. The Dutch perceptuo-motor skills assessment intends to measure the perceptuo-motor potential for table tennis in youth players by assessing the underlying skills crucial for developing technical and tactical qualities. Untrained perceptuo-motor tasks are used as these are suggested to represent a player’s future potential better than specific sport skills themselves, as the latter depend on exposure to the sport itself. This study evaluated the value of the perceptuo-motor skills assessment for a talent development programme by evaluating its predictive validity for competition participation and performance in 48 young table tennis players (7–11 years). Players were tested on their perceptuo-motor skills once during a regional talent day, and the subsequent competition results were recorded half-yearly over a period of 2.5 years. Logistic regression analysis showed that test scores did not predict future competition participation (p > 0.05). Yet, the Generalized Estimating Equations analysis, including the test items ‘aiming at target’, ‘throwing a ball’, and ‘eye-hand coordination’ in the best fitting model, revealed that the outcomes of the perceptuo-motor skills assessment were significant predictors of future competition results (R2 = 51%). Since test age influences the perceptuo-motor skills assessment’s outcome, another multivariable model was proposed including test age as a covariate (R2 = 53%). 
This evaluation demonstrates promising prospects for the perceptuo-motor skills assessment to be included in a talent development programme. Future studies are needed to clarify the predictive value in a larger sample of youth competition players over a longer period in time. PMID:26863212

  2. STILTS -- Starlink Tables Infrastructure Library Tool Set

    NASA Astrophysics Data System (ADS)

    Taylor, Mark

    STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.

  3. Effects of shallow water table, salinity and frequency of irrigation water on the date palm water use

    NASA Astrophysics Data System (ADS)

    Askri, Brahim; Ahmed, Abdelkader T.; Abichou, Tarek; Bouhlila, Rachida

    2014-05-01

    In southern Tunisia oases, waterlogging, salinity, and water shortage represent serious threats to the sustainability of irrigated agriculture. Understanding the interaction between these problems and their effects on root water uptake is fundamental for suggesting possible options of improving land and water productivity. In this study, HYDRUS-1D model was used in a plot of farmland located in the Fatnassa oasis to investigate the effects of waterlogging, salinity, and water shortage on the date palm water use. The model was calibrated and validated using experimental data of sap flow density of a date palm, soil hydraulic properties, water table depth, and amount of irrigation water. The comparison between predicted and observed data for date palm transpiration rates was acceptable indicating that the model could well estimate water consumption of this tree crop. Scenario simulations were performed with different water table depths, and salinities and frequencies of irrigation water. The results show that the impacts of water table depth and irrigation frequency vary according to the season. In summer, high irrigation frequency and shallow groundwater are needed to maintain high water content and low salinity of the root-zone and therefore to increase the date palm transpiration rates. However, these factors have no significant effect in winter. The results also reveal that irrigation water salinity has no significant effect under shallow saline groundwater.

  4. Nurse Education, Center of Excellence for Remote and Medically Under-Served Areas (CERMUSA)

    DTIC Science & Technology

    2014-04-01

    didactic portion o Online pre-test and post-test o Online survey • Statistical validity: o The study investigators enrolled 134 participants...NURSE.SAS Datacut: 2014-01-17 Generated: 2014-01-26:16:40 Table 2.09: Summary Statistics, Does your curriculum teach students how to develop a personal...disasters. Pre-test post-test results indicated that the delivery of didactic material via an online course management system is an effective

  5. Nuclear event time histories and computed site transfer functions for locations in the Los Angeles region

    USGS Publications Warehouse

    Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.

    1980-01-01

    This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5, the station coordinates and abbreviations are given in table 1, the station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.

  6. Psychometric properties for the Balanced Inventory of Desirable Responding: dichotomous versus polytomous conventional and IRT scoring.

    PubMed

    Vispoel, Walter P; Kim, Han Yi

    2014-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 26(3) of Psychological Assessment (see record 2014-16017-001). The mean, standard deviation and alpha coefficient originally reported in Table 1 should be 74.317, 10.214 and .802, respectively. The validity coefficients in the last column of Table 4 are affected as well. Correcting this error did not change the substantive interpretations of the results, but did increase the mean, standard deviation, alpha coefficient, and validity coefficients reported for the Honesty subscale in the text and in Tables 1 and 4. The corrected versions of Tables 1 and Table 4 are shown in the erratum.] Item response theory (IRT) models were applied to dichotomous and polytomous scoring of the Self-Deceptive Enhancement and Impression Management subscales of the Balanced Inventory of Desirable Responding (Paulhus, 1991, 1999). Two dichotomous scoring methods reflecting exaggerated endorsement and exaggerated denial of socially desirable behaviors were examined. The 1- and 2-parameter logistic models (1PLM, 2PLM, respectively) were applied to dichotomous responses, and the partial credit model (PCM) and graded response model (GRM) were applied to polytomous responses. For both subscales, the 2PLM fit dichotomous responses better than did the 1PLM, and the GRM fit polytomous responses better than did the PCM. Polytomous GRM and raw scores for both subscales yielded higher test-retest and convergent validity coefficients than did PCM, 1PLM, 2PLM, and dichotomous raw scores. Information plots showed that the GRM provided consistently high measurement precision that was superior to that of all other IRT models over the full range of both construct continuums. Dichotomous scores reflecting exaggerated endorsement of socially desirable behaviors provided noticeably weak precision at low levels of the construct continuums, calling into question the use of such scores for detecting instances of "faking bad." 
Dichotomous models reflecting exaggerated denial of the same behaviors yielded much better precision at low levels of the constructs, but it was still less precision than that of the GRM. These results support polytomous over dichotomous scoring in general, alternative dichotomous scoring for detecting faking bad, and extension of GRM scoring to situations in which IRT offers additional practical advantages over classical test theory (adaptive testing, equating, linking, scaling, detecting differential item functioning, and so forth). PsycINFO Database Record (c) 2014 APA, all rights reserved.
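    The logistic models named above have simple closed forms; for instance the 2PLM item response function, of which the 1PLM is the equal-discrimination special case. A minimal sketch of the standard formula, not the authors' code:

    ```python
    import math

    def p_endorse_2pl(theta, a, b):
        """2PLM: probability that a person at trait level theta endorses an
        item with discrimination a and difficulty b. Setting a = 1 for
        every item gives the 1PLM (Rasch-type) special case."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))
    ```

    At theta = b the probability is exactly 0.5, and larger a makes the curve steeper around b, which is what lets the 2PLM fit items with differing discriminations better than the 1PLM.
    
    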

  7. Empirical correlates for the Minnesota Multiphasic Personality Inventory-2-Restructured Form in a German inpatient sample.

    PubMed

    Moultrie, Josefine K; Engel, Rolf R

    2017-10-01

    We identified empirical correlates for the 42 substantive scales of the German language version of the Minnesota Multiphasic Personality Inventory (MMPI)-2-Restructured Form (MMPI-2-RF): Higher Order, Restructured Clinical, Specific Problem, Interest, and revised Personality Psychopathology Five scales. We collected external validity data by means of a 177-item chart review form in a sample of 488 psychiatric inpatients of a German university hospital. We structured our findings along the interpretational guidelines for the MMPI-2-RF and compared them with the validity data published in the tables of the MMPI-2-RF Technical Manual. Our results show significant correlations between MMPI-2-RF scales and conceptually relevant criteria. Most of the results were in line with U.S. validation studies. Some of the differences could be attributed to sample compositions. For most of the scales, construct validity coefficients were acceptable. Taken together, this study amplifies the enlarging body of research on empirical correlates of the MMPI-2-RF scales in a new sample. The study suggests that the interpretations given in the MMPI-2-RF manual may be generalizable to the German language MMPI-2-RF. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Step 4: Get Routine Care to Avoid Problems | NIH MedlinePlus the Magazine

    MedlinePlus

    ... Fall 2014 Table of Contents Accelerating Medicines Partnership (AMP—Part 3 of 4) Type 2 Diabetes The ... organizations have together created the Accelerating Medicines Partnership (AMP) to develop new models for identifying and validating ...

  9. Strengths and Difficulties Questionnaire: internal validity and reliability for New Zealand preschoolers.

    PubMed

    Kersten, Paula; Vandal, Alain C; Elder, Hinemoa; McPherson, Kathryn M

    2018-04-21

    This observational study examines the internal construct validity, internal consistency and cross-informant reliability of the Strengths and Difficulties Questionnaire (SDQ) in a New Zealand preschool population across four ethnicity strata (New Zealand European, Māori, Pasifika, Asian). Rasch analysis was employed to examine internal validity on a subsample of 1000 children. Internal consistency (n=29 075) and cross-informant reliability (n=17 006) were examined using correlations, intraclass correlation coefficients and Cronbach's alpha on the sample available for such analyses. Data were used from a national SDQ database provided by the funder, pertaining to New Zealand domiciled children aged 4 and 5 and scored by their parents and teachers. The five subscales do not fit the Rasch model (as indicated by the overall fit statistics), contain items that are biased (differential item functioning (DIF)) by key variables, suffer from a floor and ceiling effect and have unacceptable internal consistency. After dealing with DIF, the Total Difficulty scale does fit the Rasch model and has good internal consistency. Parent/teacher inter-rater reliability was unacceptably low for all subscales. The five SDQ subscales are not valid and not suitable for use in their own right in New Zealand. We have provided a conversion table for the Total Difficulty scale, which takes account of bias by ethnic group. Clinicians should use this conversion table in order to reconcile DIF by culture in final scores. It is advisable to use both parents and teachers' feedback when considering children's needs for referral of further assessment. Future work should examine whether validity is impacted by different language versions used in the same country. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Lookup Tables Versus Stacked Rasch Analysis in Comparing Pre- and Postintervention Adult Strabismus-20 Data.

    PubMed

    Leske, David A; Hatt, Sarah R; Liebermann, Laura; Holmes, Jonathan M

    2016-02-01

    We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as "success," "partial success," or "failure" based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis ( P < 0.0001 for all comparisons). The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software.
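As a concrete illustration of how a lookup table lets a clinician score a Rasch-calibrated questionnaire without specialized software, the sketch below maps a raw sum score to a person measure. The item count and threshold values here are invented for illustration; they are not the published AS-20 calibration.

```python
# Sketch: scoring a Rasch-calibrated instrument from a lookup table.
# The raw-score -> measure values below are ILLUSTRATIVE ONLY, not AS-20 thresholds.
LOOKUP = {0: 0.0, 1: 12.5, 2: 21.0, 3: 27.4, 4: 33.1, 5: 38.6}  # raw sum -> measure

def rasch_score(item_responses):
    """Convert ordinal item responses to a person measure via the lookup table."""
    raw = sum(item_responses)
    return LOOKUP[raw]

# Hypothetical pre/post responses; change is computed on the interval-level measures.
pre = rasch_score([0, 1, 0])
post = rasch_score([2, 2, 1])
change = post - pre
```

Because the lookup is a fixed calibration, every respondent with the same raw score receives the same measure, which is exactly what makes pre/post comparisons reproducible across sites.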

  11. Lookup Tables Versus Stacked Rasch Analysis in Comparing Pre- and Postintervention Adult Strabismus-20 Data

    PubMed Central

    Leske, David A.; Hatt, Sarah R.; Liebermann, Laura; Holmes, Jonathan M.

    2016-01-01

    Purpose We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). Methods One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as “success,” “partial success,” or “failure” based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Results Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis (P < 0.0001 for all comparisons). Conclusions The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. Translational Relevance We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software. PMID:26933524

  12. Kepler Data Validation Time Series File: Description of File Format and Content

    NASA Technical Reports Server (NTRS)

    Mullally, Susan E.

    2016-01-01

The Kepler space mission searches its time series data for periodic, transit-like signatures. The ephemerides of these events, called Threshold Crossing Events (TCEs), are reported in the TCE tables at the NASA Exoplanet Archive (NExScI). Those TCEs are then further evaluated to create planet candidates and populate the Kepler Objects of Interest (KOI) table, also hosted at the Exoplanet Archive. The search, evaluation and export of TCEs are performed by two pipeline modules, TPS (Transit Planet Search) and DV (Data Validation). TPS searches for the strongest believable signal and then sends that information to DV to fit a transit model, compute various statistics, and remove the transit events so that the light curve can be searched for other TCEs. More on how this search is done and on the creation of the TCE table can be found in Tenenbaum et al. (2012), Seader et al. (2015), and Jenkins (2002). For each star with at least one TCE, the pipeline exports a file that contains the light curves used by TPS and DV to find and evaluate the TCE(s). This document describes the content of these DV time series files, and this introduction provides context for how the data in these files are used by the pipeline.
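The "remove the transit events and re-search" step described above can be sketched as follows: given a TCE ephemeris (period, epoch and duration, all in days), mask the in-transit cadences so the residual light curve can be searched for additional TCEs. This is an assumed simplification for illustration, not the actual TPS/DV pipeline code.

```python
# Sketch: gap out in-transit cadences given a TCE ephemeris (period, epoch,
# duration in days), so the residual light curve can be re-searched.
def mask_transits(times, fluxes, period, epoch, duration):
    """Return the light curve with in-transit points removed."""
    half = duration / 2.0
    kept_t, kept_f = [], []
    for t, f in zip(times, fluxes):
        # Phase-fold relative to the epoch; phase lies in [-period/2, period/2).
        phase = (t - epoch + period / 2.0) % period - period / 2.0
        if abs(phase) > half:  # keep only out-of-transit cadences
            kept_t.append(t)
            kept_f.append(f)
    return kept_t, kept_f
```

Iterating this masking after each detection is what allows a single star to accumulate several TCEs in the archive tables.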

  13. NCEL (Naval Civil Engineering Laboratory) Quarterly Abstracts of Technical Documents, 1 April to 30 June 1987.

    DTIC Science & Technology

    1987-06-30

Approved for public release; distribution unlimited. TABLE OF CONTENTS, TECHNICAL NOTES: N-1764, Validation of Nitronic 33 in Reinforced and Prestressed Concrete, Apr 1987, James F. Jenkins (public release). ...prestressing strand are not acceptable. Before Nitronic 33 stainless steel prestressed concrete waterfront structures were constructed, it was necessary to...

  14. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2014-05-20

    but there can still be many recommendations generated. Therefore, the recommender results are displayed in a sortable table where each row is a...reporting period. Since the synthesis graph can be complex and have many dependencies, the system must determine the order of evaluation of nodes, and...validation failure, if any. 3.1. Automatic Feature Extraction In many domains, causal models can often be more readily described as patterns of

  15. Experiments and Simulations of Exploding Aluminum Wires: Validation of ALEGRA-MHD

    DTIC Science & Technology

    2010-09-01

    ii REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 Public reporting burden for this collection of information is estimated to average 1 ...currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS. 1 . REPORT DATE (DD-MM-YYYY) September 2010 2. REPORT TYPE Final...List of Tables vi Acknowledgements vii 1 . Introduction 1 2. Experimental Setup 2 3. Computational Setup 5 3.1 Description of ALEGRA

  16. Toward a periodic table of personality: Mapping personality scales between the five-factor model and the circumplex model.

    PubMed

    Woods, Stephen A; Anderson, Neil R

    2016-04-01

In this study, we examine the structures of 10 personality inventories (PIs) widely used for personnel assessment by mapping the scales of the PIs to the lexical Big Five circumplex model, resulting in a Periodic Table of Personality. Correlations between 273 scales from 10 internationally popular PIs and independent markers of the lexical Big Five are reported, based on data from samples in 2 countries (United Kingdom, N = 286; United States, N = 1,046), permitting us to map these scales onto the Abridged Big Five Dimensional Circumplex model (Hofstee, de Raad, & Goldberg, 1992). Emerging from our findings, we propose a common facet framework derived from the scales of the PIs in our study. These results provide important insights into the literature on criterion-related validity of personality traits, and enable researchers and practitioners to understand how different PI scales converge and diverge and how compound PI scales may be constructed or replicated. Implications for research and practice are considered. (c) 2016 APA, all rights reserved.

  17. An empirical method for approximating stream baseflow time series using groundwater table fluctuations

    NASA Astrophysics Data System (ADS)

    Meshgi, Ali; Schmitter, Petra; Babovic, Vladan; Chui, Ting Fong May

    2014-11-01

    Developing reliable methods to estimate stream baseflow has been a subject of interest due to its importance in catchment response and sustainable watershed management. However, to date, in the absence of complex numerical models, baseflow is most commonly estimated using statistically derived empirical approaches that do not directly incorporate physically-meaningful information. On the other hand, Artificial Intelligence (AI) tools such as Genetic Programming (GP) offer unique capabilities to reduce the complexities of hydrological systems without losing relevant physical information. This study presents a simple-to-use empirical equation to estimate baseflow time series using GP so that minimal data is required and physical information is preserved. A groundwater numerical model was first adopted to simulate baseflow for a small semi-urban catchment (0.043 km2) located in Singapore. GP was then used to derive an empirical equation relating baseflow time series to time series of groundwater table fluctuations, which are relatively easily measured and are physically related to baseflow generation. The equation was then generalized for approximating baseflow in other catchments and validated for a larger vegetation-dominated basin located in the US (24 km2). Overall, this study used GP to propose a simple-to-use equation to predict baseflow time series based on only three parameters: minimum daily baseflow of the entire period, area of the catchment and groundwater table fluctuations. It serves as an alternative approach for baseflow estimation in un-gauged systems when only groundwater table and soil information is available, and is thus complementary to other methods that require discharge measurements.
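The abstract above names the three inputs of the GP-derived estimator (minimum daily baseflow over the record, catchment area, and groundwater table fluctuation) but not the equation itself. The functional form below is therefore purely hypothetical, shown only to illustrate how such an empirical baseflow estimator would be applied to a water-table time series.

```python
# HYPOTHETICAL empirical baseflow estimator; the true GP-derived equation is not
# reproduced in the abstract. Inputs: minimum daily baseflow qb_min (m3/s),
# catchment area (km2), and groundwater table fluctuation dh (m). The coefficient
# k is an assumed calibration constant.
def baseflow_estimate(qb_min, area_km2, dh_m, k=0.8):
    """Baseline flow plus a water-table-fluctuation term (illustrative form)."""
    return qb_min + k * area_km2 * dh_m

# Apply to a (hypothetical) daily series of water table fluctuations.
series = [baseflow_estimate(0.02, 0.043, dh) for dh in (0.0, 0.1, 0.2)]
```

The appeal of such a form for ungauged systems is visible here: once calibrated, it needs only groundwater-table observations, not discharge measurements.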

  18. Estimation of skull table thickness with clinical CT and validation with microCT.

    PubMed

    Lillie, Elizabeth M; Urban, Jillian E; Weaver, Ashley A; Powers, Alexander K; Stitzel, Joel D

    2015-01-01

Brain injuries resulting from motor vehicle crashes (MVC) are extremely common, yet the mechanisms of injury remain poorly characterized. Skull deformation is believed to be a contributing factor in some types of traumatic brain injury (TBI). Understanding biomechanical contributors to skull deformation would provide further insight into the mechanism of head injury resulting from blunt trauma. In particular, skull thickness is thought to be a very important factor governing deformation of the skull and its propensity for fracture. Current computed tomography (CT) technology is limited in its ability to accurately measure cortical thickness using standard techniques. A method to evaluate cortical thickness using cortical density measured from CT data has been developed previously. This effort validates that technique for measurement of skull table thickness in clinical head CT scans using two postmortem human specimens. Bone samples were harvested from the skulls of two cadavers and scanned with microCT to evaluate the accuracy of the cortical thickness estimated from clinical CT. Clinical scans were collected at 0.488 and 0.625 mm in-plane resolution with 0.625 mm slice thickness. The overall cortical thickness error was determined to be 0.078 ± 0.58 mm for cortical samples thinner than 4 mm, and 91.3% of these differences fell within the scanner resolution. Color maps of clinical CT thickness estimations are comparable to color maps of microCT thickness measurements, indicating good quantitative agreement. These data confirm that the cortical density algorithm successfully estimates skull table thickness from clinical CT scans. The application of this technique to clinical CT scans enables evaluation of cortical thickness in population-based studies. © 2014 Anatomical Society.

  19. Evaluation of a distributed catchment scale water balance model

    NASA Technical Reports Server (NTRS)

    Troch, Peter A.; Mancini, Marco; Paniconi, Claudio; Wood, Eric F.

    1993-01-01

The validity of some of the simplifying assumptions in a conceptual water balance model is investigated by comparing simulation results from the conceptual model with simulation results from a three-dimensional physically based numerical model and with field observations. We examine, in particular, assumptions and simplifications related to water table dynamics, vertical soil moisture and pressure head distributions, and subsurface flow contributions to stream discharge. The conceptual model relies on a topographic index to predict saturation excess runoff and on Philip's infiltration equation to predict infiltration excess runoff. The numerical model solves the three-dimensional Richards equation describing flow in variably saturated porous media, and handles seepage face boundaries, infiltration excess and saturation excess runoff production, and soil-driven and atmosphere-driven surface fluxes. The study catchments (a 7.2 sq km catchment and a 0.64 sq km subcatchment) are located in the North Appalachian ridge and valley region of eastern Pennsylvania. Hydrologic data collected during the MACHYDRO 90 field experiment are used to calibrate the models and to evaluate simulation results. Water table dynamics predicted by the conceptual model are close to observations in a shallow water well; a linear relationship between a topographic index and the local water table depth is therefore a reasonable assumption for catchment-scale modeling. However, the hydraulic equilibrium assumption is not valid for the upper 100 cm layer of the unsaturated zone, and a conceptual model that incorporates a root zone is suggested. Furthermore, theoretical subsurface flow characteristics from the conceptual model are found to differ from field observations, numerical simulation results, and theoretical baseflow recession characteristics based on Boussinesq's groundwater equation.

  20. Assessment of the MPACT Resonance Data Generation Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Williams, Mark L.

Currently, heterogeneous models are used to generate resonance self-shielded cross-section tables as a function of background cross section for important nuclides such as 235U and 238U, by performing the CENTRM (Continuous Energy Transport Model) slowing-down calculation with the MOC (Method of Characteristics) spatial discretization and ESSM (Embedded Self-Shielding Method) calculations to obtain background cross sections. The resonance self-shielded cross-section tables are then converted into subgroup data, which are used to estimate problem-dependent self-shielded cross sections in MPACT (Michigan Parallel Characteristics Transport Code). Although this procedure has been developed, and the resulting resonance data have been generated and validated by benchmark calculations, no assessment has been performed to verify that the resonance data are properly generated by the procedure and utilized in MPACT. This study focuses on assessing the procedure and its proper use in MPACT.

  1. Identification of sea ice types in spaceborne synthetic aperture radar data

    NASA Technical Reports Server (NTRS)

    Kwok, Ronald; Rignot, Eric; Holt, Benjamin; Onstott, R.

    1992-01-01

    This study presents an approach for identification of sea ice types in spaceborne SAR image data. The unsupervised classification approach involves cluster analysis for segmentation of the image data followed by cluster labeling based on previously defined look-up tables containing the expected backscatter signatures of different ice types measured by a land-based scatterometer. Extensive scatterometer observations and experience accumulated in field campaigns during the last 10 yr were used to construct these look-up tables. The classification approach, its expected performance, the dependence of this performance on radar system performance, and expected ice scattering characteristics are discussed. Results using both aircraft and simulated ERS-1 SAR data are presented and compared to limited field ice property measurements and coincident passive microwave imagery. The importance of an integrated postlaunch program for the validation and improvement of this approach is discussed.
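The cluster-labeling step described above, matching each image cluster against a lookup table of expected backscatter signatures, can be sketched as a nearest-signature assignment. The signature values below are invented placeholders, not the scatterometer-derived tables the study used.

```python
# Sketch: label an image cluster with the ice type whose expected backscatter
# signature (dB) is closest to the cluster's mean backscatter.
# Signature values are INVENTED for illustration; real tables come from
# land-based scatterometer campaigns.
SIGNATURES_DB = {"multiyear ice": -9.0, "first-year ice": -15.0, "open water": -22.0}

def label_cluster(mean_backscatter_db):
    """Assign the ice type with the nearest expected signature."""
    return min(SIGNATURES_DB,
               key=lambda ice: abs(SIGNATURES_DB[ice] - mean_backscatter_db))
```

Because labeling is a simple nearest-neighbor match, classification quality hinges on radiometric calibration of the SAR data, which is why the abstract stresses radar system performance.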

  2. Testate amoeba transfer function performance along localised hydrological gradients.

    PubMed

    Tsyganov, Andrey N; Mityaeva, Olga A; Mazei, Yuri A; Payne, Richard J

    2016-09-01

    Testate amoeba transfer functions are widely used for reconstruction of palaeo-hydrological regime in peatlands. However, the limitations of this approach have become apparent with increasing attention to validation and assessing sources of uncertainty. This paper investigates effects of peatland type and sampling depth on the performance of a transfer function using an independent test-set from four Sphagnum-dominated sites in European Russia (Penza Region). We focus on transfer function performance along localised hydrological gradients, which is a useful analogue for predictive ability through time. The performance of the transfer function with the independent test-set was generally weaker than for the leave-one-out or bootstrap cross-validations. However, the transfer function was robust for the reconstruction of relative changes in water-table depth, provided the presence of good modern analogues and overlap in water-table depth ranges. When applied to subsurface samples, the performance of the transfer function was reduced due to selective decomposition, the presence of deep-dwelling taxa or vertical transfer of shells. Our results stress the importance of thorough testing of transfer functions, and highlight the role of taphonomic processes in determining results. Further studies of stratification, taxonomy and taphonomy of testate amoebae will be needed to improve the robustness of transfer function output. Copyright © 2015 Elsevier GmbH. All rights reserved.
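A common transfer-function approach of the kind evaluated above is weighted averaging (WA): each taxon's water-table-depth optimum is the abundance-weighted mean depth over the modern training set, and a sample's reconstruction is the abundance-weighted mean of the optima of the taxa it contains. The minimal sketch below uses invented training data; it is not the specific model of this study.

```python
# Sketch of a weighted-averaging (WA) transfer function; training data invented.
def wa_optima(training_counts, depths):
    """training_counts: one {taxon: abundance} dict per sample; depths: WTD per sample."""
    num, den = {}, {}
    for counts, d in zip(training_counts, depths):
        for taxon, y in counts.items():
            num[taxon] = num.get(taxon, 0.0) + y * d
            den[taxon] = den.get(taxon, 0.0) + y
    return {t: num[t] / den[t] for t in num if den[t] > 0}

def wa_reconstruct(sample_counts, optima):
    """Reconstructed water-table depth for one (fossil) sample."""
    num = sum(y * optima[t] for t, y in sample_counts.items() if t in optima)
    den = sum(y for t, y in sample_counts.items() if t in optima)
    return num / den

optima = wa_optima([{"A": 10}, {"B": 10}], [5.0, 25.0])
wtd = wa_reconstruct({"A": 5, "B": 5}, optima)  # halfway between the two optima
```

The "good modern analogues" caveat in the abstract maps directly onto the `if t in optima` filter: taxa absent from the training set contribute nothing, so reconstructions degrade as fossil assemblages drift away from the modern calibration data.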

  3. Linguistic validation of stigmatisation degree, self-esteem and knowledge questionnaire among asthma patients using Rasch analysis.

    PubMed

    Ahmad, Sohail; Ismail, Ahmad Izuanuddin; Khan, Tahir Mehmood; Akram, Waqas; Mohd Zim, Mohd Arif; Ismail, Nahlah Elkudssiah

    2017-04-01

The stigmatisation degree, self-esteem and knowledge directly or indirectly influence the control and self-management of asthma. To date, there is no valid and reliable instrument that can assess these key issues collectively. The main aim of this study was to test the reliability and validity of the newly devised and translated "Stigmatisation Degree, Self-Esteem and Knowledge Questionnaire" among adult asthma patients using the Rasch measurement model. This cross-sectional study recruited thirty adult asthma patients from two respiratory specialist clinics in Selangor, Malaysia. The newly devised self-administered questionnaire was adapted from relevant publications and translated into the Malay language using international standard translation guidelines. Content and face validation were performed. The data were extracted and analysed for real item reliability and construct validation using the Rasch model. The translated "Stigmatisation Degree, Self-Esteem and Knowledge Questionnaire" showed high real item reliability values of 0.90, 0.86 and 0.89 for stigmatisation degree, self-esteem, and knowledge of asthma, respectively. Furthermore, all values of the point measure correlation (PTMEA Corr) analysis were within the acceptable range specified by the Rasch model. Infit/outfit mean square values and Z standard (ZSTD) values of each item verified the construct validity and suggested retaining all the items in the questionnaire. The reliability analyses and the output tables of item measures for construct validation confirmed the translated Malaysian version of the "Stigmatisation Degree, Self-Esteem and Knowledge Questionnaire" to be a valid and highly reliable instrument.

  4. Use of the ETA-1 reactor for the validation of the multi-group APOLLO2-MORET 5 code and the Monte Carlo continuous energy MORET 5 code

    NASA Astrophysics Data System (ADS)

    Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.

    2014-06-01

The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water, and also to test the recent implementation of probability tables in the MORET 5 code.

  5. Detailed Validation of the Bidirectional Effect in Various Case 1 and Case 2 Waters

    DTIC Science & Technology

    2012-03-26

    of the viewing direction, i.e., they assumed a completely diffuse BRDF . Previous efforts to model / understand the actual BRDF [4-10] have produced...places. Second, the MAG2002 BRDF tables were developed from a radiative transfer (RT) model that used scattering particle phase functions that...situ measurements from just 3 locations to validate their model ; here we used a much larger data set across a wide variety of inherent optical

  6. Controlled laboratory experiments and modeling of vegetative filter strips with shallow water tables

    NASA Astrophysics Data System (ADS)

    Fox, Garey A.; Muñoz-Carpena, Rafael; Purvis, Rebecca A.

    2018-01-01

    Natural or planted vegetation at the edge of fields or adjacent to streams, also known as vegetative filter strips (VFS), are commonly used as an environmental mitigation practice for runoff pollution and agrochemical spray drift. The VFS position in lowlands near water bodies often implies the presence of a seasonal shallow water table (WT). In spite of its potential importance, there is limited experimental work that systematically studies the effect of shallow WTs on VFS efficacy. Previous research recently coupled a new physically based algorithm describing infiltration into soils bounded by a water table into the VFS numerical overland flow and transport model, VFSMOD, to simulate VFS dynamics under shallow WT conditions. In this study, we tested the performance of the model against laboratory mesoscale data under controlled conditions. A laboratory soil box (1.0 m wide, 2.0 m long, and 0.7 m deep) was used to simulate a VFS and quantify the influence of shallow WTs on runoff. Experiments included planted Bermuda grass on repacked silt loam and sandy loam soils. A series of experiments were performed including a free drainage case (no WT) and a static shallow water table (0.3-0.4 m below ground surface). For each soil type, this research first calibrated VFSMOD to the observed outflow hydrograph for the free drainage experiments to parameterize the soil hydraulic and vegetation parameters, and then evaluated the model based on outflow hydrographs for the shallow WT experiments. This research used several statistical metrics and a new approach based on hypothesis testing of the Nash-Sutcliffe model efficiency coefficient (NSE) to evaluate model performance. The new VFSMOD routines successfully simulated the outflow hydrographs under both free drainage and shallow WT conditions. Statistical metrics considered the model performance valid with greater than 99.5% probability across all scenarios. 
This research also simulated the shallow water table experiments with both free drainage and various water table depths to quantify the effect of assuming the former boundary condition. For these two soil types, shallow WTs within 1.0-1.2 m of the soil surface influenced infiltration. If shallow water table conditions are not considered, existing models will predict a more protective vegetative filter strip than actually exists.
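The model-evaluation statistic named above, the Nash-Sutcliffe model efficiency coefficient (NSE), compares a simulated hydrograph against observations: NSE = 1 is a perfect fit, and NSE ≤ 0 means the model predicts no better than the observed mean. A minimal implementation:

```python
# Nash-Sutcliffe efficiency: 1 - (residual sum of squares) / (variance of observations
# about their mean). Values near 1 indicate good hydrograph agreement.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

The hypothesis-testing approach mentioned in the abstract builds on this quantity by asking whether the computed NSE exceeds a threshold with stated probability, rather than judging a single point value.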

  7. Assessing Forest Carbon Response to Climate Change and Disturbances Using Long-term Hydro-climatic Observations and Simulations

    NASA Astrophysics Data System (ADS)

    Trettin, C.; Dai, Z.; Amatya, D. M.

    2014-12-01

Long-term climatic and hydrologic observations on the Santee Experimental Forest in the lower coastal plain of South Carolina were used to estimate long-term changes in hydrology and forest carbon dynamics for a pair of first-order watersheds. Over 70 years of climate data indicated that warming in this forest area in recent decades was faster than the global mean; 35+ years of hydrologic records showed that forest ecosystem succession in the three years following Hurricane Hugo caused a substantial change in the ratio of runoff to precipitation. The change in this relationship between the paired watersheds was attributed to altered evapotranspiration processes caused by the greater abundance of pine in the treatment watershed and regeneration of the mixed hardwood-pine forest on the reference watershed. The long-term records and anomalous observations are highly valuable for reliable calibration and validation of hydrological and biogeochemical models capturing the effects of climate variability. We applied the hydrological model MIKE SHE, which showed that runoff and water table level are sensitive to global warming, and that sustained warming trends can be expected to decrease stream discharge and lower the mean water table depth. The spatially explicit biogeochemical model Forest-DNDC, validated using biomass measurements from the watersheds, was used to assess carbon dynamics in response to high-resolution hydrologic observation data and simulation results. The simulations showed that long-term spatiotemporal carbon dynamics, including biomass and fluxes of soil carbon dioxide and methane, were highly regulated by disturbance regimes, climatic conditions and water table depth. The utility of the linked-modeling framework demonstrated here for assessing biogeochemical responses at the watershed scale suggests applications for assessing the consequences of climate change within an urbanizing forested landscape. 
The approach may also be applicable for validating large-scale models.

  8. Modeling water table dynamics in managed and restored peatlands

    NASA Astrophysics Data System (ADS)

    Cresto Aleina, Fabio; Rasche, Livia; Hermans, Renée; Subke, Jens-Arne; Schneider, Uwe; Brovkin, Victor

    2016-04-01

European peatlands have been extensively managed over past centuries. Typical management activities consisted of drainage and afforestation, which led to considerable damage to the peat and potentially significant carbon loss. Recent efforts to restore previously managed peatlands have been carried out throughout Europe. These restoration efforts have direct implications for water table depth and greenhouse gas emissions, and thus for the ecosystem services provided by peatland areas. In order to quantify the impact of peatland restoration on water table depth and the greenhouse gas budget, we coupled the Environmental Policy Integrated Climate (EPIC) model to a process-based model for methane emissions (Walter and Heimann, 2000). The new model (EPIC-M) can potentially be applied at the European and even the global scale, but it has yet to be tested and evaluated. We present results of this new tool from different peatlands in the Flow Country, Scotland. Large parts of the peatlands of the region were drained and afforested during the 1980s, but since the late 1990s programs to restore peatlands in the Flow Country have been implemented. This region therefore offers a range of peatlands, from near-pristine to afforested and drained, with different restoration ages in between, where we can apply the EPIC-M model and validate it against experimental data from all stages of restoration. The goals of this study are, first, to evaluate the EPIC-M model and its performance against in situ measurements of methane emissions and water table changes in drained and restored peatlands. Secondly, our purpose is to study the environmental impact of peatland restoration, including methane emissions due to the rewetting of drained surfaces. To do so, we forced the EPIC-M model with local meteorological and soil data, and simulated soil temperatures, water table dynamics, and greenhouse gas emissions. 
This is the first step towards a European-wide application of the EPIC-M model for the assessment of the environmental impact of peatland restoration.

  9. Stochastic analysis of unsaturated steady flows above the water table

    NASA Astrophysics Data System (ADS)

    Severino, Gerardo; Scarfato, Maddalena; Comegna, Alessandro

    2017-08-01

Steady flow takes place in a three-dimensional partially saturated porous medium where, due to their spatial variability, the saturated conductivity Ks and the relative conductivity Kr are modeled as random space functions (RSFs). As a consequence, the flow variables (FVs), i.e., pressure head and specific flux, are also RSFs. The focus of the present paper is to quantify the uncertainty of the FVs above the water table. The simple expressions (most of them in closed form) of the second-order moments pertaining to the FVs allow one to follow the transitional behavior from the zone close to the water table (where the FVs are nonstationary) to their far-field limit (where the FVs become stationary RSFs). In particular, it is shown how the stationary limits (and the distance from the water table at which stationarity is attained) depend upon the statistical structure of the RSFs Ks and Kr and the infiltrating rate. The mean pressure head ⟨Ψ⟩ has also been computed, and it is expressed as ⟨Ψ⟩ = Ψ0(1 + ψ), where ψ is a characteristic heterogeneity function which modifies the zero-order approximation Ψ0 of the pressure head (valid for a vadose zone of uniform soil properties) to account for the spatial variability of Ks and Kr. The two asymptotic limits, i.e., close to (near field) and away from (far field) the water table, are derived in a very general manner, whereas the transitional behavior of ψ between the near and far field can be determined after specifying the shape of the various input soil properties. Besides their theoretical interest, the results of the present paper are useful for practical purposes as well. Indeed, the model is tested against real data, and in particular it is shown how, for the specific case study, it captures the behavior of the FVs within an environment (the vadose zone close to the water table) which is generally very difficult to access by direct inspection.

  10. Linking Associations of Rare Low-Abundance Species to Their Environments by Association Networks

    DOE PAGES

    Karpinets, Tatiana V.; Gopalakrishnan, Vancheswaran; Wargo, Jennifer; ...

    2018-03-07

Studies of microbial communities by targeted sequencing of rRNA genes recover numerous rare low-abundance taxa with unknown biological roles. We propose to study associations of such rare organisms with their environments by a computational framework based on transformation of the data into qualitative variables. Namely, we analyze the sparse table of putative species or OTUs (operational taxonomic units) and samples generated in such studies, also known as an OTU table, by collecting statistics on co-occurrences of the species and on shared species richness across samples. Based on these statistics we build two association networks, one of the rare putative species and one of the samples, using an established computational technique, Association Networks (Anets), developed for the analysis of qualitative data. Clusters of samples and clusters of OTUs are then integrated and combined with the metadata of the study to produce a map of associated putative species in their environments. We tested and validated the framework on two types of microbiomes, of human body sites and of the Populus tree root system. We show that in both studies the associations of OTUs can separate samples according to environmental or physiological characteristics of the studied systems.
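The co-occurrence statistics described above can be sketched from a presence/absence OTU table: for every pair of OTUs, count the samples in which both occur. The toy table below is invented for illustration; real Anets construction adds further statistics (e.g., shared species richness) on top of such counts.

```python
# Sketch: pairwise co-occurrence counts from a presence/absence OTU table.
from itertools import combinations

def cooccurrence(otu_table):
    """otu_table: {otu_id: set of sample ids where the OTU is present}.
    Returns {(otu_a, otu_b): number of samples containing both}."""
    return {(a, b): len(otu_table[a] & otu_table[b])
            for a, b in combinations(sorted(otu_table), 2)}

counts = cooccurrence({"otu1": {"s1", "s2"}, "otu2": {"s2", "s3"}, "otu3": {"s4"}})
```

Edges of the species association network would then connect OTU pairs whose co-occurrence count is higher than expected by chance; an analogous count over shared OTUs per sample pair yields the sample network.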

  12. The main indicators for Iranian hospital ethical accreditation

    PubMed Central

    ENJOO, SEYED ALI; AMINI, MITRA; TABEI, SEYED ZIAADIN; MAHBUDI, ALI; KAVOSI, ZAHRA; SABER, MAHBOOBEH

    2015-01-01

    Introduction The application of organizational ethics in hospitals is one of the novel ways to improve medical ethics. Nowadays, achieving efficient and sufficient hospital ethics indicators seems inevitable. In this connection, the present study aims to determine the best indicators for hospital accreditation. Methods 69 indicators in 11 fields for evaluating hospital ethics were obtained through a five-step qualitative and quantitative study including a literature review, an expert focus group, a Likert scale survey, a three-round Delphi, and content validity measurement. The expert focus group meeting was conducted employing the Nominal Group Technique (NGT). After running the NGT, a three-round Delphi and, in parallel, a Likert scale survey were performed to obtain objective indicators for each domain. The experts were all healthcare professionals who were also medical ethics researchers, teachers, or PhD students. Content validity measurements were computed using the viewpoints of two different expert groups, some ethicists and some healthcare professionals (n=46). Results After conducting the NGT, Delphi, and Likert survey, 11 main domains were listed: informed consent, medical confidentiality, physician-patient economic relations, ethics consultation policy in the hospital, ethical charter of the hospital, protocol for breaking bad medical news, respect for patients' rights, clinical ethics committee, spiritual and palliative care unit programs in hospitals, healthcare professionals' communication skills, and equitable access to healthcare. In addition, 71 objective indicators for these 11 domains were listed in 11 tables, with 5 to 8 indicators per table. Content Validity Ratio (CVR) measurements were performed and 69 indicators were retained. Conclusion The domains listed in this study seem to be the most important ones for evaluating hospital ethics programs and services.
Healthcare organizations’ accreditation and ranking are crucial for the improvement of healthcare services. Ethics programs would also motivate hospitals to improve their services and move towards patients’ satisfaction. In this regard, more involvement of bioethicists can help healthcare organizations to develop ethics programs and ensure ethics-based practice in hospitals. PMID:26269789
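The Content Validity Ratio mentioned above is conventionally computed with Lawshe's formula, CVR = (n_e − N/2)/(N/2), where n_e is the number of panelists rating an item "essential" and N the panel size. A minimal sketch follows; the count of 35 essential ratings is a hypothetical example, not a figure from the study.

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR: (n_e - N/2) / (N/2), ranging from -1 to +1."""
    half = n_panelists / 2
    return (n_essential - half) / half

# With a panel of 46 experts, an item rated essential by 35 of them
# (hypothetical counts) would score:
cvr = content_validity_ratio(35, 46)
print(round(cvr, 3))  # 0.522
```

Items whose CVR falls below the critical value for the panel size are the ones dropped, which is how a list of 71 candidate indicators can be reduced to 69.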

  13. Determining suitable dimensions for dairy goat feeding places by evaluating body posture and feeding reach.

    PubMed

    Keil, Nina M; Pommereau, Marc; Patt, Antonia; Wechsler, Beat; Gygax, Lorenz

    2017-02-01

    Confined goats spend a substantial part of the day feeding. A poorly designed feeding place increases the risk of feeding in nonphysiological body postures, and even of injury. Scientifically validated information on suitable dimensions of feeding places for loose-housed goats is almost absent from the literature. The aim of the present study was, therefore, to determine feeding place dimensions that would allow goats to feed in a species-appropriate, relaxed body posture. A total of 27 goats with a height at the withers of 62 to 80 cm were included in the study. Goats were tested individually in an experimental feeding stall that allowed the height difference between the feed table, the standing area of the forelegs, and a feeding area step (difference in height between forelegs and hind legs) to be varied. The goats accessed the feed table via a palisade feeding barrier. The feed table was equipped with recesses at varying distances from the feeding barrier (5-55 cm in 5-cm steps) at angles of 30°, 60°, 90°, 120°, or 150° (feeding angle), which were filled with the goats' preferred food. In 18 trials, balanced for order across animals, each animal underwent all possible combinations of feeding area step (3 levels: 0, 10, and 20 cm) and of difference in height between feed table and standing area of forelegs (6 levels: 0, 5, 10, 15, 20, and 25 cm). The minimum and maximum distances at which the animals could reach feed on the table with a relaxed body posture were determined for each combination. Statistical analysis was performed using mixed-effects models. The animals were able to feed with a relaxed posture when the feed table was at least 10 cm higher than the standing height of the goats' forelegs. Larger goats achieved smaller minimum reaches, and minimum reach increased if the goats' head and neck were angled. Maximum reach increased with increasing height at withers and height of the feed table.
The presence of a feeding area step had no influence on minimum and maximum reach. Based on these results, the goats' feeding place can be designed to ensure that the animals are able to reach all of the feed in the manger or on the feed table with a relaxed posture, thus avoiding injuries and nonphysiological stress on joints and hooves. A feeding area step up to a maximum of 20 cm need not be taken into account in terms of feeding reach. However, the feed table must be raised at least 10 cm above the standing area to allow the goats to feed in a species-appropriate, relaxed posture. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Reliability and Validity of Instruments for Assessing Perinatal Depression in African Settings: Systematic Review and Meta-Analysis

    PubMed Central

    Tsai, Alexander C.; Scott, Jennifer A.; Hung, Kristin J.; Zhu, Jennifer Q.; Matthews, Lynn T.; Psaros, Christina; Tomlinson, Mark

    2013-01-01

    Background A major barrier to improving perinatal mental health in Africa is the lack of locally validated tools for identifying probable cases of perinatal depression or for measuring changes in depression symptom severity. We systematically reviewed the evidence on the reliability and validity of instruments to assess perinatal depression in African settings. Methods and Findings Of 1,027 records identified through searching 7 electronic databases, we reviewed 126 full-text reports. We included 25 unique studies, which were disseminated in 26 journal articles and 1 doctoral dissertation. These enrolled 12,544 women living in nine different North and sub-Saharan African countries. Only three studies (12%) used instruments developed specifically for use in a given cultural setting. Most studies provided evidence of criterion-related validity (20 [80%]) or reliability (15 [60%]), while fewer studies provided evidence of construct validity, content validity, or internal structure. The Edinburgh Postnatal Depression Scale (EPDS), assessed in 16 studies (64%), was the most frequently used instrument in our sample. Ten studies estimated the internal consistency of the EPDS (median estimated coefficient alpha, 0.84; interquartile range, 0.71-0.87). For the 14 studies that estimated sensitivity and specificity for the EPDS, we constructed 2 × 2 tables for each cut-off score. Using a bivariate random-effects model, we estimated a pooled sensitivity of 0.94 (95% confidence interval [CI], 0.68-0.99) and a pooled specificity of 0.77 (95% CI, 0.59-0.88) at a cut-off score of ≥9, with higher cut-off scores yielding greater specificity at the cost of lower sensitivity. Conclusions The EPDS can reliably and validly measure perinatal depression symptom severity or screen for probable postnatal depression in African countries, but more validation studies on other instruments are needed.
In addition, more qualitative research is needed to adequately characterize local understandings of perinatal depression-like syndromes in different African contexts. PMID:24340036
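The sensitivity and specificity extracted from each 2 × 2 screening table can be computed directly; the counts below are hypothetical, chosen only so the results match the pooled point estimates reported above.

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity and specificity from a 2 x 2 screening table:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one EPDS cut-off score (not study data):
sensitivity, specificity = sens_spec(tp=47, fn=3, fp=23, tn=77)
print(sensitivity, specificity)  # 0.94 0.77
```

Raising the cut-off score moves cases from the test-positive to the test-negative column, which is why specificity rises while sensitivity falls.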

  15. Validation of asthma recording in electronic health records: protocol for a systematic review.

    PubMed

    Nissen, Francis; Quint, Jennifer K; Wilkinson, Samantha; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J

    2017-05-29

    Asthma is a common, heterogeneous disease with significant morbidity and mortality worldwide. It can be difficult to define in epidemiological studies using electronic health records, as the diagnosis is based on non-specific respiratory symptoms and spirometry, neither of which is routinely recorded. Electronic health records can nonetheless be valuable for studying the epidemiology, management, healthcare use and control of asthma. For health databases to be useful sources of information, asthma diagnoses should ideally be validated. The primary objectives are to provide an overview of the methods used to validate asthma diagnoses in electronic health records and to summarise the results of the validation studies. EMBASE and MEDLINE will be systematically searched using appropriate search terms. The searches will cover all studies in these databases up to October 2016 with no start date and will yield studies that have validated algorithms or codes for the diagnosis of asthma in electronic health records. At least one test validation measure (sensitivity, specificity, positive predictive value, negative predictive value or other) is necessary for inclusion. In addition, we require the validated algorithms to be compared with an external gold standard, such as a manual review, a questionnaire or an independent second database. We will summarise key data including author, year of publication, country, time period, date, data source, population, case characteristics, clinical events, algorithms, gold standard and validation statistics in a uniform table. This study is a synthesis of previously published studies and, therefore, no ethical approval is required. The results will be submitted to a peer-reviewed journal for publication. Results from this systematic review can be used in outcome research on asthma and to identify case definitions for asthma. CRD42016041798.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Validation of the TOLNet lidars during SCOOP (Southern California Ozone Observation Project)

    NASA Astrophysics Data System (ADS)

    Leblanc, Thierry; Granados-Munoz, Maria-Jose; Strawbridge, Kevin; Senff, Chris; Langford, Andy; Berkoff, Tim; Gronoff, Guillaume; DeYoung, Russel; Carion, Bill; Chen, G.; Sullivan, John; McGee, Tom; Jonhson, M.; Kuang, S.; Newchurch, Mike

    2018-04-01

    Five TOLNet lidars participated in a validation campaign at the JPL Table Mountain Facility, CA, in August 2016. All lidars agreed within ±10% of each other and within ±7% of the ozonesondes. Centralized data processing was used to compare the uncertainty budgets. The results highlight the potential of TOLNet to address science questions ranging from boundary layer processes to long-range transport. TOLNet can now be seen as a robust network for use in field campaigns and long-term monitoring.

  17. An Interpolation Method for Obtaining Thermodynamic Properties Near Saturated Liquid and Saturated Vapor Lines

    NASA Technical Reports Server (NTRS)

    Nguyen, Huy H.; Martin, Michael A.

    2004-01-01

    The two most common approaches used to formulate thermodynamic properties of pure substances are fundamental (or characteristic) equations of state (Helmholtz and Gibbs functions) and a piecemeal approach such as that described in Adebiyi and Russell (1992). This paper neither presents a different method of formulating thermodynamic properties of pure substances nor validates the aforementioned approaches. Rather, its purpose is to present a method to generate property tables from existing property packages and a method to facilitate the accurate interpretation of fluid thermodynamic property data from those tables. There are two parts to this paper. The first part shows how efficient and usable property tables were generated, with the minimum number of data points, using an aerospace industry standard property package. The second part describes an innovative interpolation technique that has been developed to properly obtain thermodynamic properties near the saturated liquid and saturated vapor lines.
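As a minimal illustration of why the saturation lines need special handling, the textbook quality-based relation h = h_f + x(h_g − h_f) can be sketched as follows; this is standard two-phase interpolation under assumed saturated-water values, not necessarily the authors' technique.

```python
def enthalpy_from_quality(h_f, h_g, x):
    """Two-phase enthalpy by interpolating along the quality x in [0, 1]:
    h = h_f + x * (h_g - h_f)."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("quality must lie between 0 (sat. liquid) and 1 (sat. vapor)")
    return h_f + x * (h_g - h_f)

def quality_from_enthalpy(h, h_f, h_g):
    """Inverse lookup: locate a state between the saturation lines;
    returns None for single-phase states (h outside [h_f, h_g]),
    which must instead be interpolated in the single-phase table."""
    if h < h_f or h > h_g:
        return None  # compressed liquid or superheated vapor
    return (h - h_f) / (h_g - h_f)

# Saturated water at 100 kPa (approximate textbook values, kJ/kg):
h_f, h_g = 417.4, 2675.0
print(enthalpy_from_quality(h_f, h_g, 0.5))     # 1546.2
print(quality_from_enthalpy(3000.0, h_f, h_g))  # None (superheated)
```

Interpolating blindly across a saturation line mixes two-phase and single-phase data, which is exactly the failure mode a near-saturation interpolation scheme must avoid.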

  18. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    NASA Astrophysics Data System (ADS)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Protein fluorescence has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is affected by the absorbers and scatterers in tissue, which may lead to error in estimating the exact protein content of tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods. Among them, the Monte Carlo based method yields the highest accuracy for extracting intrinsic fluorescence. In this work, we have attempted to generate a lookup table for Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating the intrinsic fluorescence of protein in real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.

  19. Application of gamma imaging techniques for the characterisation of position sensitive gamma detectors

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Didierjean, F.; Duchêne, G.; Filliger, M.; Gerl, J.; Kojouharov, I.; Li, G.; Pietralla, N.; Schaffner, H.; Sigward, M.-H.

    2017-11-01

    A device to characterize position-sensitive germanium detectors has been implemented at GSI. The main component of this so-called scanning table is a gamma camera that is capable of producing online 2D images of the scanned detector by means of a PET technique. To calibrate the gamma camera, Compton imaging is employed. The 2D data can be processed further offline to obtain depth information. Of main interest is the response of the scanned detector in terms of the digitized pulse shapes from the preamplifier. This is an important input for the pulse-shape analysis algorithms used by gamma-tracking arrays in gamma spectroscopy. To validate the scanning table, a comparison of its results with those of a second scanning table implemented at the IPHC Strasbourg is envisaged. For this purpose a pixelated germanium detector has been scanned.

  20. Improved look-up table method of computer-generated holograms.

    PubMed

    Wei, Hui; Gong, Guanghong; Li, Ni

    2016-11-10

    Heavy computation load and vast memory requirements are major bottlenecks of computer-generated holograms (CGHs), which are promising and challenging in three-dimensional displays. To solve these problems, an improved look-up table (LUT) method suitable for arbitrarily sampled object points is proposed and implemented on a graphics processing unit (GPU); its reconstructed object quality is consistent with that of the coherent ray-trace (CRT) method. The concept of a distance factor is defined, and the distance factors are pre-computed off-line and stored in a look-up table. The results show that, while reconstruction quality close to that of the CRT method is obtained, the on-line computation time is dramatically reduced compared with the LUT method on the GPU, and memory usage is considerably lower than that of the novel-LUT method. Optical experiments are carried out to validate the effectiveness of the proposed method.
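The look-up-table idea (precompute the distance-dependent phase offline, reuse it online) can be sketched in heavily simplified 1-D form; the wavelength, pixel pitch, and the particular "distance factor" definition here are assumptions for illustration, not the paper's.

```python
import math

WAVELENGTH = 532e-9   # green laser (assumed)
PITCH = 8e-6          # hologram pixel pitch (assumed)

def build_lut(max_offset, depths):
    """Offline step: precompute phase delays indexed by lateral pixel
    offset and depth plane, so the costly sqrt is not repeated per point."""
    lut = {}
    for z in depths:
        for d in range(max_offset + 1):
            r = math.sqrt((d * PITCH) ** 2 + z ** 2)
            lut[(d, z)] = 2 * math.pi * r / WAVELENGTH
    return lut

def hologram_fringe(points, n_pixels, lut):
    """Online step: accumulate each object point's fringe by table lookup.
    points are (pixel_index, depth, amplitude) triples."""
    field = [0.0] * n_pixels
    for px in range(n_pixels):
        for (x_idx, z, amp) in points:
            offset = abs(px - x_idx)
            field[px] += amp * math.cos(lut[(offset, z)])
    return field

lut = build_lut(max_offset=64, depths=[0.1, 0.2])
fringe = hologram_fringe([(32, 0.1, 1.0)], n_pixels=64, lut=lut)
print(len(fringe))  # 64
```

The trade-off is the one the abstract describes: a larger table buys shorter on-line computation, so the design question is how compactly the distance factors can be indexed.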

  1. Study of charged hadron multiplicities in charged-current neutrino-lead interactions in the OPERA detector

    NASA Astrophysics Data System (ADS)

    Agafonova, N.; Aleksandrov, A.; Anokhina, A.; Aoki, S.; Ariga, A.; Ariga, T.; Bertolin, A.; Bodnarchuk, I.; Bozza, C.; Brugnera, R.; Buonaura, A.; Buontempo, S.; Chernyavskiy, M.; Chukanov, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; De Serio, M.; del Amo Sanchez, P.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dmitrievski, S.; Dracos, M.; Duchesneau, D.; Dusini, S.; Dzhatdoev, T.; Ebert, J.; Ereditato, A.; Fini, R. A.; Fornari, F.; Fukuda, T.; Galati, G.; Garfagnini, A.; Gentile, V.; Goldberg, J.; Gornushkin, Y.; Gorbunov, S.; Grella, G.; Guler, A. M.; Gustavino, C.; Hagner, C.; Hara, T.; Hayakawa, T.; Hollnagel, A.; Hosseini, B.; Ishiguro, K.; Jakovcic, K.; Jollet, C.; Kamiscioglu, C.; Kamiscioglu, M.; Kim, S. H.; Kitagawa, N.; Klicek, B.; Kodama, K.; Komatsu, M.; Kose, U.; Kreslo, I.; Laudisio, F.; Lauria, A.; Ljubicic, A.; Longhin, A.; Loverre, P.; Malgin, A.; Malenica, M.; Mandrioli, G.; Matsuo, T.; Matveev, V.; Mauri, N.; Medinaceli, E.; Meregaglia, A.; Mikado, S.; Miyanishi, M.; Mizutani, F.; Monacelli, P.; Montesi, M. C.; Morishima, K.; Muciaccia, M. T.; Naganawa, N.; Naka, T.; Nakamura, M.; Nakano, T.; Niwa, K.; Okateva, N.; Ogawa, S.; Ozaki, K.; Paoloni, A.; Paparella, L.; Park, B. D.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pessard, H.; Podgrudkov, D.; Polukhina, N.; Pozzato, M.; Pupilli, F.; Roda, M.; Roganova, T.; Rokujo, H.; Rosa, G.; Ryazhskaya, O.; Sato, O.; Schembri, A.; Shakirianova, I.; Shchedrina, T.; Shibuya, H.; Shibayama, E.; Shiraishi, T.; Simone, S.; Sirignano, C.; Sirri, G.; Sotnikov, A.; Spinetti, M.; Stanco, L.; Starkov, N.; Stellacci, S. M.; Stipcevic, M.; Strolin, P.; Takahashi, S.; Tenti, M.; Terranova, F.; Tioukov, V.; Vasina, S.; Vilain, P.; Voevodina, E.; Votano, L.; Vuilleumier, J. L.; Wilquet, G.; Wonsak, B.; Yoon, C. S.

    2018-01-01

    The OPERA experiment was designed to search for ν_μ → ν_τ oscillations in appearance mode through the direct observation of tau neutrinos in the CNGS neutrino beam. In this paper, we report a study of the multiplicity of charged particles produced in charged-current neutrino interactions in lead. We present charged hadron average multiplicities and their dispersion, and investigate KNO scaling in different kinematical regions. The results are presented in detail in the form of tables that can be used in the validation of Monte Carlo generators of neutrino-lead interactions.
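The average multiplicity and its dispersion D = sqrt(⟨n²⟩ − ⟨n⟩²) tabulated in such studies can be computed from per-event counts; the sample values below are illustrative, not OPERA data.

```python
import math

def multiplicity_moments(counts):
    """Average charged-hadron multiplicity <n> and dispersion
    D = sqrt(<n^2> - <n>^2) from a list of per-event multiplicities."""
    n = len(counts)
    mean = sum(counts) / n
    mean_sq = sum(c * c for c in counts) / n
    return mean, math.sqrt(mean_sq - mean ** 2)

# Toy per-event charged-hadron counts (illustrative only):
mean, dispersion = multiplicity_moments([2, 3, 3, 4, 5, 7])
print(round(mean, 3), round(dispersion, 3))  # 4.0 1.633
```

KNO scaling is then tested by checking whether the distribution of n/⟨n⟩ is the same across the kinematical regions.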

  2. Digital data in support of studies and assessments of coal and petroleum resources in the Appalachian basin: Chapter I.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Trippi, Michael H.; Kinney, Scott A.; Gunther, Gregory; Ryder, Robert T.; Ruppert, Leslie F.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    Metadata for these datasets are available in HTML and XML formats. Metadata files contain information about the sources of data used to create the dataset, the creation process steps, the data quality, the geographic coordinate system and horizontal datum used for the dataset, the values of attributes used in the dataset table, information about the publication and the publishing organization, and other information that may be useful to the reader. All links in the metadata were valid at the time of compilation. Some of these links may no longer be valid. No attempt has been made to determine the new online location (if one exists) for the data.

  3. Reliability and concurrent validity of the computer workstation checklist.

    PubMed

    Baker, Nancy A; Livengood, Heather; Jacobs, Karen

    2013-01-01

    Self-report checklists are used to assess computer workstation setup, typically by workers not trained in ergonomic assessment or checklist interpretation. Though many checklists exist, few have been evaluated for reliability and validity. This study examined the reliability and validity of the Computer Workstation Checklist (CWC) in identifying mismatches between workers' self-reported workstation problems and those found by expert evaluation. The CWC was completed at baseline and at 1 month to establish reliability. Validity was determined by comparing CWC baseline data with an onsite workstation evaluation conducted by an expert in computer workstation assessment. Reliability ranged from fair to near perfect (prevalence-adjusted bias-adjusted kappa, 0.38-0.93); items with the strongest agreement were related to the input device, monitor, computer table, and document holder. The CWC had greater specificity (11 of 16 items) than sensitivity (3 of 16 items). The positive predictive value was greater than the negative predictive value for all questions. The CWC has strong reliability. Sensitivity and specificity suggested workers often indicated no problems with workstation setup when problems existed. The evidence suggests that while the CWC may not be valid when used alone, it may be a suitable adjunct to an ergonomic assessment completed by professionals.
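The prevalence-adjusted bias-adjusted kappa (PABAK) quoted above reduces, for a 2 × 2 agreement table, to 2·p_observed − 1; a minimal sketch with hypothetical test-retest counts:

```python
def pabak(a, b, c, d):
    """Prevalence-adjusted bias-adjusted kappa from a 2 x 2 agreement
    table: a and d are agreements (yes/yes and no/no), b and c are
    disagreements. PABAK = 2 * p_observed - 1."""
    total = a + b + c + d
    p_observed = (a + d) / total
    return 2 * p_observed - 1

# Hypothetical baseline/1-month counts for one checklist item:
print(pabak(a=40, b=5, c=3, d=52))  # 0.84
```

Unlike plain kappa, PABAK does not collapse when one response category dominates, which is why it suits checklist items that most workers answer the same way.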

  4. Revised NEO Personality Inventory profiles of male and female U.S. Air Force pilots.

    PubMed

    Callister, J D; King, R E; Retzlaff, P D; Marsh, R W

    1999-12-01

    The study of pilot personality characteristics has a long and controversial history. Personality characteristics seem to be fairly poor predictors of training outcome; however, valid personality assessment is essential to clinical psychological evaluations. Therefore, the personality characteristics of pilots must be studied to ensure valid clinical assessment. This paper describes normative personality characteristics of U.S. Air Force pilots based on the Revised NEO Personality Inventory profiles of 1,301 U.S. Air Force student pilots. Compared with male adult norms, male student pilots had higher levels of extraversion and lower levels of agreeableness. Compared with female adult norms, female student pilots had higher levels of extraversion and openness and lower levels of agreeableness. Descriptive statistics and percentile tables for the five domain scores and 30 facet scores are provided for clinical use, and a case vignette is provided as an example of the clinical utility of these U.S. Air Force norms.

  5. Development and validation of light-duty vehicle modal emissions and fuel consumption values for traffic models.

    DOT National Transportation Integrated Search

    1999-03-01

    A methodology for developing modal vehicle emissions and fuel consumption models has been developed by Oak Ridge National Laboratory (ORNL), sponsored by the Federal Highway Administration. These models, in the form of look-up tables for fuel consump...

  6. Ada Compiler Validation Summary Report. Certificate Number: 920918S1. 11272, U.S. Navy Ada/M, Version 4.5 (/OPTIMIZE) VAX 8550/8600/8650 (Cluster) Enhanced Processor (EP) AN/UYK-44 (Bare Board)

    DTIC Science & Technology

    1992-09-01

    ...current Ada Compiler Validation Capability (ACVC). This Validation Summary Report (VSR) gives an account of the testing of this Ada implementation. For... The following table contains the values for the remaining macro parameters.

  7. Ada Compiler Validation Summary Report: Certificate Number: 910626S1. 11174 U.S. Navy, Ada/M, Version 4.0 (/Optimize), VAX 8550, Running VAX/VMS version 5.3 (Host) to AN/UYK-44 (EMR) (Bare Board) (Target).

    DTIC Science & Technology

    1991-07-30

    ...Ada Compiler Validation Capability (ACVC). This Validation Summary Report (VSR) gives an account of the testing of this Ada implementation. For any... The following table contains the values for the remaining macro parameters.

  8. Table-top job analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-12-01

    The purpose of this Handbook is to establish general training program guidelines for training personnel in developing training for operation, maintenance, and technical support personnel at Department of Energy (DOE) nuclear facilities. Table-top job analysis (TTJA) is not the only method of job analysis; however, when conducted properly TTJA can be cost effective, efficient, and self-validating, and represents an effective method of defining job requirements. The table-top job analysis is suggested in the DOE Training Accreditation Program manuals as an acceptable alternative to traditional methods of analyzing job requirements. DOE 5480-20A strongly endorses and recommends it as the preferred method for analyzing jobs for positions addressed by the Order.

  9. Preliminary phenomena identification and ranking tables for simplified boiling water reactor Loss-of-Coolant Accident scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroeger, P.G.; Rohatgi, U.S.; Jo, J.H.

    1998-04-01

    For three potential Loss-of-Coolant Accident (LOCA) scenarios in the General Electric Simplified Boiling Water Reactor (SBWR), a set of Phenomena Identification and Ranking Tables (PIRT) is presented. The selected LOCA scenarios are typical of the class of small and large breaks generally considered in Safety Analysis Reports. The method used to develop the PIRTs is described. Following a discussion of the transient scenarios, the PIRTs are presented and discussed in both detailed and summarized form. A procedure for future validation of the PIRTs, to enhance their value, is outlined. 26 refs., 25 figs., 44 tabs.

  10. Medical Entomology Studies - XV. A Revision of the Subgenus Paraedes of the Genus Aedes (Diptera: Culicidae)

    DTIC Science & Technology

    1981-01-01

    ...branched; 9-CT longer than 8-CT. Respiratory trumpet. Index 3.21-6.31. Metanotal plate. Seta 11-CT single, occasionally barbed, longer than 10, 12...figured and recorded (Table 1). Cephalothorax. Seta 1-CT with 3-5 branches. Respiratory trumpet. Index 3.73-5.25, mean 4.53. Abdomen. Seta 1-11 with 14

  11. Advancing Stability and Reconciliation in Guinea-Bissau: Lessons from Africa’s First Narco-State

    DTIC Science & Technology

    2013-06-01

    ...simultaneously led to a near doubling in the price of rice, the country's staple grain, two-thirds of which must be imported. Forced to spend more of...half, the military's troop-to-population ratio remains double the West African average. According to a 2008 study (see Table 1), more than half the

  12. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...
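A "scientifically valid method" of random selection such as the random number table cited above can be implemented with a uniform sampler; the crew list and selection fraction below are illustrative assumptions.

```python
import random

def select_for_testing(crew, fraction, seed=None):
    """Random selection in which every crewmember has an equal chance
    of being chosen: a computer equivalent of a random number table."""
    rng = random.Random(seed)  # seed only for reproducible illustration
    k = max(1, round(len(crew) * fraction))
    return rng.sample(crew, k)

crew = [f"crew_{i:03d}" for i in range(40)]
selected = select_for_testing(crew, fraction=0.25, seed=7)
print(len(selected))  # 10
```

`random.sample` draws without replacement, so no crewmember is selected twice within one testing round.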

  13. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  14. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  15. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  16. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  17. Introduction and Validation of Chromium-Free Consumables for Welding Stainless Steels. Version 2

    DTIC Science & Technology

    2015-04-14

    [Table-of-contents excerpts: 4.3 Site-related Permits and Regulations; Test Design; Appendix K: X-ray Report on ENiCuRu Weld Test Assembly; Table 6.24 Radiographic Test Report on Baseline E308L-16 and Test ENiCuRu Weld Assemblies]

  18. De novo genome assembly of Cercospora beticola for microsatellite marker development and validation

    USDA-ARS?s Scientific Manuscript database

    Cercospora leaf spot caused by Cercospora beticola is a significant threat to the production of sugar and table beet worldwide. A de novo genome assembly of C. beticola was used to develop eight polymorphic and reproducible microsatellite markers for population genetic analyses. These markers were u...

  19. A Numerical Investigation of Metabolic Reductive Dechlorination in DNAPL Source Zones

    DTIC Science & Technology

    2005-01-01

    [Table-of-contents and figure-caption excerpts: Appendix A, UTCHEM Validation; Table IV.2: Statistics for saturation distribution metrics in 2-D and...; Saturation profiles simulated in (a) 2D using UTCHEM and (b) in the same 2D slice extracted from a 3D UTCHEM simulation]

  20. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    PubMed

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
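The Cache Common Segments (CCS) idea described above, reusing prior verification results for path segments shared across updates, can be sketched as a small memoization layer. This is an illustrative sketch only: the segment tuples and the `verify_segment` stand-in are invented, and real BGPSEC validation performs ECDSA signature checks over signed path segments.

```python
# Illustrative sketch of the CCS optimization: verification results for
# path segments are cached so that segments shared by many updates are
# checked only once. `verify_segment` is a hypothetical stand-in for the
# expensive cryptographic check in real BGPSEC.
calls = 0

def verify_segment(segment):
    global calls
    calls += 1
    return True  # stand-in: always reports "valid"

segment_cache = {}

def validate_update_ccs(path_segments):
    """Validate one signed update, reusing cached per-segment results."""
    for seg in path_segments:
        if seg not in segment_cache:
            segment_cache[seg] = verify_segment(seg)
        if not segment_cache[seg]:
            return False
    return True

# Two updates sharing a two-segment prefix: only 4 checks, not 6.
update1 = [("AS1", "sig1"), ("AS2", "sig2"), ("AS3", "sig3")]
update2 = [("AS1", "sig1"), ("AS2", "sig2"), ("AS4", "sig4")]
assert validate_update_ccs(update1) and validate_update_ccs(update2)
```

During routing-table convergence after a peering reset, many updates share long path prefixes, which is why this kind of caching reduces the peak verification workload.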

  1. Development and face validation of strategies for improving consultation skills.

    PubMed

    Lefroy, Janet; Thomas, Adam; Harrison, Chris; Williams, Stephen; O'Mahony, Fidelma; Gay, Simon; Kinston, Ruth; McKinley, R K

    2014-12-01

    While formative workplace based assessment can improve learners' skills, it often does not because the procedures used do not facilitate feedback which is sufficiently specific to scaffold improvement. Provision of pre-formulated strategies to address predicted learning needs has potential to improve the quality and automate the provision of written feedback. To systematically develop, validate and maximise the utility of a comprehensive list of strategies for improvement of consultation skills through a process involving both medical students and their clinical primary and secondary care tutors. Modified Delphi study with tutors, modified nominal group study with students with moderation of outputs by consensus round table discussion by the authors. 35 hospital and 21 GP tutors participated in the Delphi study and contributed 153 new or modified strategies. After review of these and the 205 original strategies, 265 strategies entered the nominal group study to which 46 year four and five students contributed, resulting in the final list of 249 validated strategies. We have developed a valid and comprehensive set of strategies which are considered useful by medical students. This list can be immediately applied by any school which uses the Calgary Cambridge Framework to inform the content of formative feedback on consultation skills. We consider that the list could also be mapped to alternative skills frameworks and so be utilised by schools which do not use the Calgary Cambridge Framework.

  2. An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis

    NASA Technical Reports Server (NTRS)

    Tsow, Alex

    2008-01-01

    Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm. However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.

  3. Simulation of advective flow under steady-state and transient recharge conditions, Camp Edwards, Massachusetts Military Reservation, Cape Cod, Massachusetts

    USGS Publications Warehouse

    Walter, Donald A.; Masterson, John P.

    2003-01-01

    The U.S. Geological Survey has developed several ground-water models in support of an investigation of ground-water contamination being conducted by the Army National Guard Bureau at Camp Edwards, Massachusetts Military Reservation on western Cape Cod, Massachusetts. Regional and subregional steady-state models and regional transient models were used to (1) improve understanding of the hydrologic system, (2) simulate advective transport of contaminants, (3) delineate recharge areas to municipal wells, and (4) evaluate how model discretization and time-varying recharge affect simulation results. A water-table mound dominates ground-water-flow patterns. Near the top of the mound, which is within Camp Edwards, hydraulic gradients are nearly vertically downward and horizontal gradients are small. In downgradient areas that are further from the top of the water-table mound, the ratio of horizontal to vertical gradients is larger and horizontal flow predominates. The steady-state regional model adequately simulates advective transport in some areas of the aquifer; however, simulation of ground-water flow in areas with local hydrologic boundaries, such as ponds, requires more finely discretized subregional models. Subregional models also are needed to delineate recharge areas to municipal wells that are inadequately represented in the regional model or are near other pumped wells. Long-term changes in recharge rates affect hydraulic heads in the aquifer and shift the position of the top of the water-table mound. Hydraulic-gradient directions do not change over time in downgradient areas, whereas they do change substantially with temporal changes in recharge near the top of the water-table mound. The assumption of steady-state hydraulic conditions is valid in downgradient areas, where advective transport paths change little over time. 
In areas closer to the top of the water-table mound, advective transport paths change as a function of time, transient and steady-state paths do not coincide, and the assumption of steady-state conditions is not valid. The simulation results indicate that several modeling tools are needed to adequately simulate ground-water flow at the site and that the utility of a model varies according to hydrologic conditions in the specific areas of interest.

  4. Modelling alkali metal emissions in large-eddy simulation of a preheated pulverised-coal turbulent jet flame using tabulated chemistry

    NASA Astrophysics Data System (ADS)

    Wan, Kaidi; Xia, Jun; Vervisch, Luc; Liu, Yingzu; Wang, Zhihua; Cen, Kefa

    2018-03-01

    The numerical modelling of alkali metal reacting dynamics in turbulent pulverised-coal combustion is discussed using tabulated sodium chemistry in large eddy simulation (LES). A lookup table is constructed from a detailed sodium chemistry mechanism including five sodium species, i.e. Na, NaO, NaO2, NaOH and Na2O2H2, and 24 elementary reactions. This sodium chemistry table contains four coordinates, i.e. the equivalence ratio, the mass fraction of the sodium element, the gas-phase temperature, and a progress variable. The table is first validated against the detailed sodium chemistry mechanism by zero-dimensional simulations. Then, LES of a turbulent pulverised-coal jet flame is performed and major coal-flame parameters are compared against experiments. The chemical percolation devolatilisation (CPD) model and the partially stirred reactor (PaSR) model are employed to predict coal pyrolysis and gas-phase combustion, respectively. The response of the five sodium species in the pulverised-coal jet flame is subsequently examined. Finally, a systematic global sensitivity analysis of the sodium lookup table is performed and the accuracy of the proposed tabulated sodium chemistry approach has been calibrated.

  5. Performance of a lookup table-based approach for measuring tissue optical properties with diffuse optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Nichols, Brandon S.; Rajaram, Narasimhan; Tunnell, James W.

    2012-05-01

    Diffuse optical spectroscopy (DOS) provides a powerful tool for fast and noninvasive disease diagnosis. The ability to leverage DOS to accurately quantify tissue optical parameters hinges on the model used to estimate light-tissue interaction. We describe the accuracy of a lookup table (LUT)-based inverse model for measuring optical properties under different conditions relevant to biological tissue. The LUT is a matrix of reflectance values acquired experimentally from calibration standards of varying scattering and absorption properties. Because it is based on experimental values, the LUT inherently accounts for system response and probe geometry. We tested our approach in tissue phantoms containing multiple absorbers, different sizes of scatterers, and varying oxygen saturation of hemoglobin. The LUT-based model was able to extract scattering and absorption properties under most conditions with errors of less than 5 percent. We demonstrate the validity of the lookup table over a range of source-detector separations from 0.25 to 1.48 mm. Finally, we describe the rapid fabrication of a lookup table using only six calibration standards. This optimized LUT was able to extract scattering and absorption properties with average RMS errors of 2.5 and 4 percent, respectively.
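The LUT inversion described above can be sketched as a grid search over tabulated reflectance values. Everything here is assumed for illustration: the `forward_model` stand-in replaces the experimentally measured calibration reflectance, and the axis ranges are invented, not those of the paper.

```python
import numpy as np

# Minimal sketch of a LUT-based inverse model: reflectance is tabulated on
# a grid of reduced scattering (mus) and absorption (mua) coefficients, and
# the inverse step picks the grid pair whose stored reflectance best
# matches a measurement in a least-squares sense.
mus_axis = np.linspace(5.0, 30.0, 26)   # cm^-1, hypothetical range
mua_axis = np.linspace(0.1, 5.0, 50)    # cm^-1, hypothetical range

def forward_model(mus, mua):
    # Stand-in for the experimentally measured calibration reflectance.
    return mus * np.exp(-2.0 * mua)

LUT = forward_model(mus_axis[:, None], mua_axis[None, :])

def invert(measured_reflectance):
    """Return the (mus, mua) grid pair minimizing squared error."""
    err = (LUT - measured_reflectance) ** 2
    i, j = np.unravel_index(np.argmin(err), err.shape)
    return mus_axis[i], mua_axis[j]

mus_hat, mua_hat = invert(forward_model(15.0, 1.0))
```

Because the table is populated from measurements on calibration standards, system response and probe geometry are folded into the entries themselves, which is the advantage the abstract highlights.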

  6. Hydrologic relations between lakes and aquifer in a recharge area near Orlando, Florida

    USGS Publications Warehouse

    Lichtler, William F.; Hughes, G.H.; Pfischner, F.L.

    1976-01-01

    The three lakes investigated in Orange County, Florida, gain water from the adjoining water-table aquifer and lose water to the Floridan aquifer by downward leakage. Net seepage (net exchange of water between lake and aquifers) can be estimated by the equation S = AX + BY, where S is net seepage, X represents hydraulic gradient between lake and water-table aquifer, A is a lumped parameter representing the effect of hydraulic conductivity and cross-sectional area of materials in the flow section of the water-table aquifer, Y is the head difference between lake level and the potentiometric surface of the Floridan aquifer, and B is a lumped parameter representing the effect of hydraulic conductivity, area, and thickness of materials between the lake bottom and the Floridan aquifer. If values of S, X, and Y are available for two contrasting water-level conditions, coefficients A and B are determinable by solution of two simultaneous equations. If the relation between lake and ground-water level is the same on all sides of the lake--with regard to each aquifer--and if X and Y are truly representative of these relations, then the X and Y terms of the equation provide valid estimates of inflow to the lake from the water-table aquifer and outflow from the lake to the Floridan aquifer. (Woodard-USGS)
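The two-condition calibration described in the abstract amounts to solving two simultaneous linear equations for the lumped parameters A and B. A minimal sketch with invented numbers:

```python
import numpy as np

# Net seepage model S = A*X + B*Y. With S, X, Y observed under two
# contrasting water-level conditions, A and B follow from a 2x2 linear
# system. All values below are invented for illustration.
X = np.array([0.8, 0.3])   # lake vs. water-table-aquifer gradient, conditions 1 and 2
Y = np.array([1.2, 2.0])   # lake level minus Floridan potentiometric surface
S = np.array([5.0, 4.1])   # observed net seepage

# Solve [X Y] [A, B]^T = S for the lumped parameters.
A, B = np.linalg.solve(np.column_stack([X, Y]), S)

# The fitted parameters reproduce the observed seepage exactly.
```

With A and B in hand, the terms A*X and B*Y separately estimate inflow from the water-table aquifer and outflow to the Floridan aquifer, as the abstract notes.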

  7. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
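The per-family binomial test described above can be sketched with an upper-tail binomial probability; all counts below are invented, not the Shuar data:

```python
from math import comb

# Under the null model, each of a family's n species is medicinal with the
# flora-wide proportion p; a family with improbably many medicinal species
# departs from random selection.
def binom_sf(k, n, p):
    """Upper-tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

flora_total, flora_medicinal = 1000, 200   # hypothetical flora-wide counts
p0 = flora_medicinal / flora_total         # null proportion of medicinal species
family_n, family_medicinal = 30, 14        # hypothetical over-represented family

p_value = binom_sf(family_medicinal, family_n, p0)
# A small p_value flags the family as selected more often than chance predicts.
```

Testing each of the 115 families this way, rather than regressing family size on medicinal counts, avoids the assumption violations the abstract attributes to the regression approach.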

  8. An Analysis of the Relationship Between Control Discrimination Accuracy and Alcohol Abuse in the United States Air Force.

    DTIC Science & Technology

    1986-08-01

    ...theoretical perspective. I thank Dr. Marion Neil for her editorial and organizational contributions that helped to put this work into a presentable form... Table 6 (continued), item-level ECDA and ICDA correlations (item 125: .27, -.08; item 142: .32, .00)... within the validity standards of .40 to .60 described by Downie and Heath (1967). The superior discriminant validity of the TCDA scale suggests that...

  9. The boundary conditions for simulations of a shake-table experiment on the seismic response of 3D slope

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Cong, Shengyi; Ling, Xianzhang; Ju, Nengpan

    2017-01-01

    Boundary conditions can significantly affect a slope's behavior under strong earthquakes. To evaluate the importance of boundary conditions for finite element (FE) simulations of a shake-table experiment on the slope response, a validated three-dimensional (3D) nonlinear FE model is presented, and the numerical and experimental results are compared. For that purpose, the robust graphical user-interface "SlopeSAR", based on the open-source computational platform OpenSees, is employed, which simplifies the effort-intensive pre- and post-processing phases. The mesh resolution effect is also addressed. A parametric study is performed to evaluate the influence of boundary conditions on the FE model involving the boundary extent and three types of boundary conditions at the end faces. Generally, variations in the boundary extent produce inconsistent slope deformations. For the two end faces, fixing the y-direction displacement is not appropriate to simulate the shake-table experiment, in which the end walls are rigid and rough. In addition, the length of the 3D slope's top face and the width of the slope play an important role in the difference between two types of boundary conditions at the end faces (fixing the y-direction displacement and fixing the (y, z) direction displacement). Overall, this study highlights that a comparison between a simulation and an experimental result should be assessed with due consideration of the effect of the boundary conditions.

  10. Some applications of the multi-dimensional fractional order for the Riemann-Liouville derivative

    NASA Astrophysics Data System (ADS)

    Ahmood, Wasan Ajeel; Kiliçman, Adem

    2017-01-01

    In this paper, the aim is to study a theorem for the one-dimensional space-time fractional derivative, to generalize, by means of a table of fractional Laplace transforms of some elementary functions, results valid for the one-dimensional case to the multi-dimensional fractional Laplace transform, and to give the definition of the multi-dimensional fractional Laplace transform. This study dedicates the one-dimensional fractional Laplace transform to functions of only one independent variable and develops the one-dimensional fractional Laplace transform into a multi-dimensional fractional Laplace transform based on the modified Riemann-Liouville derivative.
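For reference, and hedging that the paper itself uses a modified variant, the standard Riemann-Liouville fractional derivative underlying such transform tables, together with its classical one-dimensional Laplace-transform rule, can be stated as:

```latex
{}_0D_t^{\alpha} f(t)
  = \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
    \int_0^t (t-\tau)^{\,n-\alpha-1} f(\tau)\,d\tau,
  \qquad n-1 < \alpha \le n,
```

```latex
\mathcal{L}\!\left\{ {}_0D_t^{\alpha} f(t) \right\}(s)
  = s^{\alpha} F(s)
    - \sum_{k=0}^{n-1} s^{k}
      \left[\, {}_0D_t^{\alpha-k-1} f(t) \,\right]_{t=0}.
```

Table entries for elementary functions follow by applying the transform rule to each function in turn; the multi-dimensional extension applies it coordinate by coordinate.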

  11. Materials Compatibility and Agent Operational Validation for Halon 1211 Replacement: Phases 1 2, and 3. Volume 1

    DTIC Science & Technology

    1993-03-01

    ...KC-135 Gl-epoxy winglet (1 = experimental; 2 = prototype development; 3 = production). Table 3, Advanced Composites in Military Aircraft (concluded)... specially blended for related agent testing and would not be available, due to its high production cost, for regular distribution.

  12. The Development and Validation of the Ethical Climate Index for Middle and High Schools.

    ERIC Educational Resources Information Center

    Schulte, Laura E.; Thompson, Franklin; Talbott, Jeanie; Luther, Ann; Garcia, Michelle; Blanchard, Shirley; Conway, Laraine; Mueller, Melanie

    2002-01-01

    Describes the School Ethical Climate Index (SECI), an instrument to measure the ethical climate of a school. The SECI could be used in school districts to assess areas for school improvement and thereby help reduce school disorder and violence. (Contains 4 tables and 39 references.) (Author/WFA)

  13. Krypton and xenon in lunar fines

    NASA Technical Reports Server (NTRS)

    Basford, J. R.; Dragon, J. C.; Pepin, R. O.; Coscio, M. R., Jr.; Murthy, V. R.

    1973-01-01

    Data from grain-size separates, stepwise-heated fractions, and bulk analyses of 20 samples of fines and breccias from five lunar sites are used to define three-isotope and ordinate intercept correlations in an attempt to resolve the lunar heavy rare gas system in a statistically valid approach. Tables of concentrations and isotope compositions are given.

  14. Assessing South Africa Learners' Attitudes Towards Technology by Using the PATT (Pupils' Attitudes towards Technology) Questionnaire.

    ERIC Educational Resources Information Center

    Van Rensburg, Susan; Ankiewicz, Piet; Myburgh, Chris

    1999-01-01

    The PATT (Pupils' Attitude Towards Technology) questionnaire, as validated for the United States, was used to assess and analyze attitudes of 500 girls and 510 boys from the Gauteng Province in South Africa. Findings are compared for both genders. Four tables present results. Contains 43 references. (AEF)

  15. Conceptualizing Sex Offender Denial from a Multifaceted Framework: Investigating the Psychometric Qualities of a New Instrument

    ERIC Educational Resources Information Center

    Jung, Sandy; Daniels, Melissa

    2012-01-01

    The authors examined the psychometric properties of a clinician-rated measure of sex offender denial. Convergent and discriminant validity for the measure was supported, and given its relationship to treatment attitudes, the measure demonstrated utility for assessing treatment change and readiness. (Contains 3 tables.)

  16. Interaction of coastal urban groundwater with infrastructure due to tidal variation

    NASA Astrophysics Data System (ADS)

    Su, X.; Prigiobbe, V.

    2017-12-01

    The urbanization of coastal areas has been increasing during the last century. For these areas, groundwater is one of the major sources of potable water for the population, industry, and agriculture, with an average demand of 30 m3/s [1,2]. Simultaneously, the rate of sea-level rise has been recorded at approximately 40 mm/yr [3], with potential negative consequences for coastal groundwater. As the sea level rises, sea-water intrusion into potable aquifers may become more important [4] and the water table of the shallow aquifer underneath coastal areas may rise [5]. Therefore, the water quality of the aquifer decreases and interaction between the shallow aquifer and infrastructure may occur. In the latter case in particular, disruptive events may become more frequent, such as infiltration of groundwater into damaged sewers causing discharges of untreated sewage (combined sewer overflows, CSOs). Here, a study is presented on the modeling of urban groundwater in coastal areas to identify the cause of frequent CSOs in dry weather conditions, i.e., when CSOs are not expected to occur. The evolution of the water table was described in response to tidal variation to quantify the interaction between the shallow aquifer and an aging sewer. The watershed of the city of Hoboken (NJ), at the estuary of the Hudson River, was implemented in MODFLOW. The model was built using datasets from various sources. Geostatistics was applied to create the aquifer geology, and measurements of the water table from monitoring wells within the urban area were used as boundary conditions and for model validation. Preliminary results of the simulations are shown in the figure, where the water table over a period of 7 months was calculated. The groundwater model with the sewer will help identify the parts of the network that might be submerged by the groundwater and, therefore, subject to infiltration. Combining groundwater and sewer modeling with the hydrograph separation method [6], the model prediction of infiltration will be validated. References: [1] Pimentel et al., BioScience, 54, 909-918, 2004. [2] Owolabi, Glob. Ini., 11, 69-87, 2017. [3] Milne, Astro. Geophys., 49, 224-228, 2008. [4] Vázquez-Suñé et al., Hydro. J., 13, 522-533, 2005. [5] Gburek et al., Ground Water, 37, 175-184, 1999. [6] Prigiobbe and Giulianelli, Water Sci. Tech., 60, 727-735, 2009.

  17. TABULATED EQUIVALENT SDR FLAMELET (TESF) MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kundu, Prithwish; Ameen, Muhsin Mohammed; Unnikrishnan, Umesh

    The code consists of an implementation of a novel tabulated combustion model for non-premixed flames in CFD solvers. This novel technique/model is used to implement an unsteady flamelet tabulation without using progress variables for non-premixed flames. It also has the capability to include history effects, which is unique among tabulated flamelet models. The flamelet table generation code can be run in parallel to generate tables with large chemistry mechanisms in relatively short wall-clock times. The combustion model/code reads these tables. This framework can be coupled with any CFD solver with RANS as well as LES turbulence models, and it enables CFD solvers to run large chemistry mechanisms with large numbers of grid cells at relatively low computational cost. Currently it has been coupled with the Converge CFD code and validated against available experimental data. This model can be used to simulate non-premixed combustion in a variety of applications such as reciprocating engines, gas turbines, and industrial burners operating over a wide range of fuels.

  18. An Investigation of Applications for Thermodynamic Work Potential Methods: Working Tables and Charts for Estimation of Thermodynamic Work Potential in Equilibrium Mixtures of Jet-A and Air

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Roth, Bryce; McDonald, Rob

    2002-01-01

    The objective of this report is to provide a tool to facilitate the application of thermodynamic work potential methods to aircraft and engine analysis. This starts with a discussion of the theoretical background underlying these methods, which is then used to derive various equations useful for thermodynamic analysis of aircraft engines. The work potential analysis method is implemented in the form of a set of working charts and tables that can be used to graphically evaluate work potential stored in high-enthalpy gas. The range of validity for these tables is temperatures from 300 to 36,000 R, pressures between 0.01 atm and 100 atm, and fuel-air ratios from zero to stoichiometric. The derivations and charts assume mixtures of Jet-A and air as the working fluid. The thermodynamic properties presented in these charts were calculated based upon standard thermodynamic curve fits.

  19. Pivot tables for mortality analysis, or who needs life tables anyway?

    PubMed

    Wesley, David; Cox, Hugh F

    2007-01-01

    Actuarial life-table analysis has long been used by life insurance medical directors for mortality abstraction from clinical studies. Ironically, today's life actuary instead uses pivot tables to analyze mortality. Pivot tables (a feature/function in MS Excel) collapse various dimensions of data that were previously arranged in an "experience study" format. Summary statistics such as actual deaths, actual and expected mortality (usually measured in dollars), and calculated results such as actual to expected ratios, are then displayed in a 2-dimensional grid. The same analytic process, excluding the dollar focus, can be used for clinical mortality studies. For raw survival data, especially large datasets, this combination of experience study data and pivot tables has clear advantages over life-table analysis in both accuracy and flexibility. Using the SEER breast cancer data, we compare the results of life-table analysis and pivot-table analysis.
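The experience-study-plus-pivot-table workflow described above maps directly onto a spreadsheet or a pandas pivot. A minimal sketch with invented counts (the column names and numbers are hypothetical, not SEER data):

```python
import pandas as pd

# One row per experience-study cell, carrying actual and expected deaths;
# the pivot collapses the duration dimension, leaving summary statistics
# and an actual-to-expected (A/E) ratio per age band.
records = pd.DataFrame({
    "age_band": ["50-59", "50-59", "60-69", "60-69"],
    "duration": [1, 2, 1, 2],
    "actual":   [3, 5, 8, 12],
    "expected": [4.0, 4.5, 9.0, 10.0],
})

pivot = records.pivot_table(index="age_band",
                            values=["actual", "expected"],
                            aggfunc="sum")
pivot["a_to_e"] = pivot["actual"] / pivot["expected"]
```

Any dimension kept in the raw rows (duration, sex, stage) can be restored to the grid later, which is the flexibility advantage over a fixed life-table layout.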

  20. Integrated configurable equipment selection and line balancing for mass production with serial-parallel machining systems

    NASA Astrophysics Data System (ADS)

    Battaïa, Olga; Dolgui, Alexandre; Guschinsky, Nikolai; Levin, Genrikh

    2014-10-01

    Solving equipment selection and line balancing problems together allows better line configurations to be reached and avoids local optimal solutions. This article considers jointly these two decision problems for mass production lines with serial-parallel workplaces. This study was motivated by the design of production lines based on machines with rotary or mobile tables. Nevertheless, the results are more general and can be applied to assembly and production lines with similar structures. The designers' objectives and the constraints are studied in order to suggest a relevant mathematical model and an efficient optimization approach to solve it. A real case study is used to validate the model and the developed approach.

  1. VOXEL-LEVEL MAPPING OF TRACER KINETICS IN PET STUDIES: A STATISTICAL APPROACH EMPHASIZING TISSUE LIFE TABLES.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Mankoff, David A; Eary, Janet F; Spence, Alexander M; Krohn, Kenneth A

    2014-06-01

    Most radiotracers used in dynamic positron emission tomography (PET) scanning act in a linear time-invariant fashion so that the measured time-course data are a convolution between the time course of the tracer in the arterial supply and the local tissue impulse response, known as the tissue residue function. In statistical terms the residue is a life table for the transit time of injected radiotracer atoms. The residue provides a description of the tracer kinetic information measurable by a dynamic PET scan. Decomposition of the residue function allows separation of rapid vascular kinetics from slower blood-tissue exchanges and tissue retention. For voxel-level analysis, we propose that residues be modeled by mixtures of nonparametrically derived basis residues obtained by segmentation of the full data volume. Spatial and temporal aspects of diagnostics associated with voxel-level model fitting are emphasized. Illustrative examples, some involving cancer imaging studies, are presented. Data from cerebral PET scanning with 18F fluoro-deoxyglucose (FDG) and 15O water (H2O) in normal subjects are used to evaluate the approach. Cross-validation is used to make regional comparisons between residues estimated using adaptive mixture models and more conventional compartmental modeling techniques. Simulation studies are used to theoretically examine mean square error performance and to explore the benefit of voxel-level analysis when the primary interest is a statistical summary of regional kinetics. The work highlights the contribution that multivariate analysis tools and life-table concepts can make in the recovery of local metabolic information from dynamic PET studies, particularly ones in which the assumptions of compartmental-like models, with residues that are sums of exponentials, might not be certain.
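The linear time-invariant relation in the first sentence, measured tissue curve equals arterial input convolved with the residue function, can be sketched numerically; the bolus shape, residue, and frame timing below are toy choices, not the study's data:

```python
import numpy as np

# Toy forward model of dynamic PET kinetics: the tissue time-activity curve
# is the discrete convolution of the arterial input function with the
# tissue residue function R(t), with R(0) = 1.
dt = 1.0                                 # seconds per frame (assumed)
t = np.arange(0, 60, dt)

arterial_input = np.exp(-((t - 10.0) / 4.0) ** 2)   # toy bolus peaking at 10 s
residue = np.exp(-0.05 * t)                          # toy single-exponential residue

# Discrete convolution, truncated to the scan duration and scaled by dt.
tissue_curve = np.convolve(arterial_input, residue)[: t.size] * dt
```

Estimating `residue` from measured `tissue_curve` and `arterial_input` is the inverse problem the paper addresses, with mixtures of basis residues replacing the single exponential used here for illustration.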

  2. Validation of fault-free behavior of a reliable multiprocessor system - FTMP: A case study. [Fault-Tolerant Multi-Processor avionics

    NASA Technical Reports Server (NTRS)

    Clune, E.; Segall, Z.; Siewiorek, D.

    1984-01-01

    A program of experiments has been conducted at NASA-Langley to test the fault-free performance of a Fault-Tolerant Multiprocessor (FTMP) avionics system for next-generation aircraft. Baseline measurements of an operating FTMP system were obtained with respect to the following parameters: instruction execution time, frame size, and the variation of clock ticks. The mechanisms of frame stretching were also investigated. The experimental results are summarized in a table. Areas of interest for future tests are identified, with emphasis given to the implementation of a synthetic workload generation mechanism on FTMP.

  3. A versatile nanotechnology to connect individual nano-objects for the fabrication of hybrid single-electron devices

    NASA Astrophysics Data System (ADS)

    Bernand-Mantel, A.; Bouzehouane, K.; Seneor, P.; Fusil, S.; Deranlot, C.; Brenac, A.; Notin, L.; Morel, R.; Petroff, F.; Fert, A.

    2010-11-01

    We report on the high yield connection of single nano-objects as small as a few nanometres in diameter to separately elaborated metallic electrodes, using a 'table-top' nanotechnology. Single-electron transport measurements validate that transport occurs through a single nano-object. The vertical geometry of the device natively allows an independent choice of materials for each electrode and the nano-object. In addition ferromagnetic materials can be used without encountering oxidation problems. The possibility of elaborating such hybrid nanodevices opens new routes for the democratization of spintronic studies in low dimensions.

  4. Validation of the Conversion between the Mini-Mental State Examination and Montreal Cognitive assessment in Korean Patients with Parkinson’s Disease

    PubMed Central

    Kim, Ryul; Kim, Han-Joon; Kim, Aryun; Jang, Mi-Hee; Kim, Hyun Jeong; Jeon, Beomseok

    2018-01-01

    Objective Two conversion tables between the Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA) have recently been established for Parkinson’s disease (PD). This study aimed to validate them in Korean patients with PD and to evaluate whether they could be influenced by educational level. Methods A total of 391 patients with PD who undertook both the Korean MMSE and the Korean MoCA during the same session were retrospectively assessed. The mean, median, and root mean squared error (RMSE) of the difference between the true and converted MMSE scores and the intraclass correlation coefficient (ICC) were calculated according to educational level (6 or fewer years, 7–12 years, or 13 or more years). Results Both conversions had a median value of 0, with a small mean and RMSE of differences, and a high correlation between the true and converted MMSE scores. In the classification according to educational level, all groups had roughly similar values of the median, mean, RMSE, and ICC both within and between the conversions. Conclusion Our findings suggest that both MMSE-MoCA conversion tables are useful instruments for transforming MoCA scores into converted MMSE scores in Korean patients with PD, regardless of educational level. These will greatly enhance the utility of the existing cognitive data from the Korean PD population in clinical and research settings. PMID:29316782
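The summaries used in this validation (median, mean, and RMSE of the difference between true and converted MMSE scores) are straightforward to compute; the scores and the implied conversion below are invented for illustration:

```python
import numpy as np

# True MMSE scores and MMSE scores converted from MoCA via some lookup
# table (both invented). The study's summaries are the median, mean, and
# root mean squared error (RMSE) of the converted-minus-true differences.
true_mmse      = np.array([28, 26, 30, 24, 27])
converted_mmse = np.array([28, 25, 30, 25, 27])

diff = converted_mmse - true_mmse
median_diff = float(np.median(diff))
mean_diff = float(np.mean(diff))
rmse = float(np.sqrt(np.mean(diff ** 2)))
```

A median of 0 with a small mean and RMSE, as the study reports, indicates the conversion is unbiased and tight; repeating the computation within education strata checks whether educational level distorts it.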

  5. Comparing the accuracy of brief versus long depression screening instruments which have been validated in low and middle income countries: a systematic review.

    PubMed

    Akena, Dickens; Joska, John; Obuku, Ekwaro A; Amos, Taryn; Musisi, Seggane; Stein, Dan J

    2012-11-01

    Given the high prevalence of depression in primary health care (PHC), the use of screening instruments has been recommended. Both brief and long depression screening instruments have been validated in low and middle income countries (LMIC), including within HIV care settings. However, it remains unknown whether the brief instruments validated in LMIC are as accurate as the long ones. We conducted a search of PUBMED, the COCHRANE library, AIDSLINE, and PSYCH-Info from their inception up to July 2011 for studies that validated depression screening instruments in LMIC. Data were extracted into tables and analyzed using RevMan 5.0 and STATA 11.2 for the presence of heterogeneity. Nineteen studies met our inclusion criteria. The reported prevalence of depression in LMIC ranged from 11.1% to 53%. The area under the curve (AUC) scores of the validated instruments ranged from 0.69 to 0.99. Brief as well as long screening instruments showed acceptable accuracy (AUC ≥ 0.7). Five of the 19 instruments were validated within HIV settings. There was statistically significant heterogeneity between the studies (chi-squared = 189.23, d.f. = 18, p < 0.001), and hence a meta-analysis could not be completed. Brief depression screening instruments in both general and HIV-PHC are as accurate as the long ones. Brief scales may have an edge over the longer instruments since they can be administered in a much shorter time. However, because the ultra-brief scales do not include the whole spectrum of depression symptoms, including suicide, their use should be followed by a detailed diagnostic interview.

  6. Validation of the Social Security Administration Life Tables (2004-2014) in Localized Prostate Cancer Patients within the Surveillance, Epidemiology, and End Results database.

    PubMed

    Preisser, Felix; Bandini, Marco; Mazzone, Elio; Nazzani, Sebastiano; Marchioni, Michele; Tian, Zhe; Saad, Fred; Pompe, Raisa S; Shariat, Shahrokh F; Heinzer, Hans; Montorsi, Francesco; Huland, Hartwig; Graefen, Markus; Tilki, Derya; Karakiewicz, Pierre I

    2018-05-22

    Accurate life expectancy estimation is crucial in clinical decision-making, including the management and treatment of clinically localized prostate cancer (PCa). We hypothesized that survival estimates derived from the Social Security Administration (SSA) life tables closely follow the observed survival of PCa patients. To test this relationship, we examined 10-yr overall survival rates in patients with clinically localized PCa and compared them with survival estimates derived from the SSA life tables. Within the Surveillance, Epidemiology, and End Results database (2004), we identified patients aged >50 and <90 yr. Follow-up was at least 10 yr for patients who did not die of disease or other causes. A Monte Carlo method was used to define individual survival in years, according to the SSA life tables (2004-2014). Subsequently, the SSA life tables' predicted survival was compared with observed survival rates in Kaplan-Meier analyses. Subgroup analyses were stratified according to treatment type and D'Amico risk classification. Overall, 39,191 patients with localized PCa were identified. At 10-yr follow-up, the SSA life tables' predicted survival was 69.5% versus 73.1% according to the observed rate (p<0.0001). The largest differences between estimated and observed survival rates were recorded for D'Amico low-risk PCa (8.0%), brachytherapy (9.1%), and radical prostatectomy (8.6%) patients. Conversely, the smallest differences were recorded for external beam radiotherapy (1.7%) and unknown treatment type (1.6%) patients. Overall, the SSA life tables' predicted life expectancy closely approximates observed overall survival rates. However, the SSA life tables' predicted rates underestimate survival by as much as 9.1% in brachytherapy patients, as well as in D'Amico low-risk and radical prostatectomy patients. In these patient categories, an adjustment for the degree of underestimation might be required when counseling is provided in clinical practice.
    Copyright © 2018 European Association of Urology. Published by Elsevier B.V. All rights reserved.
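    The Monte Carlo step described above, drawing an individual's survival time from life-table death probabilities, can be sketched as follows. The table values, starting age, and seeds below are illustrative round numbers, not SSA figures.

```python
import random

def draw_survival_years(age, annual_death_prob, rng):
    """Draw one person's remaining years of life by stepping through a
    life table: each simulated year the person dies with the table's
    probability q(age); surviving past the table's last age ends the walk."""
    years = 0
    while age in annual_death_prob:
        if rng.random() < annual_death_prob[age]:
            return years
        age += 1
        years += 1
    return years

# Toy life table (illustrative only): death probability rises with age
q = {a: 0.01 + 0.002 * (a - 60) for a in range(60, 100)}
draws = [draw_survival_years(65, q, random.Random(seed)) for seed in range(1000)]
mean_years = sum(draws) / len(draws)
```

    Repeating the draw for each patient yields a predicted survival curve that can then be compared against the observed Kaplan-Meier curve, as in the study.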

  7. Sleep Patterns, Mood, Psychomotor Vigilance Performance, and Command Resilience of Watchstanders on the Five and Dime Watchbill

    DTIC Science & Technology

    2015-02-28

    [Report front-matter and table-of-contents residue; recoverable details: report date 28-02-2015; Table 7, comparison between normal and elevated ESS groups; Table 8, Organizational Commitment, Psychological Safety, and…]

  8. Development and Validation of a Simplified Renal Replacement Therapy Suitable for Prolonged Field Care in a Porcine (Sus scrofa) Model of Acute Kidney Injury

    DTIC Science & Technology

    2018-03-01

    [Front-matter residue; the abstract excerpt begins: "Objectives/Background: Acute kidney injury (AKI) is a serious…"]

  9. Home-Career Conflict Reduction Revisited: The Effect of Experimental Directions on KOIS Scores for Women

    ERIC Educational Resources Information Center

    Tittle, Carol Kehr; Denker, Elenor Rubin

    1976-01-01

    The effect of experimental directions designed to reduce a home-career conflict in women's occupational choices on the Kuder Occupational Interest Survey was investigated. Results indicate that the validity of the Kuder survey is not compromised by the experimental directions. Tables are presented and implications are discussed. (Author/JKS)

  10. Text of the U.S. Court of Appeals Opinion in the Spirt Equal-Pension Case.

    ERIC Educational Resources Information Center

    Newman, Jon O.; And Others

    1984-01-01

    The opinion of a three-judge panel in a court case involving the validity of gender-based mortality tables and the right of women to receive equal pensions from the Teachers Insurance Annuities Association and the College Retirement Equities Fund is presented, including references to the earlier, related Norris case. (MSE)

  11. Fragmentation Point Detection of JPEG Images at DHT Using Validator

    NASA Astrophysics Data System (ADS)

    Mohamad, Kamaruddin Malik; Deris, Mustafa Mat

    File carving is an important, practical technique for data recovery in digital forensics investigation and is particularly useful when filesystem metadata is unavailable or damaged. Research on the reassembly of JPEG files with RST markers that are fragmented within the scan area has been done before. However, fragmentation within the Define Huffman Table (DHT) segment is yet to be resolved. This paper analyzes fragmentation within the DHT area and lists all the fragmentation possibilities. Two main contributions are made in this paper. Firstly, three fragmentation points within the DHT area are listed. Secondly, a few novel validators are proposed to detect these fragmentations. The results obtained from tests on manually fragmented JPEG files show that all three fragmentation points within the DHT are successfully detected using the validators.
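    A validator of the kind proposed can be illustrated by a minimal structural check of a DHT segment. This sketch is not the authors' algorithm, only an assumed example of what such a validator might test: the marker bytes (0xFFC4), the declared segment length, and whether each table's 16 code-length counts account for exactly the symbol bytes that follow.

```python
def validate_dht(segment: bytes) -> bool:
    """Minimal structural validator for a JPEG Define Huffman Table
    segment: checks the DHT marker (0xFFC4), the declared length, and
    that each table's 16 code-length counts match the symbol bytes."""
    if len(segment) < 4 or segment[0] != 0xFF or segment[1] != 0xC4:
        return False
    declared = (segment[2] << 8) | segment[3]  # length field counts its own 2 bytes
    if declared + 2 != len(segment):
        return False
    pos = 4
    while pos < len(segment):
        if pos + 17 > len(segment):
            return False                       # truncated table header
        table_class_id = segment[pos]
        if table_class_id >> 4 > 1 or table_class_id & 0x0F > 3:
            return False                       # invalid class or table ID
        n_symbols = sum(segment[pos + 1:pos + 17])
        pos += 17 + n_symbols
    return pos == len(segment)

# A tiny, well-formed DHT with one code of length 1 mapping to symbol 0x03
counts = bytes([1] + [0] * 15)
body = bytes([0x00]) + counts + bytes([0x03])
seg = bytes([0xFF, 0xC4]) + (len(body) + 2).to_bytes(2, "big") + body
```

    A carver can run such a check at a suspected fragmentation point: a failed validation flags the offset as a candidate fragment boundary.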

  12. Reliability and criterion validity of measurements using a smart phone-based measurement tool for the transverse rotation angle of the pelvis during single-leg lifting.

    PubMed

    Jung, Sung-Hoon; Kwon, Oh-Yun; Jeon, In-Cheol; Hwang, Ui-Jae; Weon, Jong-Hyuck

    2018-01-01

    The purposes of this study were to determine the intra-rater test-retest reliability of a smart phone-based measurement tool (SBMT) and of a three-dimensional (3D) motion analysis system (3DMAS) for measuring the transverse rotation angle of the pelvis during single-leg lifting (SLL), and to determine the criterion validity of the SBMT measurement compared with the 3DMAS. Seventeen healthy volunteers performed SLL with their dominant leg, without bending the knee, until they reached a target placed 20 cm above the table. This study used the 3DMAS, considered the gold standard, to measure the transverse rotation angle of the pelvis and assess the criterion validity of the SBMT measurement. Intra-rater test-retest reliability was determined for both the SBMT and the 3DMAS using intra-class correlation coefficient (ICC) [3,1] values, and the criterion validity of the SBMT was likewise assessed with ICC [3,1] values. Both the 3DMAS (ICC = 0.77) and the SBMT (ICC = 0.83) showed excellent intra-rater test-retest reliability in the measurement of the transverse rotation angle of the pelvis during SLL in a supine position. Moreover, the SBMT showed an excellent correlation with the 3DMAS (ICC = 0.99). Measurement of the transverse rotation angle of the pelvis using the SBMT showed excellent reliability and criterion validity compared with the 3DMAS.

  13. A film set for the elicitation of emotion in research: A comprehensive catalog derived from four decades of investigation.

    PubMed

    Gilman, T Lee; Shaheen, Razan; Nylocks, K Maria; Halachoff, Danielle; Chapman, Jessica; Flynn, Jessica J; Matt, Lindsey M; Coifman, Karin G

    2017-12-01

    Emotions are highly influential to many psychological processes. Indeed, research employing emotional stimuli is rapidly escalating across the field of psychology. However, challenges remain regarding discrete evocation of frequently co-elicited emotions such as amusement and happiness, or anger and disgust. Further, as much contemporary work in emotion employs college students, we sought to additionally evaluate the efficacy of film clips to discretely elicit these more challenging emotions in a young adult population using an online medium. The internet is an important tool for investigating responses to emotional stimuli, but validations of emotionally evocative film clips across laboratory and web-based settings are limited in the literature. An additional obstacle is identifying stimuli amidst the numerous film clip validation studies. During our investigation, we recognized the lack of a categorical database to facilitate rapid identification of useful film clips for individual researchers' unique investigations. Consequently, here we also sought to produce the first compilation of such stimuli into an accessible and comprehensive catalog. We based our catalog upon prior work as well as our own, and identified 24 articles and 295 film clips from four decades of research. We present information on the validation of these clips in addition to our own research validating six clips using online administration settings. The results of our search in the literature and our own study are presented in tables designed to facilitate and improve a selection of highly valid film stimuli for future research.

  14. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involved (a) generation of a property table using REFPROP, a widely used thermodynamic property program, and (b) modification of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-defined fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
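    The saturation-region calculation described, obtaining properties from pressure and enthalpy via a look-up table, can be sketched as follows. GFSSP itself is Fortran, and the table rows here are hypothetical round numbers for water, not REFPROP output.

```python
import bisect

# Hypothetical saturation rows: (pressure [kPa], h_f, h_g) in kJ/kg
SAT_TABLE = [
    (100.0, 417.5, 2675.0),
    (200.0, 504.7, 2706.2),
    (500.0, 640.1, 2748.1),
]

def quality_from_p_h(p, h):
    """Linearly interpolate the saturated liquid/vapor enthalpies at
    pressure p, then return the quality x = (h - h_f) / (h_g - h_f).
    x in [0, 1] indicates a two-phase state; outside that range the
    state is actually sub-cooled liquid (x < 0) or superheated vapor
    (x > 1), so a different table would apply."""
    pressures = [row[0] for row in SAT_TABLE]
    if not pressures[0] <= p <= pressures[-1]:
        raise ValueError("pressure outside table range")
    i = max(bisect.bisect_left(pressures, p), 1)
    (p0, hf0, hg0), (p1, hf1, hg1) = SAT_TABLE[i - 1], SAT_TABLE[i]
    frac = (p - p0) / (p1 - p0)
    h_f = hf0 + frac * (hf1 - hf0)
    h_g = hg0 + frac * (hg1 - hg0)
    return (h - h_f) / (h_g - h_f)

# Midway between the 100 and 200 kPa rows, halfway between h_f and h_g
x_mid = quality_from_p_h(150.0, 1575.85)
```

    Once the quality is known, any other saturation property (density, entropy) follows by the same liquid/vapor weighting.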

  15. Reporting to Improve Reproducibility and Facilitate Validity Assessment for Healthcare Database Studies V1.0.

    PubMed

    Wang, Shirley V; Schneeweiss, Sebastian; Berger, Marc L; Brown, Jeffrey; de Vries, Frank; Douglas, Ian; Gagne, Joshua J; Gini, Rosa; Klungel, Olaf; Mullins, C Daniel; Nguyen, Michael D; Rassen, Jeremy A; Smeeth, Liam; Sturkenboom, Miriam

    2017-09-01

    Defining a study population and creating an analytic dataset from longitudinal healthcare databases involves many decisions. Our objective was to catalogue scientific decisions underpinning study execution that should be reported to facilitate replication and enable assessment of validity of studies conducted in large healthcare databases. We reviewed key investigator decisions required to operate a sample of macros and software tools designed to create and analyze analytic cohorts from longitudinal streams of healthcare data. A panel of academic, regulatory, and industry experts in healthcare database analytics discussed and added to this list. Evidence generated from large healthcare encounter and reimbursement databases is increasingly being sought by decision-makers. Varied terminology is used around the world for the same concepts. Agreeing on terminology and which parameters from a large catalogue are the most essential to report for replicable research would improve transparency and facilitate assessment of validity. At a minimum, reporting for a database study should provide clarity regarding operational definitions for key temporal anchors and their relation to each other when creating the analytic dataset, accompanied by an attrition table and a design diagram. A substantial improvement in reproducibility, rigor and confidence in real world evidence generated from healthcare databases could be achieved with greater transparency about operational study parameters used to create analytic datasets from longitudinal healthcare databases. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.

  16. Validating Biomarkers for PTSD

    DTIC Science & Technology

    2015-04-01

    Recall Participants by Site (rows: recruitment site and procedure; columns: Q1, Q2, Q3, Q4, Year 1 total):
    NYUMC: BCI* 6, 5, 4, 6 (21); blood draw 0, 8, 4, 4 (16); self-report 0, 7, 4, 4 (15); brain imaging 1, 8, 5, 0 (14); NCT** 0, 7, 4, 4 (15).
    JJPVAMC/MMSM: BCI* 2, 4, 6, 3 (15); blood draw 1, 4, 4, 2 (11); self-report 2, 2, 5, 1 (10); brain imaging 1, 2, 2, 0 (5); NCT** 0, 4, 5, 1 (10).
    *BCI = Baseline Clinical Interview; **NCT = Neurocognitive Testing. (Table 3: Completed Procedures for Validating Biomarkers…)

  17. The exact analysis of contingency tables in medical research.

    PubMed

    Mehta, C R

    1994-01-01

    A unified view of exact nonparametric inference, with special emphasis on data in the form of contingency tables, is presented. While the concept of exact tests has been in existence since the early work of RA Fisher, the computational complexity involved in actually executing such tests precluded their use until fairly recently. Modern algorithmic advances, combined with the easy availability of inexpensive computing power, have renewed interest in exact methods of inference, especially because they remain valid in the face of small, sparse, imbalanced, or heavily tied data. After defining exact p-values in terms of the permutation principle, we reference algorithms for computing them. Several data sets are then analysed by both exact and asymptotic methods. We end with a discussion of the available software.
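    As a concrete instance of the exact methods discussed, Fisher's exact test for a 2x2 contingency table can be computed directly from the hypergeometric distribution of tables with fixed margins. This is a textbook sketch, not one of the specialized algorithms referenced in the article.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more probable than the observed table."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def prob(x):  # P(top-left cell = x) with all margins fixed
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Small tolerance so equally probable tables count as ties
    return sum(p for p in (prob(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-12))

# Fisher's classic "lady tasting tea" table: [[3, 1], [1, 3]]
p_value = fisher_exact_2x2(3, 1, 1, 3)
```

    Enumerating all tables with fixed margins is exactly why such tests remain valid for small or sparse data where asymptotic chi-squared approximations break down.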

  18. Modelling methane fluxes from managed and restored peatlands

    NASA Astrophysics Data System (ADS)

    Cresto Aleina, F.; Rasche, L.; Hermans, R.; Subke, J. A.; Schneider, U. A.; Brovkin, V.

    2015-12-01

    European peatlands have been extensively managed over past centuries. Typical management activities consisted of drainage and afforestation, which led to considerable damage to the peat and potentially significant carbon loss. Recent efforts to restore previously managed peatlands have been carried out throughout Europe. These restoration efforts have direct implications for water table depth and greenhouse gas emissions, thus impacting the ecosystem services provided by peatland areas. In order to quantify the impact of peatland restoration on water table depth and the greenhouse gas budget, we coupled the Environmental Policy Integrated Climate (EPIC) model to a process-based model for methane emissions (Walter and Heimann, 2000). The new model (EPIC-M) can potentially be applied at the European and even the global scale, but it has yet to be tested and evaluated. We present results of this new tool from different peatlands in the Flow Country, Scotland. Large parts of the peatlands of the region were drained and afforested during the 1980s, but since the late 1990s, programs to restore peatlands in the Flow Country have been implemented. This region therefore offers a range of peatlands, from near pristine to afforested and drained, with different restoration ages in between, where we can apply the EPIC-M model and validate it against experimental data from all stages of restoration. The goals of this study are, first, to evaluate the EPIC-M model and its performance against in situ measurements of methane emissions and water table changes in drained and restored peatlands, and second, to study the environmental impact of peatland restoration, including methane emissions, due to the rewetting of drained surfaces. To do so, we forced the EPIC-M model with local meteorological and soil data, and simulated soil temperatures, water table dynamics, and greenhouse gas emissions.
This is the first step towards a European-wide application of the EPIC-M model for the assessment of the environmental impact of peatland restoration.

  19. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.

  20. Modeling the potential impacts of climate change on the water table level of selected forested wetlands in the southeastern United States

    NASA Astrophysics Data System (ADS)

    Zhu, Jie; Sun, Ge; Li, Wenhong; Zhang, Yu; Miao, Guofang; Noormets, Asko; McNulty, Steve G.; King, John S.; Kumar, Mukesh; Wang, Xuan

    2017-12-01

    The southeastern United States hosts extensive forested wetlands, providing ecosystem services including carbon sequestration, water quality improvement, groundwater recharge, and wildlife habitat. However, these wetland ecosystems are dependent on local climate and hydrology, and are therefore at risk due to climate and land use change. This study develops site-specific empirical hydrologic models for five forested wetlands with different characteristics by analyzing long-term observed meteorological and hydrological data. These wetlands represent typical cypress ponds/swamps, Carolina bays, pine flatwoods, drained pocosins, and natural bottomland hardwood ecosystems. The validated empirical models are then applied at each wetland to predict future water table changes using climate projections from 20 general circulation models (GCMs) participating in Coupled Model Inter-comparison Project 5 (CMIP5) under the Representative Concentration Pathways (RCPs) 4.5 and 8.5 scenarios. We show that combined future changes in precipitation and potential evapotranspiration would significantly alter wetland hydrology including groundwater dynamics by the end of the 21st century. Compared to the historical period, all five wetlands are predicted to become drier over time. The mean water table depth is predicted to drop by 4 to 22 cm in response to the decrease in water availability (i.e., precipitation minus potential evapotranspiration) by the year 2100. Among the five examined wetlands, the depressional wetland in hot and humid Florida appears to be most vulnerable to future climate change. This study provides quantitative information on the potential magnitude of wetland hydrological response to future climate change in typical forested wetlands in the southeastern US.

  1. Applying vegetation indices to detect high water table zones in humid warm-temperate regions using satellite remote sensing

    NASA Astrophysics Data System (ADS)

    Koide, Kaoru; Koike, Katsuaki

    2012-10-01

    This study developed a geobotanical remote sensing method for detecting high water table zones using differences in the conditions of forest trees induced by groundwater supply in a humid warm-temperate region. A new vegetation index (VI), termed added green band NDVI (AgbNDVI), was proposed to discriminate these differences. The AgbNDVI proved to be more sensitive to water stress on green vegetation than existing VIs, such as SAVI and EVI2, and possessed a strong linear correlation with the vegetation fraction. To validate the proposed vegetation index method, a 23 km2 study area was selected in the Tono region of Gifu prefecture, central Japan. The AgbNDVI values were calculated from atmospherically corrected SPOT HRV data. To correctly extract high VI points, the influence factors on forest tree growth were identified using the AgbNDVI values, DEM, and forest type data; the study area was then divided into 555 domains chosen from combinations of the influence factors and forest types. Thresholds for extracting high VI points were defined for each domain based on histograms of AgbNDVI values. By superimposing the high VI points on topographic and geologic maps, most high VI points are clearly located on either concave or convex slopes, and are found to be proximal to geologic boundaries, particularly the boundary between the Pliocene gravel layer and the Cretaceous granite, which should act as a groundwater flow path. In addition, field investigations support the correctness of the high VI points, because they are located around groundwater seeps and in high water table zones where the growth increments and biomass of trees are greater than at low VI points.
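    The per-domain threshold extraction described above can be sketched as follows. Since the exact AgbNDVI formula is not reproduced here, the sketch uses plain NDVI, and the 90th-percentile cutoff is an assumed stand-in for the study's histogram-derived thresholds.

```python
import numpy as np

def ndvi(nir, red):
    """Plain NDVI; the study's AgbNDVI adds the green band, but its
    exact formula is not given here, so this sketch uses NDVI."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def high_vi_mask(vi, domain_ids, pct=90):
    """Flag 'high VI' pixels separately within each domain using a
    percentile threshold, mirroring the study's per-domain histogram
    thresholds (the 90th percentile is an illustrative choice)."""
    vi, domain_ids = np.asarray(vi), np.asarray(domain_ids)
    mask = np.zeros(vi.shape, dtype=bool)
    for d in np.unique(domain_ids):
        sel = domain_ids == d
        mask[sel] = vi[sel] >= np.percentile(vi[sel], pct)
    return mask

# Four hypothetical pixels in two domains
vi = ndvi([0.5, 0.4, 0.6, 0.3], [0.1, 0.2, 0.1, 0.2])
mask = high_vi_mask(vi, [0, 0, 1, 1])
```

    Thresholding within each domain, rather than globally, keeps the comparison fair across forest types and terrain classes, which is the point of the 555-domain stratification.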

  2. Sensor Technology Baseline Study for Enabling Condition Based Maintenance Plus in Army Ground Vehicles

    DTIC Science & Technology

    2012-03-01

    The sensor study for enabling condition based maintenance plus in Army ground vehicles was driven from Failure Mode Effects Analysis (FMEA). [Table-of-contents residue: Table 1, sensor technology baseline study based on engine FMEA report; Table 2, based on transmission FMEA report; Table 3, based on alternator FMEA report.]

  3. IRIS Toxicological Review of Ethyl Tertiary Butyl Ether (Etbe) ...

    EPA Pesticide Factsheets

    In August 2013, EPA released the draft literature searches and associated search strategies, evidence tables, and exposure response arrays for ETBE to obtain input from stakeholders and the public prior to developing the draft IRIS assessment. Specifically, EPA was interested in comments on the following:
    Draft literature search strategies: the approach for identifying studies, the screening process for selecting pertinent studies, and the resulting list of pertinent studies.
    Preliminary evidence tables: the process for selecting studies to include in evidence tables, and the quality of the studies in the evidence tables.
    The literature search strategy, which describes the processes for identifying scientific literature, contains the studies that EPA considered and selected to include in the evidence tables. The preliminary evidence tables and exposure-response arrays present the key study data in a standardized format. The evidence tables summarize the available critical scientific literature. The exposure-response figures provide a graphical representation of the responses at different levels of exposure for each study in the evidence table. The draft Toxicological Review of Ethyl Tertiary Butyl Ether provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to ethyl tertiary butyl ether.

  4. Causal Factors and Adverse Conditions of Aviation Accidents and Incidents Related to Integrated Resilient Aircraft Control

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Briggs, Jeffrey L.; Evans, Joni K.; Sandifer, Carl E.; Jones, Sharon Monica

    2010-01-01

    The causal factors of accidents from the National Transportation Safety Board (NTSB) database and incidents from the Federal Aviation Administration (FAA) database associated with loss of control (LOC) were examined for four types of operations (i.e., Federal Aviation Regulation Part 121, Part 135 Scheduled, Part 135 Nonscheduled, and Part 91) for the years 1988 to 2004. In-flight LOC is a serious aviation problem. Well over half of the LOC accidents included at least one fatality (80 percent in Part 121), and roughly half of all aviation fatalities in the studied time period occurred in conjunction with LOC. An adverse events table was updated to provide focus to the technology validation strategy of the Integrated Resilient Aircraft Control (IRAC) Project. The table contains three types of adverse conditions: failure, damage, and upset. Thirteen different adverse condition subtypes were gleaned from the Aviation Safety Reporting System (ASRS), the FAA Accident and Incident database, and the NTSB database. The severity and frequency of the damage conditions, initial test conditions, and milestones references are also provided.

  5. Steady-state groundwater recharge in trapezoidal-shaped aquifers: A semi-analytical approach based on variational calculus

    NASA Astrophysics Data System (ADS)

    Mahdavi, Ali; Seyyedian, Hamid

    2014-05-01

    This study presents a semi-analytical solution for steady groundwater flow in trapezoidal-shaped aquifers in response to an areal diffusive recharge. The aquifer is homogeneous, anisotropic and interacts with four surrounding streams of constant-head. Flow field in this laterally bounded aquifer-system is efficiently constructed by means of variational calculus. This is accomplished by minimizing a properly defined penalty function for the associated boundary value problem. Simple yet demonstrative scenarios are defined to investigate anisotropy effects on the water table variation. Qualitative examination of the resulting equipotential contour maps and velocity vector field illustrates the validity of the method, especially in the vicinity of boundary lines. Extension to the case of triangular-shaped aquifer with or without an impervious boundary line is also demonstrated through a hypothetical example problem. The present solution benefits from an extremely simple mathematical expression and exhibits strictly close agreement with the numerical results obtained from Modflow. Overall, the solution may be used to conduct sensitivity analysis on various hydrogeological parameters that affect water table variation in aquifers defined in trapezoidal or triangular-shaped domains.

  6. Assessment of Stone Complexity for PCNL: A Systematic Review of the Literature, How Best Can We Record Stone Complexity in PCNL?

    PubMed

    Withington, John; Armitage, James; Finch, William; Wiseman, Oliver; Glass, Jonathan; Burgess, Neil

    2016-01-01

    This study aims to systematically review the literature reporting tools for scoring stone complexity and the stratification of outcomes by stone complexity. In doing so, we aim to determine whether the evidence favors uniform adoption of any one scoring system. PubMed and Embase databases were systematically searched for relevant studies from 2004 to 2014. Reports selected according to predetermined inclusion and exclusion criteria were appraised in terms of methodologic quality and their findings summarized in structured tables. After review, 15 studies were considered suitable for inclusion. Four distinct scoring systems were identified, along with a further five studies that aimed to validate aspects of those scoring systems. Six studies reported the stratification of outcomes by stone complexity without specifically defining a scoring system. All studies reported some correlation between stone complexity and stone clearance. Correlation with complications was less clearly established, where investigated. This review does not allow us to firmly recommend one scoring system over another. However, the quality of evidence supporting validation of the Guy's Stone Score is marginally superior, according to the criteria applied in this study. Further evaluation of the interobserver reliability of this scoring system is required.

  7. Long-Term Hydrologic Impacts of Controlled Drainage Using DRAINMOD

    NASA Astrophysics Data System (ADS)

    Saadat, S.; Bowling, L. C.; Frankenberger, J.

    2017-12-01

    Controlled drainage is a management strategy designed to mitigate water quality issues caused by subsurface drainage, but it may increase surface ponding and runoff. To improve controlled drainage system management, a longer-term and broader study is needed that goes beyond existing experimental studies. Therefore, the goal of this study was to parametrize the DRAINMOD field-scale hydrologic model for the Davis Purdue Agricultural Center, located in eastern Indiana, and to predict subsurface drain flow and surface runoff and ponding at this research site. The Green-Ampt equation was used to characterize infiltration, and digital elevation models (DEMs) were used to estimate the maximum depressional storage as the surface ponding parameter input to DRAINMOD. Hydraulic conductivity was estimated using the Hooghoudt equation together with the measured drain flow and water table depths. Other model inputs were either estimated or taken from measurements. The DRAINMOD model was calibrated and validated by comparing model predictions of subsurface drainage and water table depths with field observations from 2012 to 2016. Simulations based on the DRAINMOD model can increase understanding of the environmental and hydrological effects over a broader temporal and spatial scale than is possible using field-scale data, which is useful for developing management recommendations for water resources at field and watershed scales.
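    The Hooghoudt-based conductivity estimate mentioned above can be sketched by inverting the steady-state Hooghoudt drainage equation for a single soil layer. The field values below are hypothetical, not measurements from the Davis Purdue site.

```python
def hooghoudt_conductivity(q, L, m, d_e):
    """Invert the steady-state Hooghoudt drainage equation
        q = (8*K*d_e*m + 4*K*m**2) / L**2
    to estimate the hydraulic conductivity K from the measured drain
    flux q [m/day], drain spacing L [m], midpoint water table height
    above the drains m [m], and equivalent depth d_e [m]."""
    return q * L ** 2 / (8 * d_e * m + 4 * m ** 2)

# Hypothetical field values: 5 mm/day drain flux, 20 m spacing,
# 0.5 m midpoint head, 1.2 m equivalent depth
K = hooghoudt_conductivity(q=0.005, L=20.0, m=0.5, d_e=1.2)
```

    Repeating the calculation over many observed drain-flow/water-table pairs gives a field-scale conductivity estimate suitable as a DRAINMOD input.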

  8. The EOS land validation core sites: background information and current status

    USGS Publications Warehouse

    Morisette, J.; Privette, J.L.; Justice, C.; Olson, D.; Dwyer, John L.; Davis, P.; Starr, D.; Wickland, D.

    1999-01-01

    The EOS Land Validation Core Sites will provide the user community with timely ground, aircraft, and satellite data for EOS science and validation investigations. The sites, currently 24 distributed worldwide, represent a consensus among the instrument teams and validation investigators and span a range of global biome types (see Figure 1 and Table 1; Privette et al., 1999; Justice et al., 1998). The sites typically have a history of in situ and remote observations and can expect continued monitoring and land cover research activities. In many cases, a Core Site will have a tower equipped with above-canopy instrumentation for near-continuous sampling of landscape radiometric variables, energy and CO2 fluxes, meteorological variables, and atmospheric aerosol and water vapor data. These will be complemented by intensive field measurement campaigns. The data collected at these sites will provide an important resource for the broader science community. These sites can also provide a foundation for a validation network supported and used by all international space agencies.

  9. Aeroelastic Considerations in the Preliminary Design of Aircraft

    DTIC Science & Technology

    1983-09-01

    system for aeroelastic analysis FINDEX- Lockheed’s DMS for matrices and NASTRAN tables FSD- fully stressed design algorithm Lockheed- Lockheed-California...Company MLC- maneuver load control NASA- National Aeronautics and Space Administration NASTRAN - structural finite element program developed by NASA...Computer Program Validation All major computing programs (FAMAS, NASTRAN , etc.), except the weight distribution program, the panel sizing and allowable

  10. Turning the Tables: Using RateMyProfessors.com as a Teaching Tool in the Community College Classroom

    ERIC Educational Resources Information Center

    Miranda, Michael V.

    2018-01-01

    College students use online sources to obtain information about their institution's professors when preparing their schedules for the upcoming semester. While there is some disagreement among educators as to the validity of these student evaluations, students appear to find them to be quite useful. Might these evaluations, however, prove to be…

  11. Digital Flight Control System Validation.

    DTIC Science & Technology

    1982-06-01

    ...as identified in figure 1 and table 3...the structure built upon it must be...the specific system failure probability...

  12. Recent statistical methods for orientation data

    NASA Technical Reports Server (NTRS)

    Batschelet, E.

    1972-01-01

    The application of statistical methods in the study of animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.

  13. Prefabricated Roof Beams for Hardened Shelters

    DTIC Science & Technology

    1993-08-01

    beam with a composite concrete slab. Based on the results of the concept evaluation, a test program was designed and conducted to validate the steel...ultimate strength. The results of these tests showed that the design procedure accurately predicts the response of the steel-confined concrete composite...BENDING OF EXTERNALLY REINFORCED CONCRETE BEAMS ... TABLE 9. SINGLE POINT LOAD BEAM TEST RESULTS

  14. WSTF electrical arc projects

    NASA Technical Reports Server (NTRS)

    Linley, Larry

    1994-01-01

    The objectives of these projects include the following: validate method used to screen wire insulation with arc tracking characteristics; determine damage resistance to arc as a function of source voltage and insulation thickness; investigate propagation characteristics of Kapton at low voltages; and investigate pyrolytic properties of polyimide insulated (Kapton) wire for low voltage (less than 35 VDC) applications. Supporting diagrams and tables are presented.

  15. Computation of Thermally Perfect Compressible Flow Properties

    NASA Technical Reports Server (NTRS)

    Witte, David W.; Tatum, Kenneth E.; Williams, S. Blake

    1996-01-01

    A set of compressible flow relations for a thermally perfect, calorically imperfect gas is derived for a specific heat at constant pressure, c(sub p), expressed as a polynomial function of temperature, and developed into a computer program referred to as the Thermally Perfect Gas (TPG) code. The code is available free from the NASA Langley Software Server at URL http://www.larc.nasa.gov/LSS. The code produces tables of compressible flow properties similar to those found in NACA Report 1135. Unlike the NACA Report 1135 tables, which are valid only in the calorically perfect temperature regime, the TPG code results are also valid in the thermally perfect, calorically imperfect temperature regime, giving the TPG code a considerably larger range of temperature application. The accuracy of the TPG code in the calorically perfect and in the thermally perfect, calorically imperfect temperature regimes is verified by comparisons with the methods of NACA Report 1135. The advantages of the TPG code over the thermally perfect, calorically imperfect method of NACA Report 1135 are its applicability to any type of gas (monatomic, diatomic, triatomic, or polyatomic) or any specified mixture of gases, its ease of use, and its tabulated results.
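    The core of the thermally perfect treatment is letting c_p vary with temperature; a minimal sketch of the idea, with a hypothetical constant polynomial standing in for a real c_p(T) fit (the calorically perfect case is recovered when the polynomial is constant):

```python
import math

R_AIR = 287.05  # J/(kg*K), specific gas constant for air

def cp_poly(T, coeffs):
    """Evaluate c_p(T) as a polynomial in T (coefficients, lowest order first)."""
    return sum(c * T**i for i, c in enumerate(coeffs))

def gamma(T, coeffs, R=R_AIR):
    """Temperature-dependent ratio of specific heats for a thermally
    perfect gas: gamma = c_p / (c_p - R)."""
    cp = cp_poly(T, coeffs)
    return cp / (cp - R)

def speed_of_sound(T, coeffs, R=R_AIR):
    # a = sqrt(gamma(T) * R * T), with gamma evaluated at the local T
    return math.sqrt(gamma(T, coeffs) * R * T)

# A constant "polynomial" reproduces the calorically perfect air values:
const_cp = [1004.5]   # J/(kg*K) -> gamma very close to 1.4
a_300 = speed_of_sound(300.0, const_cp)
```

    A real application would use a published c_p(T) curve fit for the gas or mixture of interest rather than the constant stand-in above.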

  16. Validation of Volcanic Ash Forecasting Performed by the Washington Volcanic Ash Advisory Center

    NASA Astrophysics Data System (ADS)

    Salemi, A.; Hanna, J.

    2009-12-01

    In support of NOAA’s mission to protect life and property, the Satellite Analysis Branch (SAB) uses satellite imagery to monitor volcanic eruptions and track volcanic ash. The Washington Volcanic Ash Advisory Center (VAAC) was established in late 1997 through an agreement with the International Civil Aviation Organization (ICAO). A volcanic ash advisory (VAA) is issued every 6 hours while an eruption is occurring. Information about the current location and height of the volcanic ash, as well as any pertinent meteorological information, is contained within the VAA. In addition, when ash is detected in satellite imagery, 6-, 12- and 18-hour forecasts of ash height and location are provided. This information is garnered from many sources, including Meteorological Watch Offices (MWOs), pilot reports (PIREPs), model forecast winds, radiosondes and volcano observatories. The Washington VAAC has performed a validation of its 6-, 12- and 18-hour airborne volcanic ash forecasts issued since October 2007. The volcanic ash forecasts are viewed dichotomously (yes/no), with the frequency of yes and no events placed into a contingency table. A large variety of categorical statistics useful in describing forecast performance are then computed from the resulting contingency table.
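    Dichotomous (yes/no) verification of this kind reduces to a 2x2 contingency table of hits, false alarms, misses and correct negatives. A minimal sketch of the standard categorical scores, using illustrative counts rather than the VAAC's actual verification data:

```python
def categorical_scores(hits, false_alarms, misses, correct_negatives):
    """Common categorical verification scores from a 2x2 (yes/no)
    contingency table of forecast vs. observed ash."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    n = hits + false_alarms + misses + correct_negatives
    accuracy = (hits + correct_negatives) / n
    return {"POD": pod, "FAR": far, "CSI": csi, "accuracy": accuracy}

# Illustrative counts only:
scores = categorical_scores(hits=40, false_alarms=10, misses=5,
                            correct_negatives=45)
```

    Many further scores (bias, Heidke skill score, etc.) are simple functions of the same four cell counts.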

  17. A Study of Transparent Plastics for use on Aircraft. Supplement

    NASA Technical Reports Server (NTRS)

    Axilrod, Benjamin M.; Kline, Gordon M.

    1937-01-01

    This supplement to a NACA study issued in May 1937 entitled "A Study of Transparent Plastics for Use on Aircraft", contains two tables. These tables contain data on bursting strengths of plastics, particularly at low temperatures. Table 1 contains the values reported in a table of the original memorandum, and additional values obtained at approximately 25 C, for three samples of Acrylate resin. The second table contains data obtained for the bursting strength when one surface of the plastic was cooled to approximately -35 C.

  18. Pupil Size in Outdoor Environments

    DTIC Science & Technology

    2007-04-06

    ...studies. Table 3: Descriptive statistics for pupils measured over luminance range. Table 4: N in each strata for all pupil measurements. Table 5: Descriptive statistics stratified against eye color. Table 6: Descriptive statistics stratified against gender. Table 7: Descriptive...

  19. QSAR modeling based on structure-information for properties of interest in human health.

    PubMed

    Hall, L H; Hall, L M

    2005-01-01

    The development of QSAR models based on topological structure description is presented for problems in human health. These models are based on the structure-information approach to quantitative biological modeling and prediction, in contrast to the mechanism-based approach. The structure-information approach is outlined, starting with basic structure information developed from the chemical graph (connection table). Information explicit in the connection table (element identity and skeletal connections) leads to significant (implicit) structure information that is useful for establishing sound models of a wide range of properties of interest in drug design. Valence state definition leads to relationships for valence state electronegativity and atom/group molar volume. Based on these important aspects of molecules, together with skeletal branching patterns, both the electrotopological state (E-state) and molecular connectivity (chi indices) structure descriptors are developed and described. A summary of four QSAR models indicates the wide range of applicability of these structure descriptors and the predictive quality of QSAR models based on them: aqueous solubility (5535 chemically diverse compounds, 938 in external validation), percent oral absorption (%OA, 417 therapeutic drugs, 195 drugs in external validation testing), Ames mutagenicity (2963 compounds including 290 therapeutic drugs, 400 in external validation), and fish toxicity (92 substituted phenols, anilines and substituted aromatics). These models are established independent of explicit three-dimensional (3-D) structure information and are directly interpretable in terms of the implicit structure information useful to the drug design process.

  20. Heavy metals and mineral elements not included on the nutritional labels in table olives.

    PubMed

    López-López, Antonio; López, Rafael; Madrid, Fernando; Garrido-Fernández, Antonio

    2008-10-22

    The average contents, in mg/kg edible portion (e.p.), of elements not considered for nutritional labeling in Spanish table olives were as follows: aluminum, 71.1; boron, 4.41; barium, 2.77; cadmium, 0.04; cobalt, 0.12; chromium, 0.19; lithium, 6.56; nickel, 0.15; lead, 0.15; sulfur, 321; tin, 18.4; strontium, 9.71; and zirconium, 0.04. Sulfur was the most abundant element in table olives, followed by aluminum and tin (the latter related to green olives). There were significant differences between elaboration styles, except for aluminum, tin, and sulfur. Ripe olives had significantly higher concentrations (mg/kg e.p.) of boron (5.32), barium (3.91), cadmium (0.065), cobalt (0.190), chromium (0.256), lithium (10.01), nickel (0.220), and strontium (10.21), but the levels of tin (25.55) and zirconium (0.039) were higher in green olives. The content of contaminants (cadmium, nickel, and tin) was always below the maximum limits legally established. The discriminant analysis led to an overall 86% correct classification of cases (80% after cross-validation).

  1. An Analysis of Class II Supplies Requisitions in the Korean Army’s Organizational Supply

    DTIC Science & Technology

    2009-03-26

    ...five methods for qualitative research: case study, ethnography, phenomenological study, grounded theory, and content analysis... Table 9: Five Qualitative Research Methods... Table 10: Six... Table 9 provides a brief overview of the five methods.

  2. Development and analysis of SCR requirements tables for system scenarios

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Morrison, Jeffery L.

    1995-01-01

    We describe the use of scenarios to develop and refine requirements tables for parts of the Earth Observing System Data and Information System (EOSDIS). The National Aeronautics and Space Administration (NASA) is developing EOSDIS as part of its Mission-To-Planet-Earth (MTPE) project to accept instrument/platform observation requests from end-user scientists, schedule and perform requested observations of the Earth from space, collect and process the observed data, and distribute data to scientists and archives. Current requirements for the system are managed with tools that allow developers to trace the relationships between requirements and other development artifacts, including other requirements. In addition, the user community (e.g., earth and atmospheric scientists), in conjunction with NASA, has generated scenarios describing the actions of EOSDIS subsystems in response to user requests and other system activities. As part of a research effort in verification and validation techniques, this paper describes our efforts to develop requirements tables from these scenarios for the EOSDIS Core System (ECS). The tables specify event-driven mode transitions based on techniques developed by the Naval Research Laboratory's (NRL) Software Cost Reduction (SCR) project. The SCR approach has proven effective in specifying requirements for large systems in an unambiguous, terse format that enhances the identification of incomplete and inconsistent requirements. We describe the development of SCR tables from user scenarios and identify the strengths and weaknesses of our approach in contrast to the requirements tracing approach. We also evaluate the capability of both approaches to respond to the volatility of requirements in large, complex systems.
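    An SCR-style mode transition table can be sketched as a lookup from (mode, event) pairs to successor modes, where a missing entry surfaces an incomplete requirement. The modes and events below are hypothetical placeholders, not the actual ECS requirements:

```python
# Each row: in a given mode, an event (a condition becoming true,
# written @T(condition) in SCR notation) moves the system to a new mode.
TRANSITIONS = {
    ("Standby",    "@T(request_received)"):  "Scheduling",
    ("Scheduling", "@T(observation_start)"): "Observing",
    ("Observing",  "@T(data_downlinked)"):   "Processing",
    ("Processing", "@T(archive_complete)"):  "Standby",
}

def next_mode(mode, event):
    """Look up the successor mode; undefined (mode, event) pairs expose
    incomplete requirements, which is a key benefit of the SCR format."""
    try:
        return TRANSITIONS[(mode, event)]
    except KeyError:
        raise ValueError(f"incomplete table: no transition for {mode!r} on {event!r}")

m = next_mode("Standby", "@T(request_received)")   # -> "Scheduling"
```

    Consistency checks (e.g., no two rows mapping the same pair to different modes) fall out naturally from the tabular form.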

  3. IRIS Toxicological Review of Hexahydro-1,3,5-trinitro-1,3,5 ...

    EPA Pesticide Factsheets

    In August 2013, EPA released the draft literature searches and associated search strategies, evidence tables, and exposure-response arrays for RDX to obtain input from stakeholders and the public prior to developing the draft IRIS assessment. Specifically, EPA was interested in comments on the draft literature search strategies (the approach for identifying studies, the screening process for selecting pertinent studies, and the resulting list of pertinent studies) and on the preliminary evidence tables (the process for selecting studies to include in evidence tables and the quality of the studies in the evidence tables). The literature search strategy, which describes the processes for identifying scientific literature, contains the studies that EPA considered and selected to include in the evidence tables. The preliminary evidence tables and exposure-response arrays present the key study data in a standardized format. The evidence tables summarize the available critical scientific literature. The exposure-response figures provide a graphical representation of the responses at different levels of exposure for each study in the evidence table. The U.S. EPA is conducting a new health assessment of RDX that will appear on the Agency's online database, the Integrated Risk Information System (IRIS). IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result

  4. IRIS Toxicological Review of tert-Butyl Alcohol (tert-Butanol) ...

    EPA Pesticide Factsheets

    In August 2013, EPA released the draft literature searches and associated search strategies, evidence tables, and exposure-response arrays for TBA to obtain input from stakeholders and the public prior to developing the draft IRIS assessment. Specifically, EPA was interested in comments on the draft literature search strategies (the approach for identifying studies, the screening process for selecting pertinent studies, and the resulting list of pertinent studies) and on the preliminary evidence tables (the process for selecting studies to include in evidence tables and the quality of the studies in the evidence tables). The literature search strategy, which describes the processes for identifying scientific literature, contains the studies that EPA considered and selected to include in the evidence tables. The preliminary evidence tables and exposure-response arrays present the key study data in a standardized format. The evidence tables summarize the available critical scientific literature. The exposure-response figures provide a graphical representation of the responses at different levels of exposure for each study in the evidence table. EPA is undertaking a new health assessment for t-butyl alcohol (TBA) for the Integrated Risk Information System (IRIS). The outcome of this project will be a Toxicological Review and IRIS Summary of TBA that will be entered on the IRIS database. IRIS is an EPA da

  5. VizieR Online Data Catalog: Monochromatic conversion factors to LIR & Mdust (Schreiber+, 2018)

    NASA Astrophysics Data System (ADS)

    Schreiber, C.; Elbaz, D.; Pannella, M.; Wang, T.; Ciesla, L.; Franco, M.

    2017-10-01

    These tables contain conversion factors to translate observed fluxes (Sν) or luminosities (ν*Lν) into total infrared luminosity (LIR) and dust mass (Mdust). The conversion factors are provided for the most commonly used ALMA bands (Band 3 to Band 9) and all JWST MIRI broad bands (F777W to F2550W). These factors are tabulated as a function of redshift. For each conversion factor, the tables also provide the logarithmic uncertainty on the conversion (in dex), which reflects the diversity in spectral shape. These data were calibrated on the deep Spitzer and Herschel observations of the CANDELS fields, as well as early ALMA observations. They are therefore valid for galaxies with masses close to 10^10 M⊙ and above. (3 data files).

  6. A shallow water table fluctuation model in response to precipitation with consideration of unsaturated gravitational flow

    NASA Astrophysics Data System (ADS)

    Park, E.; Jeong, J.

    2017-12-01

    A precise estimation of groundwater fluctuation is studied by considering delayed recharge flux (DRF) and unsaturated zone drainage (UZD). Both DRF and UZD are due to gravitational flow impeded in the unsaturated zone, which may non-negligibly affect groundwater level changes. For validation, the new model is benchmarked against a previous model that does not consider unsaturated flow, with the actual groundwater level and precipitation data divided into three periods based on climatic conditions. The estimation capability of the new model is superior to that of the benchmarked model, as indicated by a significantly improved representation of the groundwater level with physically interpretable model parameters.

  7. Developing the Polish Educational Needs Assessment Tool (Pol-ENAT) in rheumatoid arthritis and systemic sclerosis: a cross-cultural validation study using Rasch analysis.

    PubMed

    Sierakowska, Matylda; Sierakowski, Stanisław; Sierakowska, Justyna; Horton, Mike; Ndosi, Mwidimi

    2015-03-01

    To undertake cross-cultural adaptation and validation of the educational needs assessment tool (ENAT) for use with people with rheumatoid arthritis (RA) and systemic sclerosis (SSc) in Poland. The study involved two main phases: (1) cross-cultural adaptation of the ENAT from English into Polish and (2) cross-cultural validation of the Polish Educational Needs Assessment Tool (Pol-ENAT). The first phase followed an established process of cross-cultural adaptation of self-report measures. The second phase involved completion of the Pol-ENAT by patients and subjecting the data to Rasch analysis to assess the construct validity, unidimensionality, internal consistency and cross-cultural invariance. An adequate conceptual equivalence was achieved following the adaptation process. The dataset for validation comprised a total of 278 patients, 237 (85.3%) of whom were female. In each disease group (145 RA and 133 SSc), the 7 domains of the Pol-ENAT were found to fit the Rasch model: χ²(df = 14) = 16.953, p = 0.259 and χ²(df = 14) = 8.132, p = 0.882 for RA and SSc, respectively. Internal consistency of the Pol-ENAT was high (patient separation index = 0.85 and 0.89 for SSc and RA, respectively), and unidimensionality was confirmed. Cross-cultural differential item functioning (DIF) was detected in some subscales, and DIF-adjusted conversion tables were calibrated to enable cross-cultural comparison of data between Poland and the UK. Using a standard process of cross-cultural adaptation, conceptual equivalence was achieved between the original (UK) ENAT and the adapted Pol-ENAT. Fit to the Rasch model confirmed that the construct validity, unidimensionality and internal consistency of the ENAT have been preserved.

  8. Comparing technical proficiency of elite table tennis players with intellectual disability: simulation testing versus game play.

    PubMed

    Van Biesen, Debbie; Mactavish, Jennifer J; Vanlandewijck, Yves C

    2014-04-01

    Technical skill proficiency among elite table tennis players with intellectual disabilities (ID) was investigated in this study using two approaches: an off-court simulation testing protocol and an on-court, standardized observational framework during game play. Participants included 24 players with ID (M age = 25 yr., SD = 6; M IQ = 61, SD = 9), the top 16 performers, 13 men and 11 women, at the International Federation for sport for para-athletes with an intellectual disability (Inas) World Championships. Self-reported table tennis training experience of the players was 13 +/- 5 yr. In the Simulation Testing condition, players were instructed to play five sets of basic and five sets of advanced skills, which were subsequently assessed by experts using a standardized and validated observational protocol. The same protocol was used to assess the same skills during Game Play. Ratings of overall technical proficiency were not significantly different between Simulation Testing and Game Play conditions. There was a strong positive correlation between technical proficiency measured during Game Play vs Simulation Testing for the variables flick, topspin forehand, and topspin backhand. No correlations were found for the variables contra, block, and push. Insight into this relationship is important for future development of classification systems for ID athletes in the Paralympic Games, because comparing competition observation with the athlete's potential shown during the classification session is essential information for classifiers to confirm the athlete's competition class.

  9. Estimating groundwater evapotranspiration by a subtropical pine plantation using diurnal water table fluctuations: Implications from night-time water use

    NASA Astrophysics Data System (ADS)

    Fan, Junliang; Ostergaard, Kasper T.; Guyot, Adrien; Fujiwara, Stephen; Lockington, David A.

    2016-11-01

    Exotic pine plantations have replaced large areas of the native forests for timber production in the subtropical coastal Australia. To evaluate potential impacts of changes in vegetation on local groundwater discharge, we estimated groundwater evapotranspiration (ETg) by the pine plantation using diurnal water table fluctuations for the dry season of 2012 from August 1st to December 31st. The modified White method was used to estimate the ETg, considering the night-time water use by pine trees (Tn). Depth-dependent specific yields were also determined both experimentally and numerically for estimation of ETg. Night-time water use by pine trees was comprehensively investigated using a combination of groundwater level, sap flow, tree growth, specific yield, soil matric potential and climatic variables measurements. Results reveal a constant average transpiration flux of 0.02 mm h-1 at the plot scale from 23:00 to 05:00 during the study period, which verified the presence of night-time water use. The total ETg for the period investigated was 259.0 mm with an accumulated Tn of 64.5 mm, resulting in an error of 25% on accumulated evapotranspiration from the groundwater if night-time water use was neglected. The results indicate that the development of commercial pine plantations may result in groundwater losses in these areas. It is also recommended that any future application of diurnal water table fluctuation based methods investigate the validity of the zero night-time water use assumption prior to use.
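    The water-table-fluctuation step described above can be sketched in a few lines. This is a minimal illustration assuming the simple additive form ETg = Sy*(24*r + Δs) + Tn, where the Tn term is a crude stand-in for the paper's more careful night-time-use correction; all numbers are illustrative, not the study's data:

```python
def white_etg(sy, r, delta_s, tn=0.0):
    """Daily groundwater evapotranspiration via the White method,
        ETg = Sy * (24*r + delta_s) + Tn
    sy      : specific yield (dimensionless, may be depth-dependent)
    r       : night-time recovery rate of the water table (m/h),
              conventionally taken in the pre-dawn hours
    delta_s : net decline of the water table over the 24 h (m, fall positive)
    tn      : night-time water use (m/day), the term the modified method
              adds so that r is not misread as pure recharge."""
    return sy * (24.0 * r + delta_s) + tn

# Illustrative values only (0.00012 m/day ~ 6 h at the paper's 0.02 mm/h):
etg = white_etg(sy=0.05, r=0.002, delta_s=0.01, tn=0.00012)
```

    With tn=0 this reduces to the classical White method, which is exactly the zero night-time-use assumption the paper recommends checking before use.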

  10. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    PubMed

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  11. [Preliminary study on evaluation system of mental workers strain based on primary and middle school teachers].

    PubMed

    Lian, Yulong; Liu, Jiwen; Zhang, Chen; Yuan, Fang

    2010-09-01

    To use primary and middle school teachers as a sample to preliminarily build an evaluation system for strain in mental workers, providing a methodological platform for research on stress-effect mechanisms and on intervention measures for mental workers. A total of 851 primary and middle school teachers were selected by stratified random cluster sampling. ISTA 6.0 and a life events evaluation table were used to measure stress factors. A work tension reaction questionnaire, a symptom self-evaluation questionnaire and a general happiness scale were used to measure the psychological stress reaction, and blood glucose, blood lipids, blood cortisol, ACTH and neurobehavioral function were measured as indicators of the physiological stress reaction. A comprehensive work ability index table was used to measure work ability. A mathematical model was then used to build the stress-effect evaluation system for mental workers, and 400 environmental protection workers, selected by simple random sampling, were used for cross-validation. The model fits relatively well (RMSEA = 0.100, GFI = 0.93, NNFI = 1.00, CFI = 1.00) and conforms with theory: the loadings of the indices (work stress reaction, psychological stress reaction, physiological stress reaction and work ability) are relatively high, and the stress reactions in these 4 dimensions fit a second-order factor (stress effect) well. The physiological stress reaction is negatively correlated (P < 0.05) with the work stress reaction, the psychological stress reaction and the decrease in work ability, while working stress is positively correlated (P < 0.05) with the psychological stress reaction, the physiological stress reaction and the decrease in work ability. Social support is a protective factor against the work stress reaction, the psychological stress reaction, the physiological stress reaction and the decrease in work ability (gammas of -0.55, -0.77, 0.73 and -0.79, respectively, P < 0.05), while work stress factors, social life stress factors and risky individual characteristics are risk factors (P < 0.05) for increases in the work stress reaction, the psychological stress reaction and the physiological stress reaction and for the decrease in work ability. The application to environmental protection workers further validates the model. Evaluating the stress effects on mental workers from the 4 dimensions of work stress reaction, psychological stress reaction, physiological stress reaction and work ability conforms with theory; these 4 dimensions influence each other yet remain distinct. Work and social life stress factors influence the stress effects to a certain degree. This evaluation model can tentatively serve as a methodological basis for occupational stress evaluation in mental workers.

  12. Saugus River and Tributaries Flood Damage Reduction Study: Lynn, Malden, Revere and Saugus, Massachusetts. Section 1. Feasibility Report.

    DTIC Science & Technology

    1989-12-01

    ...Table 5: Sensitivity Analysis - Point of Pines LPP. Table 6: Plan Comparison. Table 7: NED Plan Project Costs. Table 8: Estimated Operation...Costs. Table 13: Selected Plan/Estimated Annual Benefits. Table 14: Comparative Impacts - NED Regional Floodgate Plan. Table 15: Economic Analysis...Includes detailed descriptions, plans and profiles and design considerations of the selected plan; coastal analysis of the shorefront; detailed project...

  13. [Practical aspects for minimizing errors in the cross-cultural adaptation and validation of quality of life questionnaires].

    PubMed

    Lauffer, A; Solé, L; Bernstein, S; Lopes, M H; Francisconi, C F

    2013-01-01

    The development and validation of questionnaires for evaluating quality of life (QoL) has become an important area of research. However, there is a proliferation of non-validated measuring instruments in the health setting that do not contribute to advances in scientific knowledge. To present, through the analysis of available validated questionnaires, a checklist of the practical aspects of how to carry out the cross-cultural adaptation of QoL questionnaires (generic or disease-specific) so that no step is overlooked in the evaluation process, and thus help prevent the elaboration of insufficient or incomplete validations. We consulted basic textbooks and the PubMed database using the following keywords: quality of life, questionnaires, and gastroenterology, confined to «validation studies» in English, Spanish, and Portuguese, with no time limit, for the purpose of analyzing the translation and validation of the questionnaires available through the Mapi Institute and PROQOLID websites. A checklist is presented to aid in the planning and carrying out of the cross-cultural adaptation of QoL questionnaires, in conjunction with a glossary of key terms in the area of knowledge. The acronym DSTAC was used, which refers to each of the 5 stages involved in the recommended procedure. In addition, we provide a table of the QoL instruments that have been validated into Spanish. This article provides information on how to adapt QoL questionnaires from a cross-cultural perspective, as well as how to minimize common errors. Copyright © 2012 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  14. Fertility in Gyr Cows (Bos indicus) with Fixed Time Artificial Insemination and Visual Estrus Detection Using a Classification Table

    PubMed Central

    Ramírez-Iglesia, Lilido Nelson; Roman Bravo, Rafael María; Díaz de Ramirez, Adelina; Torres, Leandro J.

    2014-01-01

    The aim of this research was to compare two artificial insemination protocols (AIP): hormonal synchronization with fixed-time artificial insemination (SC-FTAI) and the use of a table based on visual observation of estrus signs (VO) to identify cows in natural or spontaneous estrus for assignment to AI (NSE-AI). Two groups were formed: the first group of 109 cows was assigned to SC-FTAI, in which a commercial protocol is used; the second included 108 randomly chosen cows assigned to NSE-AI, in which a modified table was used. The response variable was first-service fertility rate (FSF), coded 1 for pregnant and 0 for empty. Predictor variables were AIP, postpartum anestrus, daily milk yield, body condition score at AI and calving number. Statistical analyses included association chi-square tests and logistic regression. Results showed an overall 41.94% FSF, and a significant association was detected (P < 0.05) between FSF and daily milk yield; pregnancy rates were 42.20% and 41.67% for the SC-FTAI and NSE-AI groups, respectively (P > 0.05). The odds ratio for the effect of AIP was only 1.050, suggesting no differences in FSF between groups. The NSE-AI protocol can enhance both the technique of VO and reproductive efficiency. Further validation of the table is required. PMID:26464929

  15. Gross volume tables for redwood trees in and near the Redwood National Park

    Treesearch

    Philip G. Langley; Terrell D. Smith; Ralph C. Hall

    1971-01-01

    To aid in appraising timber on lands acquired for the Redwood National Park, in northern California, local gross volume tables were developed for Spaulding and Humboldt log rules. This note includes the Spaulding table. The Humboldt table is 70 percent of the Spaulding table for each category listed. Readers are cautioned that the tables produced in this study do not...
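
    The stated 70 percent relationship between the two log rules amounts to a one-line conversion; the sketch below is ours (the function name is illustrative), not part of the original note:

```python
# The note states the Humboldt-rule gross volume is 70 percent of the
# Spaulding-rule volume for each table category. A trivial conversion:

def humboldt_from_spaulding(spaulding_volume):
    """Convert a Spaulding-rule gross volume to the Humboldt rule."""
    return 0.7 * spaulding_volume

volume = humboldt_from_spaulding(1000.0)
```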

  16. Finding of No Significant Impact and Finding of No Practicable Alternative: Construction of Airfield Drainage Improvement Projects MacDill Air Force Base, Florida

    DTIC Science & Technology

    2011-08-02

    provision of law, no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB...and Regulations ... 22 3.2.2 Baseline Air Emissions ... 23 TABLE 3.2.2 STATIONARY AIR EMISSIONS INVENTORY, HILLSBOROUGH COUNTY, FLORIDA

  17. Development and Validation of a Porcine (Sus scrofa) Sepsis Model

    DTIC Science & Technology

    2018-03-01

    last IACUC approval, have any methods been identified to reduce the number of live animals used in this protocol? None 10. PUBLICATIONS...SUMMARY: (Please provide, in "ABSTRACT" format, a summary of the protocol objectives, materials and methods , results - include tables/figures, and...Materials and methods : Animals were anesthetized and instrumented for cardiovascular monitoring. Lipopolysaccharide (LPS, a large molecule present on the

  18. Organizational Commitment DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    commitment construct that targets more specifically on the workgroup frame of reference. Included is a review of the 4.0 description and items...followed by the proposed modifications to the factor. The DEOCS 4.0 description provided for organizational commitment is “members’ dedication to the...5) examining variance and descriptive statistics, and (6) selecting items that demonstrate the strongest scale properties. Table 1. DEOCS 4.0

  19. Dampening Effects Of Food Importation On Climate Change-Induced Conflict In Africa

    DTIC Science & Technology

    2017-12-01

    The validity of those claims, however, remains in question.47 Abundance theory, on the other hand, relies on a similar cost-benefit calculation but...FDRs). These resource fluctuations have the potential to reach levels extreme enough, as indicated in Table 9 results, to alter the cost-benefit ...NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS Approved for public release. Distribution is unlimited. DAMPENING EFFECTS

  20. A Fuel-Sensitive Reduced-Order Model (ROM) for Piston Engine Scaling Analysis

    DTIC Science & Technology

    2017-09-29

    gram; all others are in SI units except where specified. Ambient enthalpies— NASA formula (McBride et al. 1993): ℎ( − 298.15 ) = ...Aeronautics and Space Administration (US); 1993 Oct. NASA Technical Memorandum No.: 4513. Naber JD, Siebers DL. Effects of gas density and vaporization...valid Fuel’) end ****************************************************** *************** SetHaVals % From NASA data (McBride et al 1993, Table 2

  1. GAT 2.0 Trend Analysis

    DTIC Science & Technology

    2016-12-25

    Emotional, Social, Family, and Spiritual. Pilot testing on a sample of 8,000 Soldiers across grades indicated that the average completion time was 45...New questions were required for the emotional and family dimensions. Table 1 describes the questions and resilience aspects taken from validated...Source of questions used to analyze factor in GAT Emotional Bad/Good Coping Written by Professors Peterson and Park, based on and paraphrasing

  2. Differential Validity of a Differential Aptitude Test

    DTIC Science & Technology

    1990-05-01

    Sir Francis Galton in 1883 first espoused the concept of general mental ability or g, it was not until 1904 that empirical evidence was analyzed...81150) and Apprentice Radio Communications Analysis Specialist (intelligence) (AFSC 20230), respectively. Table 7. Educational and Demographic Description...numerical order, with a brief categorization such as "Aircrew Operations," "Precision Measurement," or "Intelligence." Selection and classification

  3. The Relationship between SAT Scores and Retention to the Second Year: 2007 SAT Validity Sample. Statistical Report No. 2011-4

    ERIC Educational Resources Information Center

    Mattern, Krista D.; Patterson, Brian F.

    2011-01-01

    This report presents the findings from a replication of the analyses from the report, "Is Performance on the SAT Related to College Retention?" (Mattern & Patterson, 2009). The tables presented herein are based on the 2007 sample and the findings are largely the same as those presented in the original report, and show SAT scores are…

  4. Using the Sampling Margin of Error to Assess the Interpretative Validity of Student Evaluations of Teaching

    ERIC Educational Resources Information Center

    James, David E.; Schraw, Gregory; Kuch, Fred

    2015-01-01

    We present an equation, derived from standard statistical theory, that can be used to estimate sampling margin of error for student evaluations of teaching (SETs). We use the equation to examine the effect of sample size, response rates and sample variability on the estimated sampling margin of error, and present results in four tables that allow…
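
    The abstract does not reproduce the authors' equation. As a minimal sketch, assuming it takes the standard form of a margin of error with a finite population correction (which captures the sample-size and response-rate tradeoffs the authors examine), one might write:

```python
# Hedged sketch: standard margin of error with a finite population
# correction, applied to SET ratings. Inputs are illustrative; this is
# not necessarily the article's exact equation.

import math

def sampling_margin_of_error(s, n, N, z=1.96):
    """s: sample standard deviation of the ratings; n: respondents;
    N: class enrollment; z: critical value (95% confidence default)."""
    fpc = math.sqrt((N - n) / (N - 1))   # finite population correction
    return z * (s / math.sqrt(n)) * fpc

# Half the class responding still leaves a non-trivial margin of error.
moe = sampling_margin_of_error(s=0.9, n=20, N=40)
```

    The finite population correction shrinks the margin toward zero as the response rate n/N approaches 1, which is why response rate, not just sample size, matters for interpreting SET means.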

  5. Open-source, small-animal magnetic resonance-guided focused ultrasound system.

    PubMed

    Poorman, Megan E; Chaplin, Vandiver L; Wilkens, Ken; Dockery, Mary D; Giorgio, Todd D; Grissom, William A; Caskey, Charles F

    2016-01-01

    MR-guided focused ultrasound or high-intensity focused ultrasound (MRgFUS/MRgHIFU) is a non-invasive therapeutic modality with many potential applications in areas such as cancer therapy, drug delivery, and blood-brain barrier opening. However, the large financial costs involved in developing preclinical MRgFUS systems represent a barrier to research groups interested in developing new techniques and applications. We aim to mitigate these challenges by detailing a validated, open-source preclinical MRgFUS system capable of delivering thermal and mechanical FUS in a quantifiable and repeatable manner under real-time MRI guidance. A hardware and software package was developed that includes closed-loop feedback controlled thermometry code and CAD drawings for a therapy table designed for a preclinical MRI scanner. For thermal treatments, the modular software uses a proportional integral derivative controller to maintain a precise focal temperature rise in the target given input from MR phase images obtained concurrently. The software computes the required voltage output and transmits it to a FUS transducer that is embedded in the delivery table within the magnet bore. The delivery table holds the FUS transducer, a small animal and its monitoring equipment, and a transmit/receive RF coil. The transducer is coupled to the animal via a water bath and is translatable in two dimensions from outside the magnet. The transducer is driven by a waveform generator and amplifier controlled by real-time software in Matlab. MR acoustic radiation force imaging is also implemented to confirm the position of the focus for mechanical and thermal treatments. The system was validated in tissue-mimicking phantoms and in vivo during murine tumor hyperthermia treatments. Sonications were successfully controlled over a range of temperatures and thermal doses for up to 20 min with minimal temperature overshoot. 
MR thermometry was validated with an optical temperature probe, and focus visualization was achieved with acoustic radiation force imaging. We developed an MRgFUS platform for small-animal treatments that robustly delivers accurate, precise, and controllable sonications over extended time periods. The system is open source and could increase the availability of low-cost small-animal systems to interdisciplinary researchers seeking to develop new MRgFUS applications and technology.
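
    The closed-loop control step described above can be sketched as a proportional integral derivative update; the class name, gains, and setpoint below are illustrative assumptions, not the authors' published implementation:

```python
# Minimal PID sketch: MR thermometry feeds a measured focal temperature
# rise, and the controller returns a drive command for the transducer.

class PIDController:
    """PID controller driving a FUS transducer toward a target focal
    temperature rise measured from MR phase images."""

    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint      # target temperature rise, deg C
        self.dt = dt                  # interval between MR images, s
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured_rise):
        """Return a non-negative drive command from the latest image."""
        error = self.setpoint - measured_rise
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (
            (error - self.prev_error) / self.dt)
        self.prev_error = error
        drive = (self.kp * error + self.ki * self.integral
                 + self.kd * derivative)
        return max(drive, 0.0)        # transducer power cannot be negative

pid = PIDController(kp=2.0, ki=0.5, kd=0.1, setpoint=6.0, dt=3.0)
drive = pid.update(0.0)               # large command while below target
```

    In practice the integral term would be clamped to avoid temperature overshoot, which the abstract reports was kept minimal.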

  6. An ecohydrological model for studying groundwater-vegetation interactions in wetlands

    NASA Astrophysics Data System (ADS)

    Chui, Ting Fong May; Low, Swee Yang; Liong, Shie-Yui

    2011-10-01

    SummaryDespite their importance to the natural environment, wetlands worldwide face drastic degradation from changes in land use and climatic patterns. To help preservation efforts and guide conservation strategies, a clear understanding of the dynamic relationship between coupled hydrology and vegetation systems in wetlands, and their responses to engineering works and climate change, is needed. An ecohydrological model was developed in this study to address this issue. The model combines a hydrology component based on the Richards' equation for characterizing variably saturated groundwater flow, with a vegetation component described by Lotka-Volterra equations tailored for plant growth. Vegetation is represented by two characteristic wetland herbaceous plant types which differ in their flood and drought resistances. Validation of the model on a study site in the Everglades demonstrated the capability of the model in capturing field-measured water table and transpiration dynamics. The model was next applied on a section of the Nee Soon swamp forest, a tropical wetland in Singapore, for studying the impact of possible drainage works on the groundwater hydrology and native vegetation. Drainage of 10 m downstream of the wetland resulted in a localized zone of influence within half a kilometer from the drainage site with significant adverse impacts on groundwater and biomass levels, indicating a strong need for conservation. Simulated water table-plant biomass relationships demonstrated the capability of the model in capturing the time-lag in biomass response to water table changes. To test the significance of taking plant growth into consideration, the performance of the model was compared to one that substituted the vegetation component with a pre-specified evapotranspiration rate. 
Unlike its revised counterpart, the original ecohydrological model explicitly accounted for the drainage-induced decrease in plant biomass and translated the resulting reduction in transpiration back to the groundwater hydrology for a more accurate soil water balance. This study represents, to our knowledge, the first development of an ecohydrological model for wetland ecosystems that characterizes the coupled relationship between variably saturated groundwater flow and plant growth dynamics.
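
    The Lotka-Volterra vegetation component can be sketched as a pair of competing biomass equations; all parameter values below are illustrative, and in the actual model the growth terms would be coupled to the simulated water table via the flood and drought resistances of the two plant types:

```python
# Hedged sketch: two competing wetland plant types under competitive
# Lotka-Volterra dynamics, advanced with an explicit Euler step.

def lv_step(b1, b2, r1, r2, k1, k2, c12, c21, dt):
    """One Euler step. b1, b2: biomass of the two plant types;
    r: growth rates; k: carrying capacities; c: competition
    coefficients (effect of the other type's biomass)."""
    db1 = r1 * b1 * (1 - (b1 + c12 * b2) / k1)
    db2 = r2 * b2 * (1 - (b2 + c21 * b1) / k2)
    return b1 + dt * db1, b2 + dt * db2

b1, b2 = 10.0, 10.0
for _ in range(1000):                 # integrate to t = 100
    b1, b2 = lv_step(b1, b2, 0.05, 0.03, 100.0, 80.0, 0.6, 0.8, 0.1)
```

    Because biomass responds to the growth terms only gradually, a formulation like this naturally reproduces the time-lag between water-table changes and biomass response noted in the abstract.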

  7. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method

    PubMed Central

    2017-01-01

    Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738

  8. Design of Lesehan Chair by Using Kansei Engineering Method And Anthropometry Approach

    NASA Astrophysics Data System (ADS)

    Pambudi, A. T.; Suryoputro, M. R.; Sari, A. D.; Kurnia, R. D.

    2016-01-01

    Special Region of Yogyakarta (DIY) is known as an academic city. Many people come to pursue an education in college. They live in boarding houses with some supporting facilities. The most common facility is a low table, which requires students to sit on the floor while studying and could cause a higher risk of back pain and musculoskeletal disorders. To reduce these risks, a lesehan chair needs to be designed that is also appropriate to customer needs. The Kansei engineering method was used with a total of 30 respondents; 15 kansei words were collected, and 12 kansei words were selected after validity and reliability testing. The results of this study showed that quality, aesthetics, and comfort level influence the design of a lesehan chair. A design of a lesehan chair was created by considering the suitable concept and merging it with the physical design and its anthropometric measurements. In this case, a marginal homogeneity test was used to identify differences between each kansei word attribute and the recommended design or product. The marginal homogeneity test results show that the recommended design and product fulfill the customers' desires and needs. For further research, the posture of lesehan chair users needs to be analysed and evaluated in order to develop and improve the chair's performance.

  9. Food composition database development for between country comparisons.

    PubMed

    Merchant, Anwar T; Dehghan, Mahshid

    2006-01-19

    Nutritional assessment by diet analysis is a two-step process consisting of evaluation of food consumption, and conversion of food into nutrient intake by using a food composition database, which lists the mean nutritional values for a given food portion. Most reports in the literature focus on minimizing errors in estimation of food consumption, but the selection of a specific food composition table used in nutrient estimation is also a source of errors. We are conducting a large prospective study internationally and need to compare diet, assessed by food frequency questionnaires, in a comparable manner between different countries. We have prepared a multi-country food composition database for nutrient estimation in all the countries participating in our study. The nutrient database is primarily based on the USDA food composition database, modified appropriately with reference to local food composition tables, and supplemented with recipes of locally eaten mixed dishes. By doing so we have ensured that the units of measurement, method of selection of foods for testing, and assays used for nutrient estimation are consistent and as current as possible, and yet have taken into account some local variations. Using this common metric for nutrient assessment will reduce differential errors in nutrient estimation and improve the validity of between-country comparisons.
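
    The two-step process described above can be sketched with a toy composition table; the foods and per-100 g values below are invented for illustration, not USDA figures:

```python
# Step 2 of diet analysis: convert reported food consumption (grams)
# into nutrient intake via a food composition lookup table.

FOOD_COMPOSITION = {            # nutrients per 100 g edible portion
    "rice, cooked": {"energy_kcal": 130, "protein_g": 2.7},
    "lentils, cooked": {"energy_kcal": 116, "protein_g": 9.0},
}

def nutrient_intake(consumption_g):
    """consumption_g: mapping of food name -> grams eaten per day.
    Returns total daily intake per nutrient."""
    totals = {}
    for food, grams in consumption_g.items():
        for nutrient, per100 in FOOD_COMPOSITION[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per100 * grams / 100
    return totals

intake = nutrient_intake({"rice, cooked": 200, "lentils, cooked": 150})
```

    Swapping in a different composition table for the same consumption data changes the estimated intake, which is exactly the between-country error source the authors address by harmonizing on one database.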

  10. Estimation of simvastatin and cetirizine by RP-LC method: Application to freeze and thaw (FT) stability studies.

    PubMed

    Naveed, Safila; Usmanghani, Khan; Sana, Aisha; Ali, Huma; Zafar, Farya; Qamar, Fatima; Sarwer, Ghulam; Abbas, Sarah; Alam, M Tanweer; Shinwari, Muhammad Ibrar

    2018-01-01

    Sensitive, simple, reliable and rapid HPLC technique for the estimation of simvastatin (SMV) and cetirizine has been designed in this study. The chromatographic conditions were set using a Shimadzu LC-10 AT VP pump with a UV detector (SPD-10 AV-VP). System integration was performed with a CBM-102 (Bus Module). Partitioning of components was attained with a pre-packed Purospher Star C-18 column (5 μm, 250 x 4.6 mm) at ambient conditions. The injected sample volume was 10 μl. The mobile phase was composed of a 50:50 v/v ratio of acetonitrile/water (pH 3.0 adjusted with ortho-phosphoric acid) at a flow rate of 2 ml/minute. Compounds were detected in the UV region at 225 nm. Percent recovery of simvastatin was observed in the range of 98-102%. All results were found within the acceptable range of the specification. The projected method is consistent, specific, precise, and rapid, and can be employed to quantitate SMV along with cetirizine HCl. Stability was estimated over 3 successive cycles of freeze and thaw. Results of the FT samples were found within acceptable limits. The method was developed and validated in raw materials, bulk formulations and final drug products.

  11. Specific agreement on dichotomous outcomes can be calculated for more than two raters.

    PubMed

    de Vet, Henrica C W; Dikmans, Rieky E; Eekhout, Iris

    2017-03-01

    For assessing interrater agreement, the concepts of observed agreement and specific agreement have been proposed. The situation of two raters and dichotomous outcomes has been described, whereas often, multiple raters are involved. We aim to extend it for more than two raters and examine how to calculate agreement estimates and 95% confidence intervals (CIs). As an illustration, we used a reliability study that includes the scores of four plastic surgeons classifying photographs of breasts of 50 women after breast reconstruction into "satisfied" or "not satisfied." In a simulation study, we checked the hypothesized sample size for calculation of 95% CIs. For m raters, all pairwise tables [ie, m (m - 1)/2] were summed. Then, the discordant cells were averaged before observed and specific agreements were calculated. The total number (N) in the summed table is m (m - 1)/2 times larger than the number of subjects (n), in the example, N = 300 compared to n = 50 subjects times m = 4 raters. A correction of n√(m - 1) was appropriate to find 95% CIs comparable to bootstrapped CIs. The concept of observed agreement and specific agreement can be extended to more than two raters with a valid estimation of the 95% CIs. Copyright © 2017 Elsevier Inc. All rights reserved.
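
    The procedure described above (sum all m(m-1)/2 pairwise tables, average the discordant cells, then compute observed and specific agreement) can be sketched as follows; the example scores are invented, not the study's breast reconstruction data:

```python
# Multi-rater extension of specific agreement for dichotomous outcomes:
# pool every rater pair's 2x2 table, average the discordant cells, and
# compute observed plus positive/negative specific agreement.

from itertools import combinations

def agreement(ratings):
    """ratings: per-subject lists of m dichotomous scores (1/0, e.g.
    "satisfied"/"not satisfied"). Returns (observed, positive-specific,
    negative-specific) agreement."""
    a = b = c = d = 0
    for scores in ratings:
        for x, y in combinations(scores, 2):   # all m(m-1)/2 pairs
            if x == 1 and y == 1:
                a += 1
            elif x == 0 and y == 0:
                d += 1
            elif x == 1 and y == 0:
                b += 1
            else:
                c += 1
    b = c = (b + c) / 2            # average the discordant cells
    n_total = a + b + c + d        # N = n * m(m-1)/2 summed pairs
    observed = (a + d) / n_total
    pos = 2 * a / (2 * a + b + c)  # specific agreement on "satisfied"
    neg = 2 * d / (2 * d + b + c)  # specific agreement on "not satisfied"
    return observed, pos, neg

# Three subjects scored by m = 4 raters.
obs, pos, neg = agreement([[1, 1, 1, 0], [0, 0, 0, 0], [1, 1, 1, 1]])
```

    As the abstract notes, N in the summed table exceeds the number of subjects by a factor of m(m-1)/2, so naive binomial confidence intervals on N would be too narrow; the authors' n√(m-1) correction compensates for this.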

  12. TOPMODEL simulations of streamflow and depth to water table in Fishing Brook Watershed, New York, 2007-09

    USGS Publications Warehouse

    Nystrom, Elizabeth A.; Burns, Douglas A.

    2011-01-01

    TOPMODEL uses a topographic wetness index computed from surface-elevation data to simulate streamflow and subsurface-saturation state, represented by the saturation deficit. Depth to water table was computed from simulated saturation-deficit values using computed soil properties. In the Fishing Brook Watershed, TOPMODEL was calibrated to the natural logarithm of streamflow at the study area outlet and depth to water table at Sixmile Wetland using a combined multiple-objective function. Runoff and depth to water table responded differently to some of the model parameters, and the combined multiple-objective function balanced the goodness-of-fit of the model realizations with respect to these parameters. Results show that TOPMODEL reasonably simulated runoff and depth to water table during the study period. The simulated runoff had a Nash-Sutcliffe efficiency of 0.738, but the model underpredicted total runoff by 14 percent. Depth to water table computed from simulated saturation-deficit values matched observed water-table depth moderately well; the root mean squared error of absolute depth to water table was 91 millimeters (mm), compared to the mean observed depth to water table of 205 mm. The correlation coefficient for temporal depth-to-water-table fluctuations was 0.624. The variability of the TOPMODEL simulations was assessed using prediction intervals grouped using the combined multiple-objective function. The calibrated TOPMODEL results for the entire study area were applied to several subwatersheds within the study area using computed hydrogeomorphic properties of the subwatersheds.
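
    The Nash-Sutcliffe efficiency used above to judge the runoff fit is straightforward to compute; this sketch uses made-up observed and simulated series, not the Fishing Brook data:

```python
# Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the model
# predicts no better than the mean of the observations.

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

nse = nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

    Note that an NSE of 0.738, as reported, can coexist with a systematic bias (the 14 percent underprediction of total runoff), since NSE penalizes scatter around observations rather than cumulative volume error.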

  13. Context matters: the experience of 14 research teams in systematically reporting contextual factors important for practice change.

    PubMed

    Tomoaia-Cotisel, Andrada; Scammon, Debra L; Waitzman, Norman J; Cronholm, Peter F; Halladay, Jacqueline R; Driscoll, David L; Solberg, Leif I; Hsu, Clarissa; Tai-Seale, Ming; Hiratsuka, Vanessa; Shih, Sarah C; Fetters, Michael D; Wise, Christopher G; Alexander, Jeffrey A; Hauser, Diane; McMullen, Carmit K; Scholle, Sarah Hudson; Tirodkar, Manasi A; Schmidt, Laura; Donahue, Katrina E; Parchman, Michael L; Stange, Kurt C

    2013-01-01

    We aimed to advance the internal and external validity of research by sharing our empirical experience and recommendations for systematically reporting contextual factors. Fourteen teams conducting research on primary care practice transformation retrospectively considered contextual factors important to interpreting their findings (internal validity) and transporting or reinventing their findings in other settings/situations (external validity). Each team provided a table or list of important contextual factors and interpretive text included as appendices to the articles in this supplement. Team members identified the most important contextual factors for their studies. We grouped the findings thematically and developed recommendations for reporting context. The most important contextual factors sorted into 5 domains: (1) the practice setting, (2) the larger organization, (3) the external environment, (4) implementation pathway, and (5) the motivation for implementation. To understand context, investigators recommend (1) engaging diverse perspectives and data sources, (2) considering multiple levels, (3) evaluating history and evolution over time, (4) looking at formal and informal systems and culture, and (5) assessing the (often nonlinear) interactions between contextual factors and both the process and outcome of studies. We include a template with tabular and interpretive elements to help study teams engage research participants in reporting relevant context. These findings demonstrate the feasibility and potential utility of identifying and reporting contextual factors. Involving diverse stakeholders in assessing context at multiple stages of the research process, examining their association with outcomes, and consistently reporting critical contextual factors are important challenges for a field interested in improving the internal and external validity and impact of health care research.

  14. Measures of the food environment: A systematic review of the field, 2007-2015.

    PubMed

    Lytle, Leslie A; Sokol, Rebeccah L

    2017-03-01

    Many studies have examined the relationship between the food environment and health-related outcomes, but fewer consider the integrity of measures used to assess the food environment. The present review builds on and makes comparisons with a previous review examining food environment measures, and expands the previous review to include a more in-depth examination of the reliability and validity of measures and the study designs employed. We conducted a systematic review of studies measuring the food environment published between 2007 and 2015. We identified these articles through: PubMed, Embase, Web of Science, PsycINFO, and Global Health databases; tables of contents of relevant journals; and the National Cancer Institute's Measures of the Food Environment website. This search yielded 11,928 citations. We retained and abstracted data from 432 studies. The most common methodology used to study the food environment was geographic analysis (65% of articles), and the domination of this methodology has persisted since the last review. Only 25.9% of studies in this review reported the reliability of measures and 28.2% reported validity, but this was an improvement as compared to the earlier review. Very few of the studies reported construct validity. Studies reporting measures of the school or worksite environment have decreased since the previous review. Only 13.9% of the studies used a longitudinal design. To strengthen research examining the relationship between the food environment and population health, there is a need for robust and psychometrically-sound measures and more sophisticated study designs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. SU-F-T-406: Verification of Total Body Irradiation Commissioned MU Lookup Table Accuracy Using Treatment Planning System for Wide Range of Patient Sizes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, D; Chi, P; Tailor, R

    Purpose: To verify the accuracy of total body irradiation (TBI) measurement commissioning data using the treatment planning system (TPS) for a wide range of patient separations. Methods: Our institution conducts TBI treatments with an 18MV photon beam at 380cm extended SSD using an AP/PA technique. Currently, the monitor units (MU) per field for patient treatments are determined using a lookup table generated from TMR measurements in a water phantom (75 × 41 × 30.5 cm3). The dose prescribed to an umbilicus midline point at spine level is determined based on patient separation, dose/field and dose rate/MU. One-dimensional heterogeneous dose calculations from Pinnacle TPS were validated with thermoluminescent dosimeters (TLD) placed in an average adult anthropomorphic phantom and also in-vivo on four patients with large separations. Subsequently, twelve patients with various separations (17–47cm) were retrospectively analyzed. Computed tomography (CT) scans were acquired in the left and right decubitus positions from vertex to knee. A treatment plan for each patient was generated. The ratio of the lookup table MU to the heterogeneous TPS MU was compared. Results: TLD measurements in the anthropomorphic phantom and large TBI patients agreed with the Pinnacle calculated dose within 2.8% and 2%, respectively. The heterogeneous calculation compared to the lookup table agreed within 8.1% (ratio range: 1.014–1.081). A trend of reduced accuracy was observed when patient separation increases. Conclusion: The TPS dose calculation accuracy was confirmed by TLD measurements, showing that Pinnacle can model the extended SSD dose without commissioning a special beam model for the extended SSD geometry. The difference between the lookup table and TPS calculation potentially comes from lack of scatter during commissioning when compared to extreme patient sizes. The observed trend suggests the need for development of a correction factor between the lookup table and TPS dose calculations.

  16. Proton Affinities of Anionic Bases:  Trends Across the Periodic Table, Structural Effects, and DFT Validation.

    PubMed

    Swart, Marcel; Bickelhaupt, F Matthias

    2006-03-01

    We have carried out an extensive exploration of the gas-phase basicity of archetypal anionic bases across the periodic system using the generalized gradient approximation of density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. First, we validate DFT as a reliable tool for computing proton affinities and related thermochemical quantities: BP86/QZ4P//BP86/TZ2P is shown to yield a mean absolute deviation of 1.6 kcal/mol for the proton affinity at 0 K with respect to high-level ab initio benchmark data. The main purpose of this work is to provide the proton affinities (and corresponding entropies) at 298 K of the anionic conjugate bases of all main-group-element hydrides of groups 14-17 and periods 2-6. We have also studied the effect of stepwise methylation of the protophilic center of the second- and third-period bases.

  17. Predictive modeling of addiction lapses in a mobile health application.

    PubMed

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M; Isham, Andrew J; Judkins-Fisher, Chris L; Atwood, Amy K; Gustafson, David H

    2014-01-01

    The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. Addiction-comprehensive health enhancement support system (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses on 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients' recovery progress. The model showed good predictability, with the area under receiver operating characteristic curve of 0.829 in the 10-fold cross-validation and 0.912 in the external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. © 2013.
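
    A sensitivity/specificity table of the kind mentioned can be sketched by sweeping a decision threshold over predicted lapse probabilities; the labels and probabilities below are invented for illustration, not the A-CHESS data:

```python
# For each candidate threshold, count true/false positives and
# negatives to tabulate the sensitivity/specificity tradeoff.

def sens_spec(y_true, y_prob, threshold):
    """Return (sensitivity, specificity) at a given threshold."""
    tp = sum(1 for y, p in zip(y_true, y_prob) if y == 1 and p >= threshold)
    fn = sum(1 for y, p in zip(y_true, y_prob) if y == 1 and p < threshold)
    tn = sum(1 for y, p in zip(y_true, y_prob) if y == 0 and p < threshold)
    fp = sum(1 for y, p in zip(y_true, y_prob) if y == 0 and p >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

y_true = [1, 1, 0, 0, 1, 0]              # 1 = lapse occurred that week
y_prob = [0.9, 0.6, 0.4, 0.2, 0.7, 0.55]  # model's predicted lapse risk
table = [(t, *sens_spec(y_true, y_prob, t)) for t in (0.3, 0.5, 0.7)]
```

    A low threshold favors sensitivity (few missed lapses, more false alarms), a high one favors specificity; the table makes that tradeoff explicit so clinicians can pick an operating point.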

  18. Predictive Modeling of Addiction Lapses in a Mobile Health Application

    PubMed Central

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M.; Isham, Andrew; Judkins-Fisher, Chris L.; Atwood, Amy K.; Gustafson, David H.

    2013-01-01

The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. The Addiction-Comprehensive Health Enhancement Support System (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses to 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who had recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients’ recovery progress. The model showed good predictability, with an area under the receiver operating characteristic curve of 0.829 in 10-fold cross-validation and 0.912 in external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. PMID:24035143
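
The two headline figures in this record, the area under the ROC curve and the sensitivity/specificity trade-off, can be recomputed from first principles. This is a toy sketch with invented scores and labels, not A-CHESS data:

```python
# Toy recomputation of ROC AUC and a sensitivity/specificity pair at one
# threshold. All scores and labels below are invented for illustration.
def roc_auc(scores, labels):
    """Probability that a random positive outranks a random negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when flagging 'lapse risk' for score >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return tp / n_pos, tn / n_neg

scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]   # hypothetical weekly lapse probabilities
labels = [1, 1, 1, 0, 0, 0]                # 1 = a lapse occurred that week
```

Tabulating `sens_spec` over a grid of thresholds reproduces the kind of sensitivity/specificity table the abstract describes for choosing an operating point.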

  19. Data Model and Relational Database Design for Highway Runoff Water-Quality Metadata

    USGS Publications Warehouse

    Granato, Gregory E.; Tessler, Steven

    2001-01-01

A national highway and urban runoff water-quality metadatabase was developed by the U.S. Geological Survey in cooperation with the Federal Highway Administration as part of the National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS). The database was designed to catalog available literature and to document results of the synthesis in a format that would facilitate current and future research on highway and urban runoff. This report documents the design and implementation of the NDAMS relational database, which was designed to provide a catalog of available information and the results of an assessment of the available data. All the citations and the metadata collected during the review process are presented in a stratified metadatabase that contains citations for relevant publications, abstracts (or previa), and report-review metadata for a sample of selected reports that document results of runoff quality investigations. The database is referred to as a metadatabase because it contains information about available data sets rather than a record of the original data. The database contains the metadata needed to evaluate and characterize how valid, current, complete, comparable, and technically defensible published and available information may be when evaluated for application to the different data-quality objectives as defined by decision makers. This database is a relational database, in that all information is ultimately linked to a given citation in the catalog of available reports. The main database file contains 86 tables consisting of 29 data tables, 11 association tables, and 46 domain tables. The data tables all link to a particular citation, and each data table is focused on one aspect of the information collected in the literature search and the evaluation of available information.
This database is implemented in the Microsoft (MS) Access database software because it is widely used within and outside of government and is familiar to many existing and potential customers. The stratified metadatabase design for the NDAMS program is presented in the MS Access file DBDESIGN.mdb and documented with a data dictionary in the NDAMS_DD.mdb file recorded on the CD-ROM. The data dictionary file includes complete documentation of the table names, table descriptions, and information about each of the 419 fields in the database.
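
The citation-catalog/data-table/domain-table layering described above can be sketched in miniature with SQL. This is a hypothetical illustration: the table and column names below are invented, not the actual NDAMS schema:

```python
import sqlite3

# Minimal sketch of the stratified design: a citation catalog, one data table
# keyed to citations, and a domain table constraining a coded field.
# All names and rows are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE citation (            -- catalog of available reports
    citation_id INTEGER PRIMARY KEY,
    title TEXT NOT NULL);
CREATE TABLE quality_domain (      -- domain table: allowed quality codes
    code TEXT PRIMARY KEY,
    description TEXT);
CREATE TABLE review (              -- data table: one aspect of the review
    review_id INTEGER PRIMARY KEY,
    citation_id INTEGER REFERENCES citation,
    quality_code TEXT REFERENCES quality_domain);
""")
con.execute("INSERT INTO citation VALUES (1, 'Runoff study')")
con.execute("INSERT INTO quality_domain VALUES ('C', 'comparable')")
con.execute("INSERT INTO review VALUES (10, 1, 'C')")
row = con.execute("""SELECT c.title, d.description FROM review r
                     JOIN citation c USING (citation_id)
                     JOIN quality_domain d ON r.quality_code = d.code""").fetchone()
```

The design choice shown is the one the report describes: every data row joins back to a citation, and coded fields are constrained by domain tables rather than free text.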

  20. Sequence stratigraphic distribution of coaly rocks: Fundamental controls and paralic examples

    USGS Publications Warehouse

    Bohacs, K.; Suter, J.

    1997-01-01

Significant volumes of terrigenous organic matter can be preserved to form coals only when and where the overall increase in accommodation approximately equals the production rate of peat. Accommodation is a function of subsidence and base level. For mires, base level is very specifically the groundwater table. In paralic settings, the groundwater table is strongly controlled by sea level and the precipitation/evaporation ratio. Peat accumulates over a range of rates, but always with a definite maximum rate set by original organic productivity and space available below depositional base level (groundwater table). Below a threshold accommodation rate (nonzero), no continuous peats accumulate, due to falling or low groundwater table, sedimentary bypass, and extensive erosion by fluvial channels. This is typical of upper highstand, lowstand fan, and basal lowstand-wedge systems tracts. Higher accommodation rates provide relatively stable conditions with rising groundwater tables. Mires initiate and thrive, quickly filling local accommodation vertically and expanding laterally, favoring accumulation of laterally continuous coals in paralic zones within both middle lowstand and middle highstand systems tracts. If the accommodation increase balances or slightly exceeds organic productivity, mires accumulate peat vertically, yielding thicker, more isolated coals, most likely during late lowstand-early transgressive and late transgressive-early highstand periods. At very large accommodation increases, mires are stressed and eventually inundated by clastics or standing water (as in middle transgressive systems tracts). These relations should be valid for mires in all settings, including alluvial, lake plain, and paralic. The tie to sea level in paralic zones depends on local subsidence, sediment supply, and groundwater regimes. These concepts are also useful for investigating the distribution of seal and reservoir facies in nonmarine settings.

  1. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    PubMed

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software and to evaluate the impact of the inclusion/exclusion of indicators on market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweeteners' concentration of sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
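
A minimal Monte Carlo sketch of the brand-loyalty idea follows; every brand name, market share, concentration, and serving count below is invented for illustration:

```python
import random

# Toy probabilistic exposure model with a brand-loyalty indicator.
# Brands, shares, concentrations, and serving counts are all invented.
random.seed(1)
BRANDS = {"A": {"share": 0.6, "conc": 0.0},     # mg sweetener per serving
          "B": {"share": 0.4, "conc": 120.0}}

def pick_brand():
    names = list(BRANDS)
    return random.choices(names, weights=[BRANDS[n]["share"] for n in names])[0]

def daily_intake(loyal):
    servings = random.randint(0, 3)
    if loyal:                      # a loyal consumer keeps one brand all day
        return servings * BRANDS[pick_brand()]["conc"]
    # otherwise each serving is drawn independently by market share
    return sum(BRANDS[pick_brand()]["conc"] for _ in range(servings))

n = 20000
loyal_mean = sum(daily_intake(True) for _ in range(n)) / n
mixed_mean = sum(daily_intake(False) for _ in range(n)) / n
```

Brand loyalty leaves the mean intake roughly unchanged but concentrates exposure in loyal consumers of high-concentration brands, widening the upper tail; this is consistent with the abstract's finding that only models including such indicators met the validation criteria.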

  2. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    NASA Technical Reports Server (NTRS)

Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; et al.

    2016-01-01

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
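
The averaging of observations that are close in time and space can be sketched as a simple greedy pass; the thresholds and data points below are assumed for illustration, not the OC-CCI merging rules:

```python
from datetime import datetime, timedelta

# Toy sketch of one homogenisation step: observations within assumed
# time/space windows are averaged into a single record.
TIME_WIN = timedelta(hours=1)
DIST_WIN = 0.01   # degrees, assumed window

def close(a, b):
    return (abs(a["t"] - b["t"]) <= TIME_WIN
            and abs(a["lat"] - b["lat"]) <= DIST_WIN
            and abs(a["lon"] - b["lon"]) <= DIST_WIN)

def merge(obs):
    merged = []
    for o in sorted(obs, key=lambda o: o["t"]):
        if merged and close(merged[-1], o):
            g = merged[-1]                       # fold into the open group
            g["chl"] = (g["chl"] * g["n"] + o["chl"]) / (g["n"] + 1)
            g["n"] += 1
        else:
            merged.append({**o, "n": 1})         # start a new record
    return merged

obs = [{"t": datetime(2005, 6, 1, 10, 0), "lat": 45.0, "lon": -30.0, "chl": 0.4},
       {"t": datetime(2005, 6, 1, 10, 20), "lat": 45.001, "lon": -30.0, "chl": 0.6},
       {"t": datetime(2005, 6, 2, 9, 0), "lat": 45.0, "lon": -30.0, "chl": 1.0}]
```

Running `merge(obs)` collapses the first two nearby observations into one averaged chlorophyll record and keeps the third, taken a day later, separate.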

  3. Evaluation and Validation (E&V) Team Public Report. Volume 1.

    DTIC Science & Technology

    1984-11-30

Table K-6 lists possible approaches to integrating methodology components across life cycle phases. The E&V Plan provides a detailed and organized approach to the development of technology which will be used as a basis for the E&V of APSEs.

  4. Military Aircraft Propulsion Lubricants - Current and Future Trends

    DTIC Science & Technology

    1986-02-01

… are presented in Table I. The selected candidate base oil was a blend of commercially available neopentyl polyol esters. It was selected based on… The formulation used for validation consisted of: (1) a neopentyl polyol ester blend; (2) a deposit inhibitor (Ref. 7); (3) a heterocyclic amine oxidation inhibitor; (4) dioctyldiphenyl… The use of a glycol or a synthetic hydrocarbon (polyalphaolefin (PAO)) based fluid has been suggested as a possible basestock material for this oil.

5. Improving the Selection, Classification and Utilization of Army Enlisted Personnel

    DTIC Science & Technology

    1992-11-01

installations. These jobs were grouped into one combat cluster (11B, 13B, and 19E MOS) and three non-combat clusters [Clerical (71L MOS), Operations (31C…)]. One factor contains items from the Non-delinquency, Traditional Values, Conscientiousness, Cooperativeness, and Internal Control scales. The Adjustment… Validities and success rates are reported for various groups of cases: all males, females, graduate males, and non-graduate males. Table 3 presents descriptive statistics.

  6. Mass timber rocking panel retrofit of a four-story soft-story building with full-scale shake table validation

    Treesearch

    Pouria Bahmani; John van de Lindt; Asif Iqbal; Douglas Rammer

    2017-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multi-family three- and four-story structures throughout California and the United States. The majority were constructed between 1920 and 1970, with many being prevalent in the San Francisco Bay Area in California. The NEES Soft...

  7. 20 CFR Appendix C to Part 718 - Blood-Gas Tables

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Blood-Gas Tables C Appendix C to Part 718... PNEUMOCONIOSIS Pt. 718, App. C Appendix C to Part 718—Blood-Gas Tables The following tables set forth the values... tables are met: (1) For arterial blood-gas studies performed at test sites up to 2,999 feet above sea...

  8. 20 CFR Appendix C to Part 718 - Blood-Gas Tables

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Blood-Gas Tables C Appendix C to Part 718... DUE TO PNEUMOCONIOSIS Pt. 718, App. C Appendix C to Part 718—Blood-Gas Tables The following tables set... of the following tables are met: (1) For arterial blood-gas studies performed at test sites up to 2...

  9. 20 CFR Appendix C to Part 718 - Blood-Gas Tables

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 4 2013-04-01 2013-04-01 false Blood-Gas Tables C Appendix C to Part 718... DUE TO PNEUMOCONIOSIS Pt. 718, App. C Appendix C to Part 718—Blood-Gas Tables The following tables set... of the following tables are met: (1) For arterial blood-gas studies performed at test sites up to 2...

  10. Validating the accuracy of SO2 gas retrievals in the thermal infrared (8-14 μm)

    NASA Astrophysics Data System (ADS)

    Gabrieli, Andrea; Porter, John N.; Wright, Robert; Lucey, Paul G.

    2017-11-01

    Quantifying sulfur dioxide (SO2) in volcanic plumes is important for eruption predictions and public health. Ground-based remote sensing of spectral radiance of plumes contains information on the path-concentration of SO2. However, reliable inversion algorithms are needed to convert plume spectral radiance measurements into SO2 path-concentrations. Various techniques have been used for this purpose. Recent approaches have employed thermal infrared (TIR) imaging between 8 μm and 14 μm to provide two-dimensional mapping of plume SO2 path-concentration, using what might be described as "dual-view" techniques. In this case, the radiance (or its surrogate brightness temperature) is computed for portions of the image that correspond to the plume and compared with spectral radiance obtained for adjacent regions of the image that do not (i.e., "clear sky"). In this way, the contribution that the plume makes to the measured radiance can be isolated from the background atmospheric contribution, this residual signal being converted to an estimate of gas path-concentration via radiative transfer modeling. These dual-view approaches suffer from several issues, mainly the assumption of clear sky background conditions. At this time, the various inversion algorithms remain poorly validated. This paper makes two contributions. Firstly, it validates the aforementioned dual-view approaches, using hyperspectral TIR imaging data. Secondly, it introduces a new method to derive SO2 path-concentrations, which allows for single point SO2 path-concentration retrievals, suitable for hyperspectral imaging with clear or cloudy background conditions. The SO2 amenable lookup table algorithm (SO2-ALTA) uses the MODTRAN5 radiative transfer model to compute radiance for a variety (millions) of plume and atmospheric conditions. 
Rather than searching this lookup table to find the best fit for each measured spectrum, the lookup table was used to train a partial least square regression (PLSR) model. The coefficients of this model are used to invert measured radiance spectra to path-concentration on a pixel-by-pixel basis. In order to validate the algorithms, TIR hyperspectral measurements were carried out by measuring sky radiance when looking through gas cells filled with known amounts of SO2. SO2-ALTA was also tested on retrieving SO2 path-concentrations from the Kīlauea volcano, Hawai'i. For cloud-free conditions, all three techniques worked well. In cases where background clouds were present, then only SO2-ALTA was found to provide good results, but only under low atmospheric water vapor column amounts.
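
The idea of training a regression on the lookup table rather than searching it can be sketched with synthetic data. Ordinary least squares stands in here for the paper's partial least squares regression, and every number (band count, concentration range, noise level) is invented:

```python
import numpy as np

# Sketch of the LUT-to-regression inversion idea: each LUT row pairs a
# simulated radiance spectrum with the SO2 path-concentration used to
# generate it. OLS stands in for PLSR; all quantities are synthetic.
rng = np.random.default_rng(0)
n_entries, n_bands = 500, 40
band_response = rng.normal(size=n_bands)               # toy spectral signature
concentrations = rng.uniform(0, 2000, n_entries)       # assumed units, e.g. ppm-m
spectra = (np.outer(concentrations, band_response)
           + rng.normal(scale=5.0, size=(n_entries, n_bands)))  # LUT "simulations"

# Training step: fit regression coefficients from the lookup table.
X = np.column_stack([spectra, np.ones(n_entries)])     # add an intercept column
beta, *_ = np.linalg.lstsq(X, concentrations, rcond=None)

# Inversion step: a measured spectrum maps to a path-concentration
# with a single dot product, applied pixel by pixel.
measured = np.append(spectra[0], 1.0)
retrieved = measured @ beta
```

Once the coefficients are fitted offline, per-pixel inversion reduces to one dot product, which is what makes this approach practical for hyperspectral imagery.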

  11. Ecological periodic tables for benthic macrofaunal usage of estuarine habitats: Insights from a case study in Tillamook Bay, Oregon, USA

    NASA Astrophysics Data System (ADS)

    Ferraro, Steven P.; Cole, Faith A.

    2012-05-01

    This study validates the ecological relevance of estuarine habitat types to the benthic macrofaunal community and, together with previous similar studies, suggests they can serve as elements in ecological periodic tables of benthic macrofaunal usage in the bioregion. We compared benthic macrofaunal Bray-Curtis similarity and the means of eight benthic macrofaunal community measures across seven habitat types in Tillamook Bay, Oregon, USA: intertidal eelgrass (Zostera marina), dwarf eelgrass (Zostera japonica), oyster (Crassostrea gigas) ground culture, burrowing mud shrimp (Upogebia pugettensis), burrowing ghost shrimp (Neotrypaea californiensis), sand and subtidal. Benthic macrofaunal Bray-Curtis similarity differed among all the habitats except ghost shrimp and sand. The habitat rank order on mean benthic macrofaunal species richness, abundance and biomass was dwarf eelgrass ≈ oyster ≥ mud shrimp ≈ eelgrass > sand ≈ ghost shrimp ≈ subtidal. The benthic macrofaunal habitat usage pattern in Tillamook Bay was, with a few exceptions, similar to that in two other US Pacific Northwest estuaries. The exceptions indicate variants of eelgrass and ghost shrimp habitat that differ in benthic macrofaunal usage perhaps due to differences in the coarseness of the sand fraction of the sediments in which they live. The similarities indicate periodic benthic macrofaunal usage patterns across the other habitat types extend over a wider geographic scale and range of environmental conditions than previously known.

  12. Comparing Free-Free and Shaker Table Model Correlation Methods Using Jim Beam

    NASA Technical Reports Server (NTRS)

    Ristow, James; Smith, Kenneth Wayne, Jr.; Johnson, Nathaniel; Kinney, Jackson

    2018-01-01

    Finite element model correlation as part of a spacecraft program has always been a challenge. For any NASA mission, the coupled system response of the spacecraft and launch vehicle can be determined analytically through a Coupled Loads Analysis (CLA), as it is not possible to test the spacecraft and launch vehicle coupled system before launch. The value of the CLA is highly dependent on the accuracy of the frequencies and mode shapes extracted from the spacecraft model. NASA standards require the spacecraft model used in the final Verification Loads Cycle to be correlated by either a modal test or by comparison of the model with Frequency Response Functions (FRFs) obtained during the environmental qualification test. Due to budgetary and time constraints, most programs opt to correlate the spacecraft dynamic model during the environmental qualification test, conducted on a large shaker table. For any model correlation effort, the key has always been finding a proper definition of the boundary conditions. This paper is a correlation case study to investigate the difference in responses of a simple structure using a free-free boundary, a fixed boundary on the shaker table, and a base-drive vibration test, all using identical instrumentation. The NAVCON Jim Beam test structure, featured in the IMAC round robin modal test of 2009, was selected as a simple, well recognized and well characterized structure to conduct this investigation. First, a free-free impact modal test of the Jim Beam was done as an experimental control. Second, the Jim Beam was mounted to a large 20,000 lbf shaker, and an impact modal test in this fixed configuration was conducted. Lastly, a vibration test of the Jim Beam was conducted on the shaker table. 
The free-free impact test, the fixed impact test, and the base-drive test were used to assess the effect of the shaker modes, evaluate the validity of fixed-base modeling assumptions, and compare final model correlation results between these boundary conditions.

  13. Seismic performance of geosynthetic-soil retaining wall structures

    NASA Astrophysics Data System (ADS)

    Zarnani, Saman

Vertical inclusions of expanded polystyrene (EPS) placed behind rigid retaining walls were investigated as geofoam seismic buffers to reduce earthquake-induced loads. A numerical model was developed using the program FLAC, and the model was validated against 1-g shaking table test results of EPS geofoam seismic buffer models. Two constitutive models for the component materials were examined: elastic-perfectly plastic with the Mohr-Coulomb (M-C) failure criterion, and a non-linear hysteresis damping model with the equivalent linear method (ELM) approach. It was judged that the M-C model was sufficiently accurate for practical purposes. The mechanical property of interest to attenuate dynamic loads using a seismic buffer was the buffer stiffness, defined as K = E/t (E = buffer elastic modulus, t = buffer thickness). For the range of parameters investigated in this study, K ≤ 50 MN/m³ was observed to be the practical range for the optimal design of these systems. Parametric numerical analyses were performed to generate design charts that can be used for the preliminary design of these systems. A new high-capacity shaking table facility was constructed at RMC that can be used to study the seismic performance of earth structures. Reduced-scale models of geosynthetic reinforced soil (GRS) walls were built on this shaking table and then subjected to simulated earthquake loading conditions. In some shaking table tests, the combined use of EPS geofoam and horizontal geosynthetic reinforcement layers was investigated. Numerical models were developed using the program FLAC together with the ELM and M-C constitutive models. Physical and numerical results were compared against predicted values using analysis methods found in the journal literature and in current North American design guidelines.
The comparison shows that current Mononobe-Okabe (M-O) based analysis methods could not consistently satisfactorily predict measured reinforcement connection load distributions at all elevations under both static and dynamic loading conditions. The results from GRS model wall tests with combined EPS geofoam and geosynthetic reinforcement layers show that the inclusion of a EPS geofoam layer behind the GRS wall face can reduce earth loads acting on the wall facing to values well below those recorded for conventional GRS wall model configurations.

  14. The space shuttle payload planning working groups. Volume 8: Earth and ocean physics

    NASA Technical Reports Server (NTRS)

    1973-01-01

The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule, estimated costs, and the mission model.

  15. Solving LR Conflicts Through Context Aware Scanning

    NASA Astrophysics Data System (ADS)

    Leon, C. Rodriguez; Forte, L. Garcia

    2011-09-01

This paper presents a new algorithm to compute the exact list of tokens expected by any LR syntax analyzer at any point of the scanning process. The lexer can, at any time, compute the exact list of valid tokens and return only tokens in this set. In the case that more than one matching token is in the valid set, the lexer can resort to a nested LR parser to disambiguate. Allowing nested LR parsing requires some slight modifications when building the LR parsing tables. We also show how LR parsers can parse conflictive and inherently ambiguous languages using a combination of nested parsing and context-aware scanning. These expanded lexical analyzers can be generated from high-level specifications.
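
A minimal sketch of the core idea, computing the expected-token set from the LR ACTION table and filtering the lexer's candidates against it; the states, tokens, and grammar below are invented for illustration:

```python
# Context-aware scanning against an LR ACTION table (invented grammar).
# ACTION maps (state, lookahead token) to a parser action.
ACTION = {
    (0, "NUM"): ("shift", 2),
    (0, "ID"): ("shift", 3),
    (2, "PLUS"): ("shift", 4),
    (2, "EOF"): ("accept", None),
}

def expected_tokens(state):
    """Exact set of tokens the LR parser can accept in `state`."""
    return {tok for (s, tok) in ACTION if s == state}

def scan(candidates, state):
    """Keep only matching tokens valid in the current state; more than one
    survivor is the case where a nested LR parse would disambiguate."""
    valid = expected_tokens(state)
    return [tok for tok in candidates if tok in valid]

print(scan(["NUM", "ID", "PLUS"], 0))   # prints ['NUM', 'ID']: PLUS is not expected in state 0
```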

  16. Look-up-table approach for leaf area index retrieval from remotely sensed data based on scale information

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaohua; Li, Chuanrong; Tang, Lingli

    2018-03-01

Leaf area index (LAI) is a key structural characteristic of vegetation and plays a significant role in global change research. Several methods and remotely sensed data have been evaluated for LAI estimation. This study aimed to evaluate the suitability of the look-up-table (LUT) approach for crop LAI retrieval from Satellite Pour l'Observation de la Terre (SPOT)-5 data and establish an LUT approach for LAI inversion based on scale information. The LAI inversion result was validated by in situ LAI measurements, indicating that the LUT generated based on the PROSAIL model (the PROSPECT leaf optical properties model coupled with the SAIL scattering-by-arbitrarily-inclined-leaves model) was suitable for crop LAI estimation, with a root mean square error (RMSE) of ~0.31 m²/m² and determination coefficient (R²) of 0.65. The scale effect of crop LAI was analyzed based on Taylor expansion theory, indicating that when the SPOT data were aggregated by 200 × 200 pixels, the relative error was significant at 13.7%. Finally, an LUT method integrated with scale information is proposed in this article, improving the inversion accuracy to an RMSE of 0.20 m²/m² and R² of 0.83.
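
A toy version of the LUT retrieval loop, with a synthetic saturating canopy model standing in for PROSAIL and assumed band positions; retrieval selects the LUT entry minimising the RMSE against the observed reflectance:

```python
import numpy as np

# Toy look-up-table inversion for LAI. The canopy model and band positions
# are invented stand-ins for PROSAIL and SPOT-5; values are illustrative.
lut_lai = np.linspace(0.0, 6.0, 61)                  # candidate LAI values
bands = np.array([0.55, 0.65, 0.84])                 # assumed band centres, um

def toy_canopy_model(lai):
    # stand-in for PROSAIL: reflectance saturates as LAI grows
    return 0.05 + 0.4 * (1 - np.exp(-0.5 * lai)) * bands

lut_spectra = np.array([toy_canopy_model(l) for l in lut_lai])

observed = toy_canopy_model(2.5) + 0.002             # observation with slight bias
rmse = np.sqrt(np.mean((lut_spectra - observed) ** 2, axis=1))
retrieved_lai = lut_lai[np.argmin(rmse)]             # best-fitting LUT entry
```

The same loop generalises directly: a denser LUT sampled over more model parameters trades memory for inversion accuracy.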

  17. Flow and heat transfer in water based liquid film fluids dispensed with graphene nanoparticles

    NASA Astrophysics Data System (ADS)

    Zuhra, Samina; Khan, Noor Saeed; Khan, Muhammad Altaf; Islam, Saeed; Khan, Waris; Bonyah, Ebenezer

    2018-03-01

The unsteady flow and heat transfer characteristics of electrically conducting, water-based thin liquid film non-Newtonian (Casson and Williamson) nanofluids dispensed with graphene nanoparticles past a stretching sheet are considered in the presence of a transverse magnetic field and a non-uniform heat source/sink. Embedding the graphene nanoparticles effectively amplifies the thermal conductivity of the Casson and Williamson nanofluids. Ordinary differential equations together with the boundary conditions are obtained through similarity variables from the governing equations of the problem and are solved by the homotopy analysis method (HAM). The solutions are presented graphically to illustrate the influence of each parameter, and the convergence of the HAM solution for the linear operators is established. A favorable comparison with previously published work confirms the present results. The skin friction coefficient and Nusselt number are presented in tables and graphs, which validate the achieved results and demonstrate that the thin liquid film results of this study are in close agreement with those reported in the literature. Results obtained by HAM and the residual errors are evaluated numerically, given in tables and also depicted graphically, showing the accuracy of the present work.

  18. Comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry peak sorting algorithm.

    PubMed

    Oh, Cheolhwan; Huang, Xiaodong; Regnier, Fred E; Buck, Charles; Zhang, Xiang

    2008-02-01

    We report a novel peak sorting method for the two-dimensional gas chromatography/time-of-flight mass spectrometry (GC x GC/TOF-MS) system. The objective of peak sorting is to recognize peaks from the same metabolite occurring in different samples from thousands of peaks detected in the analytical procedure. The developed algorithm is based on the fact that the chromatographic peaks for a given analyte have similar retention times in all of the chromatograms. Raw instrument data are first processed by ChromaTOF (Leco) software to provide the peak tables. Our algorithm achieves peak sorting by utilizing the first- and second-dimension retention times in the peak tables and the mass spectra generated during the process of electron impact ionization. The algorithm searches the peak tables for the peaks generated by the same type of metabolite using several search criteria. Our software also includes options to eliminate non-target peaks from the sorting results, e.g., peaks of contaminants. The developed software package has been tested using a mixture of standard metabolites and another mixture of standard metabolites spiked into human serum. Manual validation demonstrates high accuracy of peak sorting with this algorithm.
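
The sorting criteria can be sketched as a simple predicate: two peaks from different samples are assigned to the same metabolite when both retention times agree within a window and their mass spectra are similar. The tolerances and spectra below are invented for illustration:

```python
import math

# Toy peak-sorting predicate for GC x GC/TOF-MS peak tables.
# Tolerances and spectra are invented; real criteria come from the study.
RT1_TOL, RT2_TOL, MIN_SIMILARITY = 5.0, 0.1, 0.9   # s, s, cosine similarity

def cosine(a, b):
    """Cosine similarity of two sparse mass spectra ({m/z: intensity})."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def same_metabolite(p, q):
    return (abs(p["rt1"] - q["rt1"]) <= RT1_TOL
            and abs(p["rt2"] - q["rt2"]) <= RT2_TOL
            and cosine(p["spectrum"], q["spectrum"]) >= MIN_SIMILARITY)

ref = {"rt1": 312.0, "rt2": 1.52, "spectrum": {73: 100, 147: 60, 205: 20}}
hit = {"rt1": 314.5, "rt2": 1.55, "spectrum": {73: 95, 147: 65, 205: 18}}
miss = {"rt1": 450.0, "rt2": 0.80, "spectrum": {91: 100, 119: 40}}
```

Sweeping this predicate over the peak tables of all samples groups recurring peaks; contaminant peaks would be dropped by an additional exclusion list, as the abstract notes.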

  19. A robust and accurate numerical method for transcritical turbulent flows at supercritical pressure with an arbitrary equation of state

    NASA Astrophysics Data System (ADS)

    Kawai, Soshi; Terashima, Hiroshi; Negishi, Hideyo

    2015-11-01

    This paper addresses issues in high-fidelity numerical simulations of transcritical turbulent flows at supercritical pressure. The proposed strategy builds on a tabulated look-up table method based on REFPROP database for an accurate estimation of non-linear behaviors of thermodynamic and fluid transport properties at the transcritical conditions. Based on the look-up table method we propose a numerical method that satisfies high-order spatial accuracy, spurious-oscillation-free property, and capability of capturing the abrupt variation in thermodynamic properties across the transcritical contact surface. The method introduces artificial mass diffusivity to the continuity and momentum equations in a physically-consistent manner in order to capture the steep transcritical thermodynamic variations robustly while maintaining spurious-oscillation-free property in the velocity field. The pressure evolution equation is derived from the full compressible Navier-Stokes equations and solved instead of solving the total energy equation to achieve the spurious pressure oscillation free property with an arbitrary equation of state including the present look-up table method. Flow problems with and without physical diffusion are employed for the numerical tests to validate the robustness, accuracy, and consistency of the proposed approach.
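
A minimal sketch of the tabulated look-up idea, with a smooth synthetic property surface standing in for the REFPROP data: the property is pre-computed on a (pressure, temperature) grid and queried by bilinear interpolation. All grid ranges and values are invented:

```python
import numpy as np

# Toy stand-in for a tabulated property look-up. The density surface below
# is synthetic; a real solver would pre-tabulate REFPROP evaluations.
p_grid = np.linspace(5e6, 10e6, 6)        # Pa
t_grid = np.linspace(100.0, 400.0, 7)     # K

def fake_density(p, t):                   # smooth synthetic surface (kg/m^3-like)
    return 1000.0 * p / (287.0 * t) * 1e-4

table = fake_density(p_grid[:, None], t_grid[None, :])   # pre-computed grid

def lookup(p, t):
    """Bilinear interpolation of the tabulated property at (p, t)."""
    i = np.searchsorted(p_grid, p) - 1
    j = np.searchsorted(t_grid, t) - 1
    fp = (p - p_grid[i]) / (p_grid[i + 1] - p_grid[i])
    ft = (t - t_grid[j]) / (t_grid[j + 1] - t_grid[j])
    return ((1 - fp) * (1 - ft) * table[i, j] + fp * (1 - ft) * table[i + 1, j]
            + (1 - fp) * ft * table[i, j + 1] + fp * ft * table[i + 1, j + 1])
```

The pay-off in the paper's setting is that strongly non-linear real-fluid properties are evaluated at table-lookup cost inside the flow solver; near-critical regions would need a far finer grid than this sketch.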

  20. Verification and validation of a reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  1. A robust and accurate numerical method for transcritical turbulent flows at supercritical pressure with an arbitrary equation of state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawai, Soshi, E-mail: kawai@cfd.mech.tohoku.ac.jp; Terashima, Hiroshi; Negishi, Hideyo

    2015-11-01

    This paper addresses issues in high-fidelity numerical simulations of transcritical turbulent flows at supercritical pressure. The proposed strategy builds on a tabulated look-up table method based on the REFPROP database for an accurate estimation of the non-linear behaviors of thermodynamic and fluid transport properties at transcritical conditions. Based on the look-up table method we propose a numerical method that satisfies high-order spatial accuracy, the spurious-oscillation-free property, and the capability of capturing the abrupt variation in thermodynamic properties across the transcritical contact surface. The method introduces artificial mass diffusivity to the continuity and momentum equations in a physically-consistent manner in order to capture the steep transcritical thermodynamic variations robustly while maintaining the spurious-oscillation-free property in the velocity field. The pressure evolution equation is derived from the full compressible Navier–Stokes equations and solved instead of the total energy equation to achieve the spurious-pressure-oscillation-free property with an arbitrary equation of state, including the present look-up table method. Flow problems with and without physical diffusion are employed for the numerical tests to validate the robustness, accuracy, and consistency of the proposed approach.
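    The tabulated-property idea above can be illustrated with a minimal bilinear look-up table on a pressure-temperature grid. The class, grid, and test property below are invented for illustration; the paper itself tabulates REFPROP data, not this toy function.

```python
import numpy as np

# Hypothetical sketch: a bilinear look-up table for one fluid property
# (e.g., density) on a pressure-temperature grid. The REFPROP-based table
# in the paper is far denser and covers many properties; this only
# illustrates the interpolation mechanics.

class PropertyTable:
    def __init__(self, p_grid, t_grid, values):
        # p_grid, t_grid: 1-D ascending arrays; values[i, j] = property at
        # pressure p_grid[i] and temperature t_grid[j]
        self.p, self.t, self.v = p_grid, t_grid, values

    def lookup(self, p, t):
        # Locate the enclosing grid cell, then interpolate bilinearly.
        i = int(np.clip(np.searchsorted(self.p, p) - 1, 0, len(self.p) - 2))
        j = int(np.clip(np.searchsorted(self.t, t) - 1, 0, len(self.t) - 2))
        fp = (p - self.p[i]) / (self.p[i + 1] - self.p[i])
        ft = (t - self.t[j]) / (self.t[j + 1] - self.t[j])
        return ((1 - fp) * (1 - ft) * self.v[i, j]
                + fp * (1 - ft) * self.v[i + 1, j]
                + (1 - fp) * ft * self.v[i, j + 1]
                + fp * ft * self.v[i + 1, j + 1])
```

    A bilinear table of this kind reproduces any function that is linear in p and t exactly, which makes the interpolation easy to sanity-check.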

  2. Ergodicity of the generalized lemon billiards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jingyu; Mohr, Luke; Zhang, Hong-Kun, E-mail: hongkun@math.umass.edu

    2013-12-15

    In this paper, we study a two-parameter family of convex billiard tables obtained by taking the intersection of two round disks (with different radii) in the plane. These tables generalize the one-parameter family of lemon-shaped billiards. Initially, there is only one ergodic table among all lemon tables. In our generalized family, we observe numerically the prevalence of ergodicity among small perturbations of that table. Numerical estimates of the mixing rate of the billiard dynamics on some ergodic tables are also provided.

  3. The development and validation of different decision-making tools to predict urine culture growth out of urine flow cytometry parameter.

    PubMed

    Müller, Martin; Seidenberg, Ruth; Schuh, Sabine K; Exadaktylos, Aristomenis K; Schechter, Clyde B; Leichtle, Alexander B; Hautz, Wolf E

    2018-01-01

    Patients presenting with suspected urinary tract infection are common in everyday emergency practice. Urine flow cytometry has replaced microscopic urine evaluation in many emergency departments, but interpretation of the results remains challenging. The aim of this study was to develop and validate tools that predict urine culture growth from urine flow cytometry parameters. This retrospective study included all adult patients who presented to a large emergency department between January and July 2017 with a suspected urinary tract infection and had a urine flow cytometry as well as a urine culture obtained. The objective was to identify urine flow cytometry parameters that reliably predict urine culture growth and mixed flora growth. The data set was split into a training (70%) and a validation set (30%), and different decision-making approaches were developed and validated. Relevant urine culture growth was found in 40.2% of the 613 patients included (mixed flora growth in 7.2%). The numbers of leukocytes and bacteria in flow cytometry were highly associated with urine culture growth, but mixed flora growth could not be sufficiently predicted from the urine flow cytometry parameters. A decision tree, predictive value figures, a nomogram, and a cut-off table to predict urine culture growth from bacteria and leukocyte counts were developed, validated and compared. Urine flow cytometry parameters are insufficient to predict mixed flora growth. However, the prediction of urine culture growth based on bacteria and leukocyte counts is highly accurate, and the developed tools should be used as part of the decision-making process of ordering a urine culture or starting antibiotic therapy if a urogenital infection is suspected.

  4. The development and validation of different decision-making tools to predict urine culture growth out of urine flow cytometry parameter

    PubMed Central

    Seidenberg, Ruth; Schuh, Sabine K.; Exadaktylos, Aristomenis K.; Schechter, Clyde B.; Leichtle, Alexander B.; Hautz, Wolf E.

    2018-01-01

    Objective Patients presenting with suspected urinary tract infection are common in everyday emergency practice. Urine flow cytometry has replaced microscopic urine evaluation in many emergency departments, but interpretation of the results remains challenging. The aim of this study was to develop and validate tools that predict urine culture growth from urine flow cytometry parameters. Methods This retrospective study included all adult patients who presented to a large emergency department between January and July 2017 with a suspected urinary tract infection and had a urine flow cytometry as well as a urine culture obtained. The objective was to identify urine flow cytometry parameters that reliably predict urine culture growth and mixed flora growth. The data set was split into a training (70%) and a validation set (30%), and different decision-making approaches were developed and validated. Results Relevant urine culture growth was found in 40.2% of the 613 patients included (mixed flora growth in 7.2%). The numbers of leukocytes and bacteria in flow cytometry were highly associated with urine culture growth, but mixed flora growth could not be sufficiently predicted from the urine flow cytometry parameters. A decision tree, predictive value figures, a nomogram, and a cut-off table to predict urine culture growth from bacteria and leukocyte counts were developed, validated and compared. Conclusions Urine flow cytometry parameters are insufficient to predict mixed flora growth. However, the prediction of urine culture growth based on bacteria and leukocyte counts is highly accurate, and the developed tools should be used as part of the decision-making process of ordering a urine culture or starting antibiotic therapy if a urogenital infection is suspected. PMID:29474463
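    The cut-off-table idea in the two records above can be sketched as a minimal threshold rule on the two flow-cytometry counts. The cut-off values below are invented placeholders and are NOT the study's published thresholds.

```python
# Hypothetical sketch of a cut-off-table style rule: predict relevant
# urine culture growth when either count reaches its threshold.
# Both threshold defaults are invented for illustration only.

def predict_culture_growth(bacteria_per_ul, leukocytes_per_ul,
                           bact_cutoff=50.0, leuko_cutoff=40.0):
    """Return True if relevant urine culture growth is predicted."""
    return bacteria_per_ul >= bact_cutoff or leukocytes_per_ul >= leuko_cutoff
```

    In practice such a rule would be tuned on the training split and its predictive values reported on the held-out validation split, as the study does for its decision tree, nomogram, and cut-off table.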

  5. Accuracy assessment/validation methodology and results of 2010–11 land-cover/land-use data for Pools 13, 26, La Grange, and Open River South, Upper Mississippi River System

    USGS Publications Warehouse

    Jakusz, J.W.; Dieck, J.J.; Langrehr, H.A.; Ruhser, J.J.; Lubinski, S.J.

    2016-01-11

    Similar to an accuracy assessment (AA), validation involves generating random points based on the total area for each map class. However, instead of collecting field data, two or three individuals not involved with the photo-interpretative mapping separately review each of the points onscreen and record a best-fit vegetation type (or types) for each site. Once the individual analyses are complete, the results are joined together and a comparative analysis is performed. The objective of this initial analysis is to identify areas where the validation results were in agreement (matches) and areas where they were in disagreement (mismatches). The two or three individuals then examine each mismatched site and agree upon a final validation class. (If two vegetation types at a specific site appear to be equally prevalent, the validation team is permitted to assign the site two best-fit vegetation types.) Following the validation team’s comparative analysis of vegetation assignments, the data are entered into a database and compared to the mappers’ vegetation assignments. Agreements and disagreements between the map and validation classes are identified, and a contingency table is produced. This document presents the AA processes/results for Pools 13 and La Grange, as well as the validation processes/results for Pools 13 and 26 and Open River South.
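    The comparison step described above can be sketched as a contingency table of mapper class versus final validation class, with overall agreement as the summary statistic; the class labels used in the example are invented.

```python
from collections import Counter

# Minimal sketch of the map-vs-validation comparison: tally each
# (map class, validation class) pair and compute overall agreement,
# i.e., the fraction of sites where the two classes match.

def contingency(map_classes, val_classes):
    table = Counter(zip(map_classes, val_classes))
    agree = sum(n for (m, v), n in table.items() if m == v)
    return table, agree / len(map_classes)
```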

  6. Comparison of Oregon state highway division Table-1 and Table-2 asphalt : final report.

    DOT National Transportation Integrated Search

    1991-12-01

    The objective of this study was to compare the effect of using the Oregon State Highway Division (OSHD) modified Table-1 asphalts and the OSHD modified Table-2 asphalts in asphalt concrete; the primary factors for comparison were reflective and therm...

  7. Comparative analysis of low-back loading on chiropractors using various workstation table heights and performing various tasks.

    PubMed

    Lorme, Kenneth J; Naqvi, Syed A

    2003-01-01

    There is epidemiologic evidence that chiropractors are a high-risk group for low-back disorders. However, to date there are no known biomechanical studies to determine whether their workstations may be a contributing factor. To investigate whether chiropractors' workstation table height or the tasks they perform make them susceptible to low-back strain. As well as investigating low-back strain, a screening was performed to determine whether chiropractors' upper extremities were at risk for undue strain as workstation table height was varied. Experimental pilot study. A university ergonomic laboratory. An adjustable manipulation table was set at 3 different heights: 465 mm, 665 mm and 845 mm. Each of the 7 volunteer chiropractors was fitted with a triaxial electrogoniometer and was videotaped and photographed for analysis while performing spinal manipulation to the cervical, thoracic, and lumbar spine of a volunteer patient at each workstation table height. Two biomechanical models, one static and one dynamic, were used to record the dependent variables. A screening of various upper extremity variables was also performed with the static model. For the subjects under study, a significant difference was found for the variables maximum sagittal flexion, disk compression force, and ligament strain as table height was varied. For the lumbar and thoracic manipulation tasks, the medium table height (665 mm) was found to create the least low-back strain. For the cervical manipulation task, the high table height (845 mm) was found to be the least straining on the low back. The low table height (465 mm) was the most straining for all tasks. Upper extremities were not significantly affected by changes to table height. Significant differences were found for the task performed for the axial rotational velocity, disk compression force, ligament strain, maximum sagittal flexion, dominant (right) elbow moment, and dominant (right) shoulder moment variables. There was no significant interaction between table height and task performed. Workstation table height was found to have a significant effect on the low-back load of the subjects under study. The results of this study demonstrate an overall unacceptably high amount of sagittal flexion, ligament strain, and disk compression force on the chiropractor subjects in the tasks performed.

  8. A prediction model of compressor with variable-geometry diffuser based on elliptic equation and partial least squares

    PubMed Central

    Yang, Chuanlei; Wang, Yinyan; Wang, Hechun

    2018-01-01

    To achieve a much more extensive intake air flow range of the diesel engine, a variable-geometry compressor (VGC) is introduced into a turbocharged diesel engine. However, due to the variable diffuser vane angle (DVA), the prediction of the performance of the VGC becomes more difficult than for a normal compressor. In the present study, a prediction model comprising an elliptical equation and a PLS (partial least-squares) model was proposed to predict the performance of the VGC. The speed lines of the pressure ratio map and the efficiency map were fitted with the elliptical equation, and the coefficients of the elliptical equation were introduced into the PLS model to build the polynomial relationship between the coefficients and the relative speed and the DVA. Further, the maximal order of the polynomial was investigated in detail to reduce the number of sub-coefficients and simultaneously achieve acceptable fit accuracy. The prediction model was validated with sample data, and in order to present the superiority of compressor performance prediction, the prediction results of this model were compared with those of the look-up table and back-propagation neural networks (BPNNs). The validation and comparison results show that the prediction accuracy of the newly developed model is acceptable, and this model is much more suitable than the look-up table and BPNN methods under the same conditions in VGC performance prediction. Moreover, the newly developed prediction model provides a novel and effective prediction solution for the VGC and can be used to improve the accuracy of the thermodynamic model for turbocharged diesel engines in the future. PMID:29410849

  9. A prediction model of compressor with variable-geometry diffuser based on elliptic equation and partial least squares.

    PubMed

    Li, Xu; Yang, Chuanlei; Wang, Yinyan; Wang, Hechun

    2018-01-01

    To achieve a much more extensive intake air flow range of the diesel engine, a variable-geometry compressor (VGC) is introduced into a turbocharged diesel engine. However, due to the variable diffuser vane angle (DVA), the prediction of the performance of the VGC becomes more difficult than for a normal compressor. In the present study, a prediction model comprising an elliptical equation and a PLS (partial least-squares) model was proposed to predict the performance of the VGC. The speed lines of the pressure ratio map and the efficiency map were fitted with the elliptical equation, and the coefficients of the elliptical equation were introduced into the PLS model to build the polynomial relationship between the coefficients and the relative speed and the DVA. Further, the maximal order of the polynomial was investigated in detail to reduce the number of sub-coefficients and simultaneously achieve acceptable fit accuracy. The prediction model was validated with sample data, and in order to present the superiority of compressor performance prediction, the prediction results of this model were compared with those of the look-up table and back-propagation neural networks (BPNNs). The validation and comparison results show that the prediction accuracy of the newly developed model is acceptable, and this model is much more suitable than the look-up table and BPNN methods under the same conditions in VGC performance prediction. Moreover, the newly developed prediction model provides a novel and effective prediction solution for the VGC and can be used to improve the accuracy of the thermodynamic model for turbocharged diesel engines in the future.
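    As a rough illustration of the speed-line fitting idea (not the paper's exact formulation), a quarter-ellipse speed line (pi/pi0)^2 + (m/a)^2 = 1 is linear in m^2, so its two coefficients can be recovered from sampled map points by ordinary least squares:

```python
import numpy as np

# Illustrative quarter-ellipse model of one compressor speed line:
#     (pi / pi0)**2 + (m / a)**2 = 1
# Squaring makes the relation linear in m**2, so a degree-1 polyfit
# recovers pi0 (pressure ratio at zero flow) and a (flow at pi = 0).

def fit_speed_line(m, pi):
    slope, intercept = np.polyfit(m ** 2, pi ** 2, 1)
    pi0 = float(np.sqrt(intercept))
    a = float(np.sqrt(-intercept / slope))
    return pi0, a

m = np.linspace(0.0, 0.9, 20)
pi = 2.4 * np.sqrt(1.0 - m ** 2)   # synthetic map points with pi0=2.4, a=1.0
pi0_fit, a_fit = fit_speed_line(m, pi)
```

    In the paper's scheme, coefficients fitted per speed line are then related (via PLS) to the relative speed and the diffuser vane angle so that intermediate operating conditions can be predicted.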

  10. The expert explorer: a tool for hospital data visualization and adverse drug event rules validation.

    PubMed

    Băceanu, Adrian; Atasiei, Ionuţ; Chazard, Emmanuel; Leroy, Nicolas

    2009-01-01

    An important part of adverse drug event (ADE) detection is the validation of the clinical cases and the assessment of the decision rules used to detect ADEs. For that purpose, a software tool called "Expert Explorer" has been designed by Ideea Advertising. Anonymized datasets have been extracted from hospitals into a common repository. The tool has three main features. (1) It can display hospital stays in a visual and comprehensive way (diagnoses, drugs, lab results, etc.) using tables and charts. (2) It allows designing and executing dashboards in order to generate knowledge about ADEs. (3) It allows uploading decision rules obtained from data mining. Experts can then review the rules and the hospital stays that match the rules, and finally give their advice using specialized forms. The rules can then be validated, invalidated, or improved (knowledge elicitation phase).

  11. Partial slip effect in the flow of MHD micropolar nanofluid flow due to a rotating disk - A numerical approach

    NASA Astrophysics Data System (ADS)

    Ramzan, Muhammad; Chung, Jae Dong; Ullah, Naeem

    The aim of the present exploration is to study the flow of micropolar nanofluid due to a rotating disk in the presence of a magnetic field and a partial slip condition. The governing coupled partial differential equations are reduced to nonlinear ordinary differential equations using appropriate transformations. The differential equations are solved numerically using the Maple dsolve command with the numeric option, which utilizes the fourth-fifth order Runge-Kutta-Fehlberg technique. A comparison to a previous study is also added to validate the present results. Moreover, the behavior of different parameters on the velocity, microrotation, temperature and concentration of the nanofluid is presented via graphs and tables. It is noted that the slip effect and the magnetic field diminish the velocity and the microrotation or spin component.
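    The Runge-Kutta-Fehlberg scheme used by Maple's numeric dsolve is adaptive; as a minimal fixed-step stand-in, the classical fourth-order Runge-Kutta update can be sketched on a toy problem whose exact solution is known:

```python
import math

# Classical RK4 (fixed step, no adaptive error control) applied to the
# toy problem y' = -y, y(0) = 1, whose exact solution is exp(-t).
# This is a stand-in for the adaptive RKF45 scheme, not a copy of it.

def rk4(f, y0, t0, t1, n):
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

y1 = rk4(lambda t, y: -y, 1.0, 0.0, 1.0, 100)   # should be close to exp(-1)
```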

  12. Aeolian drift sand archives show evidence of Late Holocene groundwater dynamics in NE Belgium

    NASA Astrophysics Data System (ADS)

    Beerten, Koen

    2017-04-01

    The sandy unconfined aquifers of NE Belgium (Kleine Nete catchment, Campine area) underlie a flat and slightly undulating landscape. The area is drained by small rivers that occupy shallow valleys separated by weakly expressed interfluves. Instrumental time series (collected since the 1980s) show that the mean highest groundwater table (MHG) on these interfluves (late winter - early spring) is generally 1-2 m below the surface. For earlier periods there are no systematic observations of groundwater tables in the area. Such information would make it possible to extend the time window for hydrological model validation and verification under different boundary conditions (soil, land-use, climate) and thus build confidence in future hydrological predictions. The sandy interfluves of the Kleine Nete catchment have witnessed strong aeolian morphodynamics during the last few millennia. Many of the podzols that developed during the Holocene became either eroded by wind deflation or buried under drift sand. This situation provides a unique means to study palaeohydrological features, events and processes in such shallow unsaturated zones. Therefore, the aim of this presentation is to explore the potential of pedological, geomorphological and historical archives from drift sand landscapes in the Campine area as proxies for past groundwater tables. The adopted approach includes a wide variety of techniques, such as field descriptions of palaeosol profile morphology, optically stimulated luminescence (OSL) dating of intercalated drift sands, determination of groundwater-controlled blow-out surfaces and observations of surface water bodies on historical maps. The buried podzols often display hydromorphic properties, such as redoximorphic features, vague horizon boundaries and peat development. OSL dating of associated drift sands suggests that a very shallow MHG existed from ca. 6 ka until at least ca. 2 ka. Subsequently, historical maps suggest that groundwater tables started to decline during the second half of the 19th century (ca. 150 a). So far, the aeolian record of palaeohydrological conditions in the Campine area suggests that groundwater tables on interfluves were often shallower during the last few millennia than today, with MHG levels regularly reaching the surface. Since groundwater tables in this area are largely dependent on infiltration, we infer that either evapotranspiration would have been lower, or precipitation would have been higher, during the timeframes considered. The significance of these findings is yet to be understood, given the highly discontinuous and integrated nature of the investigated archives. Future work will focus on expanding the palaeohydrological database and confronting the obtained results with hydro(geo)logical modelling exercises.

  13. VizieR Online Data Catalog: xi Tau UBV and MOST light curves (Nemravova+, 2016)

    NASA Astrophysics Data System (ADS)

    Nemravova, J. A.; Harmanec, P.; Broz, M.; Vokrouhlicky, D.; Mourard, D.; Hummel, C. A.; Cameron, C.; Matthews, J. M.; Bolton, C. T.; Bozic, H.; Chini, R.; Dembsky, T.; Engle, S.; Farrington, C.; Grunhut, J. H.; Guenther, D. B.; Guinan, E. F.; Korcakova, D.; Koubsky, P.; Kiek, R.; Kuschnig, R.; Mayer, P.; McCook, G. P.; Moffat, A. F. J.; Nardetto, N.; Pra, A.; Ribeiro, J.; Rowe, J.; Rucinski, S.; Skoda, P.; Slechta, M.; Tallon-Bosc, I.; Votruba, V.; Weiss, W. W.; Wolf, M.; Zasche, P.; Zavala, R. T.

    2016-05-01

    We present the reduced observations that were used in the study of the quadruple hierarchical binary xi Tauri. The observational material consists of radial-velocity measurements (tabled1.dat), photometric measurements in the MOST filter (tabled2.dat) and in Johnson's U (tabled3.dat), B (tabled4.dat), and V (tabled5.dat), and spectro-interferometric measurements represented by squared visibility moduli (tabled6.dat) and closure phases (tabled7.dat). The description of the reductions is given in Appendices A (the spectroscopy), B (the photometry), and C (the spectro-interferometry). The procedure of radial-velocity measuring is described in Sect. 3.1. Headers of Tables D.1-D.7, published electronically, are also given in Appendix D. (7 data files).

  14. Real-time 4D ERT monitoring of river water intrusion into a former nuclear disposal site using a transient warping-mesh water table boundary (Invited)

    NASA Astrophysics Data System (ADS)

    Johnson, T.; Hammond, G. E.; Versteeg, R. J.; Zachara, J. M.

    2013-12-01

    The Hanford 300 Area, located adjacent to the Columbia River in south-central Washington, USA, is the site of former research and uranium fuel rod fabrication facilities. Waste disposal practices at the site included discharging between 33 and 59 metric tons of uranium over a 40-year period into shallow infiltration galleries, resulting in persistent uranium contamination within the vadose and saturated zones. Uranium transport from the vadose zone to the saturated zone is intimately linked with water table fluctuations and river water intrusion driven by upstream dam operations. As river stage increases, the water table rises into the vadose zone and mobilizes contaminated pore water. At the same time, river water moves inland into the aquifer, and river water chemistry facilitates further mobilization by enabling uranium desorption from contaminated sediments. As river stage decreases, flow moves toward the river, ultimately discharging contaminated water at the river bed. River water specific conductance at the 300 Area varies around 0.018 S/m, whereas groundwater specific conductance varies around 0.043 S/m. This contrast provides the opportunity to monitor groundwater/river water interaction by imaging changes in bulk conductivity within the saturated zone using time-lapse electrical resistivity tomography. Previous efforts have demonstrated this capability, but have also shown that disconnecting regularization constraints at the water table is critical for obtaining meaningful time-lapse images. Because the water table moves with time, the regularization constraints must also be transient to accommodate the water table boundary. This was previously accomplished with 2D time-lapse ERT imaging by using a finely discretized computational mesh within the water table interval, enabling a relatively smooth water table to be defined without modifying the mesh. However, in 3D this approach requires a computational mesh with an untenable number of elements. In order to accommodate the water table boundary in 3D, we propose a time-lapse warping-mesh inversion, whereby mesh elements that traverse the water table are modified to generate a smooth boundary at the known water table position, enabling regularization constraints to be accurately disconnected across the water table boundary at a given time. We demonstrate the approach using a surface ERT array installed adjacent to the Columbia River at the 300 Area, consisting of 352 electrodes and covering an area of approximately 350 m x 350 m. Using autonomous data collection, transmission, and filtering tools coupled with high-performance computing resources, the 4D imaging process is automated and executed in real time. Each time-lapse survey consists of approximately 40,000 measurements, and 4 surveys are collected and processed per day from April 1 to September 30, 2013. The data are inverted on an unstructured tetrahedral mesh that honors LiDAR-based surface topography and comprises approximately 905,000 elements. Imaging results show the dynamic 4D extent of river water intrusion and are validated with well-based fluid conductivity measurements at each monitoring well within the imaging domain.

  15. 28 CFR Appendix B to Part 79 - Blood-Gas Study Tables

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Blood-Gas Study Tables B Appendix B to... COMPENSATION ACT Pt. 79, App. B Appendix B to Part 79—Blood-Gas Study Tables For arterial blood-gas studies... mmHg 65 mmHg or below. Above 50 mmHg Any value. For arterial blood-gas studies performed at test...

  16. 28 CFR Appendix B to Part 79 - Blood-Gas Study Tables

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Blood-Gas Study Tables B Appendix B to... COMPENSATION ACT Pt. 79, App. B Appendix B to Part 79—Blood-Gas Study Tables For arterial blood-gas studies... mmHg 65 mmHg or below. Above 50 mmHg Any value. For arterial blood-gas studies performed at test...

  17. 28 CFR Appendix B to Part 79 - Blood-Gas Study Tables

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Blood-Gas Study Tables B Appendix B to... COMPENSATION ACT Pt. 79, App. B Appendix B to Part 79—Blood-Gas Study Tables For arterial blood-gas studies... mmHg 65 mmHg or below. Above 50 mmHg Any value. For arterial blood-gas studies performed at test...

  18. 28 CFR Appendix B to Part 79 - Blood-Gas Study Tables

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Blood-Gas Study Tables B Appendix B to... COMPENSATION ACT Pt. 79, App. B Appendix B to Part 79—Blood-Gas Study Tables For arterial blood-gas studies... mmHg 65 mmHg or below. Above 50 mmHg Any value. For arterial blood-gas studies performed at test...

  19. 28 CFR Appendix B to Part 79 - Blood-Gas Study Tables

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Blood-Gas Study Tables B Appendix B to... COMPENSATION ACT Pt. 79, App. B Appendix B to Part 79—Blood-Gas Study Tables For arterial blood-gas studies... mmHg 65 mmHg or below. Above 50 mmHg Any value. For arterial blood-gas studies performed at test...

  20. Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.

    PubMed

    Logue, E E; Wing, S

    1986-01-01

    Methodological investigation has suggested that age-risk factor interactions should be more evident in age-of-experience life tables than in follow-up-time tables, due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up-time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age-of-experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in the absence of age interaction. The identification of the more appropriate life-table analysis should ultimately be guided by the nature of the age or time phenomena of scientific interest.
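    The distinction drawn above can be sketched with a toy tabulation of person-years by attained-age band ("age of experience") rather than by follow-up interval; the records used in the example are invented for illustration.

```python
# Hypothetical sketch: tally person-years and deaths by attained-age band.
# Each record is (age_at_entry, years_followed, died). Whole-year steps
# keep the sketch simple; a real life-table analysis handles partial
# years and censoring more carefully.

def person_years_by_attained_age(records, band=10):
    table = {}  # band start -> (person-years, deaths)
    for age0, years, died in records:
        t = 0.0
        while t < years:
            key = int((age0 + t) // band) * band
            step = min(1.0, years - t)
            py, d = table.get(key, (0.0, 0))
            table[key] = (py + step, d)
            t += step
        if died:
            key = int((age0 + years) // band) * band
            py, d = table.get(key, (0.0, 0))
            table[key] = (py, d + 1)
    return table
```

    Grouping instead by age at initial examination would mix attained ages within each stratum over long follow-up, which is exactly the blurring of age interactions that the abstract describes.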

  1. Ada Compiler Validation Summary Report: Certificate Number: 911107W1. 11228 Hewlett-Packard HP 9000 Series 700/800 Ada Compiler, Version 5.35 HP 9000 Series 800 Model 835 = HP 9000 Series 800 Model 835

    DTIC Science & Technology

    1991-11-07

    OPTIONS 'The linker options of this Ada implementation, as described in this Appendix, are provided by the customer, unless specifically noted otherwise...Packard Company Print History The following table lists the printings of this document, together with the respective release dates for each edition. The

  2. Handbook for Conducting Analysis of the Manpower, Personnel, and Training Elements for a MANPRINT Assessment

    DTIC Science & Technology

    1991-04-01

    Traditional LCSMM and the Streamlined LCSMM. The Traditional LCSMM is divided into four phases: Concept Exploration, Demonstration and Validation, Full-Scale...to go faster. Whether that person is a typist, pianist or rifleman, his accuracy is nearly always decreased.) Figure 11 provides a graphic...each). Organizations are divided into two levels: primary and secondary. 44 Table 4 EXAMPLES OF MPT QUESTIONS USED FOR ANALYSIS OF RPV MANPOWER

  3. Exploitation of Self Organization in UAV Swarms for Optimization in Combat Environments

    DTIC Science & Technology

    2008-03-01

    behaviors and entangled hierarchy into Swarmfare [59] UAV simulation environment to include these models. • Validate this new model’s success through...Figure 4.3. The hierarchy of control emerges from the entangled hierarchy of the state relations at the simulation, swarm and rule/behaviors level...majors, major) Abstract Model Types (AMT) Figure A.1: SO Abstract Model Type Table 142 Appendix B. Simulators Comparison Name MATLAB Multi UAV MultiUAV

  4. LDDX: A High Efficiency Air Conditioner for DOD Buildings

    DTIC Science & Technology

    2017-02-01

    or favoring by the Department of Defense. Page Intentionally Left Blank i REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188...failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO... Form 298 (Rev. 8-98) Prescribed by ANSI Std. Z39.18 Page Intentionally Left Blank i COST & PERFORMANCE REPORT Project: EW-201137 TABLE OF

  5. Due Regard Encounter Model Version 1.0

    DTIC Science & Technology

    2013-08-19

    Table No. Page 1 Encounter model categories. 3 2 Geographic domain limits. 8 3 Cut points used for feature quantization. 15 B-1 Validation results. 34...out to the limits of radar coverage [ 8 ]. • Correlated Encounter Model of the National Airspace System (C): A correlated encounter model is used to...Note that no existing model covers encoun- ters between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters

  6. Experimental seismic behavior of a full-scale four-story soft-story wood-frame building with retrofits II: shake table test results

    Treesearch

    John W. van de Lindt; Pouria Bahmani; Gary Mochizuki; Steven E. Pryor; Mikhail Gershfeld; Jingjing Tian; Michael D. Symans; Douglas Rammer

    2016-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. The majority of these buildings were constructed from the 1920s to the 1960s and are prone to collapse during moderate to large earthquakes due to a characteristic deficiency in strength and stiffness in their first story. In order to propose and validate retrofit...

  7. Downwelled longwave surface irradiance data from five sites for the FIRE/SRB Wisconsin Experiment from October 12 through November 2, 1986

    NASA Technical Reports Server (NTRS)

    Whitlock, Charles H.; Cox, Stephen K.; Lecroy, Stuart R.

    1990-01-01

    Tables are presented which show data from five sites in the First ISCCP (International Satellite Cloud Climatology Project) Regional Experiment (FIRE)/Surface Radiation Budget (SRB) Wisconsin experiment region from October 12 through November 2, 1986. A discussion of intercomparison results is also included. The field experiment was conducted for the purposes of both intensive cirrus-cloud measurements and SRB algorithm validation activities.

  8. A prospective, single-arm study on the use of the da Vinci® Table Motion with the Trumpf TS7000dV operating table.

    PubMed

    Morelli, Luca; Palmeri, Matteo; Simoncini, Tommaso; Cela, Vito; Perutelli, Alessandra; Selli, Cesare; Buccianti, Piero; Francesca, Francesco; Cecchi, Massimo; Zirafa, Cristina; Bastiani, Luca; Cuschieri, Alfred; Melfi, Franca

    2018-03-30

    The da Vinci® Table Motion (dVTM) comprises a unique operating table (Trumpf Medical™ TruSystem® 7000dV) capable of isocenter motion and connected wirelessly to the da Vinci Xi® robotic platform, thereby enabling patients to be repositioned without removal of instruments or undocking of the robot. Between May 2015 and October 2015, the first human use of dVTM was carried out in this prospective, single-arm, post-market study in the EU, for which 40 patients from general surgery (GS), urology (U), or gynecology (G) were enrolled prospectively. The primary endpoints of the study were dVTM feasibility, efficacy, and safety. Surgeons from the three specialties obtained targeting success and the required table positioning in all cases. Table movement/repositioning was necessary to gain exposure of the operating field in 106/116 table moves (91.3%), change target in 2/116 (1.7%), achieve hemodynamic relief in 4/116 (3.5%), and improve external access for tumor removal in 4/116 (3.5%). There was a trend toward higher use of tilt and tilt plus Trendelenburg in the GS group (GS vs. U, p = 0.055; GS vs. G, p = 0.054). There were no dVTM safety-related or adverse events. The dVTM with the TruSystem 7000dV operating table in wireless communication with the da Vinci Xi is a safe and effective synergistic combination, which allows repositioning of the patient whenever needed without delaying the execution of the operation. Moreover, it is helpful in avoiding extreme positions and enables the anesthesiologist to provide immediate and effective hemodynamic relief to the patient when needed.

  9. Characteristics of Tables for Disseminating Biobehavioral Results.

    PubMed

    Schneider, Barbara St Pierre; Nagelhout, Ed; Feng, Du

    2018-01-01

    To report the complexity and richness of study variables within biological nursing research, authors often use tables; however, the ease with which consumers understand, synthesize, evaluate, and build upon findings depends partly upon table design. To assess and compare table characteristics within research and review articles published in Biological Research for Nursing and Nursing Research. A total of 10 elements in tables from 48 biobehavioral or biological research or review articles were analyzed. To test six hypotheses, a two-level hierarchical linear model was used for each of the continuous table elements, and a two-level hierarchical generalized linear model was used for each of the categorical table elements. Additionally, the inclusion of probability values in statistical tables was examined. The mean number of tables per article was 3. Tables in research articles were more likely to contain quantitative content, while tables in review articles were more likely to contain both quantitative and qualitative content. Tables in research articles had a greater number of rows, columns, and column-heading levels than tables in review articles. More than one half of statistical tables in research articles had a separate probability column or had probability values within the table, whereas approximately one fourth had probability notes. Authors and journal editorial staff may be generating tables that better depict biobehavioral content than those identified in specific style guidelines. However, authors and journal editorial staff may want to consider table design in terms of audience, including alternative visual displays.

  10. High school chemistry students' learning of the elements, structure, and periodicity of the periodic table: Contributions of inquiry-based activities and exemplary graphics

    NASA Astrophysics Data System (ADS)

    Roddy, Knight Phares, Jr.

    The main research question of this study was: How do selected high school chemistry students' understandings of the elements, structure, and periodicity of the Periodic Table change as they participate in a unit study consisting of inquiry-based activities emphasizing construction of innovative science graphics? The research question was answered using a multiple case study/mixed model design which employed elements of both qualitative and quantitative methodologies during data collection and analyses. The unit study was conducted over a six-week period with 11th-grade students enrolled in a chemistry class. A purposive sample of six students from the class was selected to participate in interviews and concept map co-construction (Wandersee & Abrams, 1993) periodically across the study. The progress of the selected students of the case study was compared to the progress of the class as a whole. The students of the case study were also compared to a group of high school chemistry students at a comparison school. The results show that the students from both schools left traditional instruction on the periodic table (lecture and textbook activities) with a very limited understanding of the topic. They also reveal that the inquiry-based, visual approach of the unit study helped students make significant conceptual progress in their understanding of the periodic table. The pictorial periodic table (which features photographs of the elements), used in conjunction with the graphic technique of data mapping, enhanced students' understanding of the patterns of the physical properties of the elements on the periodic table. The graphic technique of compound mapping helped students learn reactivity patterns between types and groups of elements on the periodic table. The recreation of the periodic table with element cards created from the pictorial periodic table helped students progress in their understanding of periodicity and its key concepts.
The Periodic Table Literacy Rubric (PTLR) proved to be a valuable tool for assessing students' conceptual progress, and helped to identify a critical juncture in the learning of periodicity. In addition, the PTLR rubric's historical-conceptual design demonstrates how the history of science can be used to inform today's science teaching.

  11. Vertex centralities in input-output networks reveal the structure of modern economies

    NASA Astrophysics Data System (ADS)

    Blöchl, Florian; Theis, Fabian J.; Vega-Redondo, Fernando; Fisher, Eric O.'N.

    2011-04-01

    Input-output tables describe the flows of goods and services between the sectors of an economy. These tables can be interpreted as weighted directed networks. At the usual level of aggregation, they contain nodes with strong self-loops and are almost completely connected. We derive two measures of node centrality that are well suited for such networks. Both are based on random walks and have interpretations as the propagation of supply shocks through the economy. Random walk centrality reveals the vertices most immediately affected by a shock. Counting betweenness identifies the nodes where a shock lingers longest. The two measures differ in how they treat self-loops. We apply both to data from a wide set of countries and uncover salient characteristics of the structures of these national economies. We further validate our indices by clustering according to sectors’ centralities. This analysis reveals geographical proximity and similar developmental status.
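
    The random-walk reading of an input-output table described above can be sketched briefly. The 3-sector flow matrix below is invented for illustration (it is not the paper's data), and the stationary visit distribution shown here is only a simple proxy for shock propagation; the paper's counting betweenness measure is not reproduced:

    ```python
    import numpy as np

    # Hypothetical 3-sector flow matrix: rows = supplying sector, columns =
    # receiving sector. Diagonal entries are the strong self-loops typical
    # of input-output tables at the usual aggregation level.
    W = np.array([[5.0, 2.0, 1.0],
                  [1.0, 8.0, 3.0],
                  [2.0, 1.0, 6.0]])

    # Row-normalise to obtain the transition matrix of a random walk
    # on the weighted directed network.
    P = W / W.sum(axis=1, keepdims=True)

    # Stationary distribution via power iteration: how often a randomly
    # propagating supply shock visits each sector in the long run.
    pi = np.full(3, 1.0 / 3.0)
    for _ in range(1000):
        pi = pi @ P
    pi /= pi.sum()
    print(pi)  # visit frequencies, summing to 1
    ```

    Because every entry of the hypothetical matrix is positive, the walk is irreducible and the iteration converges to a unique stationary distribution.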

  12. [Experimental orthopedic surgery: the practical aspects and management].

    PubMed

    Di Denia, P; Caligiuri, G; Guzzardella, G A; Fini, M; Giardino, R

    1996-01-01

    Public and private administrations are increasingly interested in the funds granted to scientific research projects. A quantitative analysis of the costs of experimental research in all its phases is mandatory for any optimization process. The aim of this paper is to define the practical and economic aspects of the experimental 'in vivo' models designed for the validation of biomaterials, with particular respect to the managerial bookkeeping of consumer goods, based on the experience of our Institute. Tables were created in order to quantify the resources needed to perform experimental 'in vivo' models. These tables represent a reliable tool for continuous monitoring of managerial costs for the current year and for accurate budget planning for future years, taking into account the experimental projects in progress and the planned research. A business-like organization of public research facilities may lead to an optimization of costs, easier achievement of national and international funding, and increased partnership with private clients.

  13. Improvements to a global-scale groundwater model to estimate the water table across New Zealand

    NASA Astrophysics Data System (ADS)

    Westerhoff, Rogier; Miguez-Macho, Gonzalo; White, Paul

    2017-04-01

    Groundwater models at the global scale have become increasingly important in recent years to assess the effects of climate change and groundwater depletion. However, these global-scale models are typically not used for studies at the catchment scale, because they are simplified and too spatially coarse. In this study, we improved the global-scale Equilibrium Water Table (EWT) model, so it could better assess water table depth and water table elevation at the national scale for New Zealand. The resulting National Water Table (NWT) model used improved input data (i.e., national input data of terrain, geology, and recharge) and model equations (e.g., a hydraulic conductivity–depth relation). The NWT model produced maps of the water table that identified the main alluvial aquifers with fine spatial detail. Two regional case studies at the catchment scale demonstrated excellent correlation between the water table elevation and observations of hydraulic head. The NWT water tables are an improved estimate relative to the EWT model: in the two case studies the NWT model provided a better approximation to the observed water table for deep aquifers, and the improved resolution of the model provided the capability to fill the gaps in data-sparse areas. This national model calculated water table depth and elevation across regional jurisdictions. Therefore, the model is relevant where trans-boundary issues, such as source protection and catchment boundary definition, occur. The NWT model also has the potential to constrain the uncertainty of catchment-scale models, particularly where data are sparse. Shortcomings of the NWT model are caused by the inaccuracy of input data and the simplified model properties. Future research should focus on improved estimation of input data (e.g., hydraulic conductivity and terrain). However, more advanced catchment-scale groundwater models should be used where groundwater flow is dominated by confining layers and fractures.
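
    In global water table models of this family, a hydraulic conductivity–depth relation is commonly an exponential decay with depth. The sketch below shows that generic form only; the parameter values are illustrative and the NWT model's exact parameterisation is not reproduced here:

    ```python
    import math

    def k_at_depth(k0, z, f):
        """Generic exponential decay of hydraulic conductivity with depth.

        k0 : conductivity at the surface (e.g., m/day)
        z  : depth below the surface (m)
        f  : e-folding depth (m) controlling how fast conductivity decays
        """
        return k0 * math.exp(-z / f)

    print(k_at_depth(1.0, 0.0, 50.0))   # surface value: k0 itself
    print(k_at_depth(1.0, 50.0, 50.0))  # one e-folding depth down: ~0.368 * k0
    ```

    In such models the e-folding depth is often tied to terrain slope, which is one way national-scale terrain data can feed into the conductivity field.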

  14. A Smartphone Application for Customized Frequency Table Selection in Cochlear Implants.

    PubMed

    Jethanamest, Daniel; Azadpour, Mahan; Zeman, Annette M; Sagi, Elad; Svirsky, Mario A

    2017-09-01

    A novel smartphone-based software application can facilitate self-selection of frequency allocation tables (FAT) in postlingually deaf cochlear implant (CI) users. CIs use FATs to represent the tonotopic organization of a normal cochlea. Current CI fitting methods typically use a standard FAT for all patients regardless of individual differences in cochlear size and electrode location. In postlingually deaf patients, different amounts of mismatch can result between the frequency-place function they experienced when they had normal hearing and the frequency-place function that results from the standard FAT. For some CI users, an alternative FAT may enhance sound quality or speech perception. Currently, no widely available tools exist to aid real-time selection of different FATs. This study aims to develop a new smartphone tool for this purpose and to evaluate speech perception and sound quality measures in a pilot study of CI subjects using this application. A smartphone application for a widely available mobile platform (iOS) was developed to serve as a preprocessor of auditory input to a clinical CI speech processor and enable interactive real-time selection of FATs. The application's output was validated by measuring electrodograms for various inputs. A pilot study was conducted in six CI subjects. Speech perception was evaluated using word recognition tests. All subjects successfully used the portable application with their clinical speech processors to experience different FATs while listening to running speech. The users were all able to select one table that they judged provided the best sound quality. All subjects chose a FAT different from the standard FAT in their everyday clinical processor. Using the smartphone application, the mean consonant-nucleus-consonant score with the default FAT selection was 28.5% (SD 16.8) and 29.5% (SD 16.4) when using a self-selected FAT. 
A portable smartphone application enables CI users to self-select frequency allocation tables in real time. Even though the self-selected FATs that were deemed to have better sound quality were only tested acutely (i.e., without long-term experience with them), speech perception scores were not inferior to those obtained with the clinical FATs. This software application may be a valuable tool for improving future methods of CI fitting.

  15. Virtual test: A student-centered software to measure student's critical thinking on human disease

    NASA Astrophysics Data System (ADS)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research, motivated by the importance of computer-based tests that use the elements and sub-elements of critical thinking. The aim of this study is the development of multiple-choice items, delivered through student-centered software, to measure critical thinking. The instruments used to collect data were (1) a construct-validity sheet completed by expert judges (a lecturer and a medical doctor) and a professional judge (a science teacher); and (2) a test-legibility sheet completed by a science teacher and junior high school students. Participants consisted of science teachers, a lecturer, and a medical doctor as validators, and students as respondents. The results describe the characteristics of the virtual test used to measure students' critical thinking on human disease, and analyze the legibility test by students and science teachers, the expert judgments by science teachers and the medical doctor, and the trial of the virtual test at a junior high school. Overall, the analysis showed that the multiple-choice items measuring critical thinking were built from the eight elements and 26 sub-elements developed by Inch et al., were complemented by relevant information, and had validity and reliability rated above "sufficient." Specific characteristics of the items include information presented in the form of science comics, tables, figures, articles, and video; correct language structure; cited sources; and questions that guide students toward logical critical thinking.

  16. Context Matters: The Experience of 14 Research Teams in Systematically Reporting Contextual Factors Important for Practice Change

    PubMed Central

    Tomoaia-Cotisel, Andrada; Scammon, Debra L.; Waitzman, Norman J.; Cronholm, Peter F.; Halladay, Jacqueline R.; Driscoll, David L.; Solberg, Leif I.; Hsu, Clarissa; Tai-Seale, Ming; Hiratsuka, Vanessa; Shih, Sarah C.; Fetters, Michael D.; Wise, Christopher G.; Alexander, Jeffrey A.; Hauser, Diane; McMullen, Carmit K.; Scholle, Sarah Hudson; Tirodkar, Manasi A.; Schmidt, Laura; Donahue, Katrina E.; Parchman, Michael L.; Stange, Kurt C.

    2013-01-01

    PURPOSE We aimed to advance the internal and external validity of research by sharing our empirical experience and recommendations for systematically reporting contextual factors. METHODS Fourteen teams conducting research on primary care practice transformation retrospectively considered contextual factors important to interpreting their findings (internal validity) and transporting or reinventing their findings in other settings/situations (external validity). Each team provided a table or list of important contextual factors and interpretive text included as appendices to the articles in this supplement. Team members identified the most important contextual factors for their studies. We grouped the findings thematically and developed recommendations for reporting context. RESULTS The most important contextual factors sorted into 5 domains: (1) the practice setting, (2) the larger organization, (3) the external environment, (4) implementation pathway, and (5) the motivation for implementation. To understand context, investigators recommend (1) engaging diverse perspectives and data sources, (2) considering multiple levels, (3) evaluating history and evolution over time, (4) looking at formal and informal systems and culture, and (5) assessing the (often nonlinear) interactions between contextual factors and both the process and outcome of studies. We include a template with tabular and interpretive elements to help study teams engage research participants in reporting relevant context. CONCLUSIONS These findings demonstrate the feasibility and potential utility of identifying and reporting contextual factors. Involving diverse stakeholders in assessing context at multiple stages of the research process, examining their association with outcomes, and consistently reporting critical contextual factors are important challenges for a field interested in improving the internal and external validity and impact of health care research. PMID:23690380

  17. 26 CFR 1.807-1 - Mortality and morbidity tables.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... insurance (active life reserves); accidental death benefits 1959 Accidental Death Benefits Table. 3... tables of period 2 disablement rates and the 1930 to 1950 termination rates of the 1952 Disability Study... reserves) The 1930 to 1950 termination rates of the 1952 Disability study of the Society of Actuaries. 5...

  18. 26 CFR 1.807-1 - Mortality and morbidity tables.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... insurance (active life reserves); accidental death benefits 1959 Accidental Death Benefits Table. 3... tables of period 2 disablement rates and the 1930 to 1950 termination rates of the 1952 Disability Study... reserves) The 1930 to 1950 termination rates of the 1952 Disability study of the Society of Actuaries. 5...

  19. 26 CFR 1.807-1 - Mortality and morbidity tables.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... insurance (active life reserves); accidental death benefits 1959 Accidental Death Benefits Table. 3... tables of period 2 disablement rates and the 1930 to 1950 termination rates of the 1952 Disability Study... reserves) The 1930 to 1950 termination rates of the 1952 Disability study of the Society of Actuaries. 5...

  20. 26 CFR 1.807-1 - Mortality and morbidity tables.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... insurance (active life reserves); accidental death benefits 1959 Accidental Death Benefits Table. 3... tables of period 2 disablement rates and the 1930 to 1950 termination rates of the 1952 Disability Study... reserves) The 1930 to 1950 termination rates of the 1952 Disability study of the Society of Actuaries. 5...

  1. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    PubMed

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.

  2. Calibration and validation of rainfall thresholds for shallow landslide forecasting in Sicily, southern Italy

    NASA Astrophysics Data System (ADS)

    Gariano, S. L.; Brunetti, M. T.; Iovine, G.; Melillo, M.; Peruccacci, S.; Terranova, O.; Vennari, C.; Guzzetti, F.

    2015-01-01

    Empirical rainfall thresholds are tools to forecast the possible occurrence of rainfall-induced shallow landslides. Accurate prediction of landslide occurrence requires reliable thresholds, which need to be properly validated before their use in operational warning systems. We exploited a catalogue of 200 rainfall conditions that have resulted in at least 223 shallow landslides in Sicily, southern Italy, in the 11-year period 2002-2011, to determine regional event duration-cumulated event rainfall (ED) thresholds for shallow landslide occurrence. We computed ED thresholds for different exceedance probability levels and determined the uncertainty associated with the thresholds using a consolidated bootstrap nonparametric technique. We further determined subregional thresholds, and we studied the role of lithology and seasonal periods in the initiation of shallow landslides in Sicily. Next, we validated the regional rainfall thresholds using 29 rainfall conditions that have resulted in 42 shallow landslides in Sicily in 2012. We based the validation on contingency tables, skill scores, and a receiver operating characteristic (ROC) analysis for thresholds at different exceedance probability levels, from 1% to 50%. Validation of rainfall thresholds is hampered by lack of information on landslide occurrence. Therefore, we considered the effects of variations in the contingencies and the skill scores caused by lack of information. Based on the results obtained, we propose a general methodology for the objective identification of a threshold that provides an optimal balance between maximization of correct predictions and minimization of incorrect predictions, including missed and false alarms. We expect that the methodology will increase the reliability of rainfall thresholds, fostering the operational use of validated rainfall thresholds in early warning systems for regional shallow landslide forecasting.
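
    The contingency-table skill scores behind this kind of threshold validation can be sketched in a few lines. The formulas below are the standard forecast-verification definitions; the counts are invented for illustration and are not the study's validation data:

    ```python
    def skill_scores(tp, fn, fp, tn):
        """Basic forecast skill scores from a 2x2 contingency table.

        tp: threshold exceeded and landslides occurred (hit)
        fn: threshold not exceeded but landslides occurred (missed alarm)
        fp: threshold exceeded, no landslides (false alarm)
        tn: threshold not exceeded, no landslides (correct negative)
        """
        pod = tp / (tp + fn)   # probability of detection
        pofd = fp / (fp + tn)  # probability of false detection
        far = fp / (tp + fp)   # false alarm ratio
        hk = pod - pofd        # Hanssen-Kuipers (true skill) statistic
        return {"POD": pod, "POFD": pofd, "FAR": far, "HK": hk}

    # Illustrative numbers only.
    print(skill_scores(tp=38, fn=4, fp=20, tn=300))
    ```

    Sweeping the threshold's exceedance probability level and plotting POD against POFD traces the ROC curve used to balance correct predictions against missed and false alarms.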

  3. Intratester Reliability and Construct Validity of a Hip Abductor Eccentric Strength Test.

    PubMed

    Brindle, Richard A; Ebaugh, David; Milner, Clare E

    2018-06-06

    Side-lying hip abductor strength tests are commonly used to evaluate muscle strength. In a "break" test, the tester applies sufficient force to lower the limb to the table while the patient resists. The peak force is postulated to occur while the leg is lowering, thus representing the participant's eccentric muscle strength. However, it is unclear whether peak force occurs before or after the leg begins to lower. To determine intrarater reliability and construct validity of a hip abductor eccentric strength test. Intrarater reliability and construct validity study. Twenty healthy adults (26 [6] y; 1.66 [0.06] m; 62.2 [8.0] kg) made 2 visits to the laboratory at least 1 week apart. During the hip abductor eccentric strength test, a handheld dynamometer recorded peak force and time to peak force, and limb position was recorded via a motion capture system. Intrarater reliability was determined using intraclass correlation, SEM, and minimal detectable difference. Construct validity was assessed by determining if peak force occurred after the start of the lowering phase using a 1-sample t test. The hip abductor eccentric strength test had substantial intrarater reliability (ICC(3,3) = .88; 95% confidence interval, .65-.95), SEM of 0.9 %BWh, and a minimal detectable difference of 2.5 %BWh. Construct validity was established as peak force occurred 2.1 (0.6) seconds (range: 0.7-3.7 s) after the start of the lowering phase of the test (P ≤ .001). The hip abductor eccentric strength test is a valid and reliable measure of eccentric muscle strength. This test may be used clinically to assess changes in eccentric muscle strength over time.
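
    The reported SEM and minimal detectable difference are linked by standard reliability formulas. The sketch below uses the conventional definitions (SEM = SD·√(1 − ICC); MDD₉₅ = 1.96·√2·SEM) as an assumption, and only the 0.9 %BWh SEM is taken from the abstract:

    ```python
    import math

    def sem_from_icc(sd, icc):
        """Standard error of measurement from between-subject SD and ICC."""
        return sd * math.sqrt(1.0 - icc)

    def mdd_from_sem(sem, z=1.96):
        """Minimal detectable difference at 95% confidence from the SEM."""
        return z * math.sqrt(2.0) * sem

    # With the reported SEM of 0.9 %BWh, the conventional 95% MDD lands
    # close to the reported 2.5 %BWh.
    print(round(mdd_from_sem(0.9), 2))  # 2.49
    ```

    The √2 factor accounts for measurement error in both of the two scores being compared (e.g., baseline and follow-up).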

  4. A compilation of global bio-optical in situ data for ocean-colour satellite applications

    NASA Astrophysics Data System (ADS)

    Valente, André; Sathyendranath, Shubha; Brotas, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; Barlow, Ray; Bélanger, Simon; Berthon, Jean-François; Beşiktepe, Şükrü; Brando, Vittorio; Canuti, Elisabetta; Chavez, Francisco; Claustre, Hervé; Crout, Richard; Frouin, Robert; García-Soto, Carlos; Gibb, Stuart W.; Gould, Richard; Hooker, Stanford; Kahru, Mati; Klein, Holger; Kratzer, Susanne; Loisel, Hubert; McKee, David; Mitchell, Brian G.; Moisan, Tiffany; Muller-Karger, Frank; O'Dowd, Leonie; Ondrusek, Michael; Poulton, Alex J.; Repecaud, Michel; Smyth, Timothy; Sosik, Heidi M.; Twardowski, Michael; Voss, Kenneth; Werdell, Jeremy; Wernand, Marcel; Zibordi, Giuseppe

    2016-06-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).

  5. Mortality table construction

    NASA Astrophysics Data System (ADS)

    Sutawanir

    2015-12-01

    Mortality tables play an important role in actuarial studies such as life annuities, premium determination, premium reserves, pension plan valuation, and pension funding. Some well-known mortality tables are the CSO mortality table, the Indonesian Mortality Table, the Bowers mortality table, and the Japan Mortality Table. For actuarial applications, tables are constructed under different environments such as single decrement, double decrement, and multiple decrement. There are two approaches to mortality table construction: a mathematical approach and a statistical approach. Distribution models and estimation theory are the statistical concepts used in mortality table construction. This article aims to discuss the statistical approach. The distributional assumptions are the uniform distribution of deaths (UDD) and constant force (exponential). Moment estimation and maximum likelihood are used to estimate the mortality parameter. Moment estimation methods are easier to manipulate than maximum likelihood estimation (MLE); however, they do not use the complete mortality data. Maximum likelihood exploits all available information in mortality estimation, although some MLE equations are complicated and must be solved using numerical methods. The article focuses on single decrement estimation using moment and maximum likelihood estimation, and some extensions to double decrement are introduced. A simple dataset is used to illustrate the mortality estimation and the resulting mortality table.
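
    Under the constant-force assumption mentioned above, the single-decrement estimation step can be sketched as follows. The exposure data are invented for illustration; the estimator shown (deaths over total central exposure) is the standard MLE for the force of mortality under constant force:

    ```python
    import math

    # Hypothetical data for one age interval: years lived in the interval by
    # each of 10 lives, and an indicator of whether each life died in it.
    exposure = [1.0, 1.0, 0.4, 1.0, 0.7, 1.0, 1.0, 0.9, 1.0, 1.0]
    died =     [0,   0,   1,   0,   1,   0,   0,   1,   0,   0]

    # Constant-force MLE: mu_hat = total deaths / total central exposure.
    mu_hat = sum(died) / sum(exposure)

    # One-year death probability under constant force: q = 1 - exp(-mu).
    q_hat = 1.0 - math.exp(-mu_hat)
    print(round(mu_hat, 4), round(q_hat, 4))  # 0.3333 0.2835
    ```

    Repeating this estimate age by age, and chaining the resulting q values, yields the survivorship column of the mortality table.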

  6. Field validation of Tasmania's aquaculture industry bounce-diving schedules using Doppler analysis of decompression stress.

    PubMed

    Smart, David R; Van den Broek, Cory; Nishi, Ron; Cooper, P David; Eastman, David

    2014-09-01

    Tasmania's aquaculture industry produces over 40,000 tonnes of fish annually, valued at over AUD500M. Aquaculture divers perform repetitive, short-duration bounce dives in fish pens to depths up to 21 metres' sea water (msw). Past high levels of decompression illness (DCI) may have resulted from these 'yo-yo' dives. This study aimed to assess working divers, using Doppler ultrasonic bubble detection, to determine if yo-yo diving was a risk factor for DCI, determine dive profiles with acceptable risk and investigate productivity improvement. Field data were collected from working divers during bounce diving at marine farms near Hobart, Australia. Ascent rates were less than 18 m·min⁻¹, with routine safety stops (3 min at 3 msw) during the final ascent. The Kisman-Masurel method was used to grade bubbling post dive as a means of assessing decompression stress. In accordance with Defence Research and Development Canada Toronto practice, dives were rejected as excessive risk if more than 50% of scores were over Grade 2. From 2002 to 2008, Doppler data were collected from 150 bounce-dive series (55 divers, 1,110 bounces). Three series of bounce profiles, characterized by in-water times, were validated: 13-15 msw, 10 bounces inside 75 min; 16-18 msw, six bounces inside 50 min; and 19-21 msw, four bounces inside 35 min. All had median bubble grades of 0. Further evaluation validated two successive series of bounces. Bubble grades were consistent with low-stress dive profiles. Bubble grades did not correlate with the number of bounces, but did correlate with ascent rate and in-water time. These data suggest bounce diving was not a major factor causing DCI in Tasmanian aquaculture divers. Analysis of field data has improved industry productivity by increasing the permissible number of bounces, compared to earlier empirically-derived tables, without compromising safety. 
The recommended Tasmanian Bounce Diving Tables provide guidance for bounce diving to a depth of 21 msw, and two successive bounce dive series in a day's diving.

  7. Reflections on Teaching Periodic Table Concepts: A Case Study of Selected Schools in South Africa

    ERIC Educational Resources Information Center

    Mokiwa, Hamza Omari

    2017-01-01

    The Periodic Table of Elements is central to the study of modern Physics and Chemistry. It is however, considered by teachers as difficult to teach. This paper reports on a case study exploring reflections on teaching periodic table concepts in five secondary schools from South Africa. Qualitative methodology of interviews and document analysis…

  8. Design of experiments in medical physics: Application to the AAA beam model validation.

    PubMed

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 chamber on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (below the 3% tolerance level). For all beams, the measured doses agreed within 0.6% across accelerators. Energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Design of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the four accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
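
    The screening idea can be sketched by grouping the computed-versus-measured deviations by factor level and comparing the level means; a factor whose levels differ widely is flagged as influential. This is an illustrative stand-in for the authors' analysis, with invented run data:

```python
from collections import defaultdict

def factor_effects(runs):
    """Mean absolute dose deviation (%) per factor level.

    runs: list of (settings_dict, deviation_percent) pairs. A large
    spread between the level means of a factor flags it as influential,
    in the spirit of a Taguchi-style screening analysis (illustrative,
    not the authors' exact ANOVA).
    """
    by_level = defaultdict(list)
    for settings, deviation in runs:
        for factor, level in settings.items():
            by_level[(factor, level)].append(abs(deviation))
    return {key: sum(vals) / len(vals) for key, vals in by_level.items()}

# invented runs: two factors, two levels each
runs = [
    ({"energy": "6MV", "wedge": "open"}, 0.1),
    ({"energy": "6MV", "wedge": "60deg"}, 0.2),
    ({"energy": "18MV", "wedge": "open"}, 0.6),
    ({"energy": "18MV", "wedge": "60deg"}, 0.7),
]
effects = factor_effects(runs)
# the energy level means differ far more than the wedge level means,
# so energy would be flagged as the influencing parameter
```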

  9. Analysis of thermal processing of table olives using computational fluid dynamics.

    PubMed

    Dimou, A; Panagou, E; Stoforos, N G; Yanniotis, S

    2013-11-01

    In the present work, the thermal processing of table olives in brine in a stationary metal can was studied through computational fluid dynamics (CFD). The flow patterns of the brine and the temperature evolution in the olives and brine during the heating and the cooling cycles of the process were calculated using the CFD code. Experimental temperature measurements at 3 points (2 inside model olive particles and 1 at a point in the brine) in a can (with dimensions of 75 mm × 105 mm) filled with 48 olives in 4% (w/v) brine, initially held at 20 °C, heated in water at 100 °C for 10 min, and thereafter cooled in water at about 20 °C for 10 min, validated model predictions. The distribution of temperature and F-values and the location of the slowest heating zone and the critical point within the product, as far as microbial destruction is concerned, were assessed for several cases. For the cases studied, the critical point was located at the interior of the olives at the 2nd, or between the 1st and the 2nd olive row from the bottom of the container, the exact location being affected by olive size, olive arrangement, and geometry of the container. © 2013 Institute of Food Technologists®
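
    The F-values mentioned above accumulate lethality over the product's temperature history. A generic sketch, with an invented heating curve and placeholder Tref and z values rather than those of the olive study:

```python
def f_value(temps_c, dt_min=1.0, t_ref=100.0, z=10.0):
    """Accumulated lethality F = sum of 10**((T - Tref)/z) * dt.

    temps_c: product temperature history (degrees C) sampled every
    dt_min minutes. Tref and z are illustrative placeholders, not the
    reference values used in the olive study.
    """
    return sum(10.0 ** ((t - t_ref) / z) * dt_min for t in temps_c)

# toy heating curve at the slowest heating zone, one sample per minute
history = [20, 55, 80, 92, 97, 99, 100, 100, 100, 100]
fv = f_value(history)
print(round(fv, 2))
```

The point with the lowest accumulated F over the whole cycle is the critical point for microbial destruction.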

  10. Does Training in Table Creation Enhance Table Interpretation? A Quasi-Experimental Study with Follow-Up

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.; Wong, Kendal

    2016-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents results of a quasi-experimental study with longitudinal follow-up that tested the…

  11. Assessing personal talent determinants in young racquet sport players: a systematic review.

    PubMed

    Faber, Irene R; Bustin, Paul M J; Oosterveld, Frits G J; Elferink-Gemser, Marije T; Nijhuis-Van der Sanden, Maria W G

    2016-01-01

    Since junior performances have little predictive value for future success, other solutions are sought to assess a young player's potential. The objectives of this systematic review are (1) to provide an overview of instruments measuring personal talent determinants of young players in racquet sports, and (2) to evaluate these instruments regarding their validity for talent development. Electronic searches were conducted in PubMed, PsychINFO, Web of Knowledge, ScienceDirect and SPORTDiscus (1990 to 31 March 2014). Search terms represented tennis, table tennis, badminton and squash, the concept of talent, methods of testing and children. Thirty articles with information regarding over 100 instruments were included. Validity evaluation showed that instruments focusing on intellectual and perceptual abilities, and coordinative skills discriminate elite from non-elite players and/or are related to current performance, but their predictive validity is not confirmed. There is moderate evidence that the assessments of mental and goal management skills predict future performance. Data on instruments measuring physical characteristics prohibit a conclusion due to conflicting findings. This systematic review yielded an ambiguous end point. The lack of longitudinal studies precludes verification of the instrument's capacity to forecast future performance. Future research should focus on instruments assessing multidimensional talent determinants and their predictive value in longitudinal designs.

  12. Clinical decision making in response to performance validity test failure in a psychiatric setting.

    PubMed

    Marcopulos, Bernice A; Caillouet, Beth A; Bailey, Christopher M; Tussey, Chriscelyn; Kent, Julie-Ann; Frederick, Richard

    2014-01-01

    This study examined the clinical utility of a performance validity test (PVT) for screening consecutive referrals (N = 436) to a neuropsychology service at a state psychiatric hospital treating both civilly committed and forensic patients. We created a contingency table with Test of Memory Malingering (TOMM) pass/fail (355/81) and secondary gain present/absent (181/255) to examine pass rates associated with patient demographic, clinical and forensic status characteristics. Of the 81 failed PVTs, 48 had secondary gain, defined as active criminal legal charges; 33 failed PVTs with no secondary gain. These individuals tended to be older, female, Caucasian, and civilly committed compared with the group with secondary gain who failed. From estimates of the TOMM false positive rate and true positive rate, we estimated base rates of neurocognitive malingering for our clinical population using the Test Validation Summary (TVS; Frederick & Bowden, 2009). Although PVT failure is clearly more common in a group with secondary gain (31%), there were a number of false positives (11%). Clinical ratings of patients without gain who failed suggested cognitive deficits, behavioral issues, and inattention. Low scores on PVTs in the absence of secondary gain provide useful information on test engagement and can inform clinical decisions about testing.
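
    The base-rate estimation step can be illustrated by inverting the mixture relation p_fail = base * TPR + (1 - base) * FPR. The 31% failure rate and 11% false-positive rate echo the figures reported above; the true positive rate is an assumed value, and this is a sketch of the idea rather than the TVS procedure itself:

```python
def estimated_base_rate(observed_fail_rate, fpr, tpr):
    """Invert p_fail = base*TPR + (1 - base)*FPR to estimate the base
    rate of malingering (a sketch of the idea, not the TVS method)."""
    return (observed_fail_rate - fpr) / (tpr - fpr)

# 31% failures in the secondary-gain group and an 11% false positive
# rate echo the abstract; the 75% true positive rate is assumed
base = estimated_base_rate(0.31, 0.11, 0.75)
print(round(base, 3))
```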

  13. Transmission imaging for integrated PET-MR systems.

    PubMed

    Bowen, Spencer L; Fuin, Niccolò; Levine, Michael A; Catana, Ciprian

    2016-08-07

    Attenuation correction for PET-MR systems continues to be a challenging problem, particularly for body regions outside the head. The simultaneous acquisition of transmission scan based μ-maps and MR images on integrated PET-MR systems may significantly increase the performance of and offer validation for new MR-based μ-map algorithms. For the Biograph mMR (Siemens Healthcare), however, use of conventional transmission schemes is not practical as the patient table and relatively small diameter scanner bore significantly restrict radioactive source motion and limit source placement. We propose a method for emission-free coincidence transmission imaging on the Biograph mMR. The intended application is not for routine subject imaging, but rather to improve and validate MR-based μ-map algorithms; particularly for patient implant and scanner hardware attenuation correction. In this study we optimized source geometry and assessed the method's performance with Monte Carlo simulations and phantom scans. We utilized a Bayesian reconstruction algorithm, which directly generates μ-map estimates from multiple bed positions, combined with a robust scatter correction method. For simulations with a pelvis phantom a single torus produced peak noise equivalent count rates (34.8 kcps) dramatically larger than a full axial length ring (11.32 kcps) and conventional rotating source configurations. Bias in reconstructed μ-maps for head and pelvis simulations was  ⩽4% for soft tissue and  ⩽11% for bone ROIs. An implementation of the single torus source was filled with (18)F-fluorodeoxyglucose and the proposed method quantified for several test cases alone or in comparison with CT-derived μ-maps. A volume average of 0.095 cm(-1) was recorded for an experimental uniform cylinder phantom scan, while a bias of  <2% was measured for the cortical bone equivalent insert of the multi-compartment phantom. 
Single torus μ-maps of a hip implant phantom showed significantly fewer artifacts and improved dynamic range compared with CT results, and differed greatly for highly attenuating materials in the case of the patient table. Use of a fixed torus geometry, in combination with translation of the patient table to perform complete tomographic sampling, generated highly quantitative measured μ-maps and is expected to produce images with significantly higher SNR than competing fixed geometries at matched total acquisition time.

  15. QTLTableMiner++: semantic mining of QTL tables in scientific articles.

    PubMed

    Singh, Gurnoor; Kuzniar, Arnold; van Mulligen, Erik M; Gavai, Anand; Bachem, Christian W; Visser, Richard G F; Finkers, Richard

    2018-05-25

    A quantitative trait locus (QTL) is a genomic region that correlates with a phenotype. Most of the experimental information about QTL mapping studies is described in tables of scientific publications. Traditional text mining techniques aim to extract information from unstructured text rather than from tables. We present QTLTableMiner++ (QTM), a table mining tool that extracts and semantically annotates QTL information buried in (heterogeneous) tables of plant science literature. QTM is a command line tool written in the Java programming language. This tool takes scientific articles from the Europe PMC repository as input and extracts QTL tables using keyword matching and ontology-based concept identification. The tables are further normalized using rules derived from table properties such as captions, column headers and table footers. Furthermore, table columns are classified into three categories, namely column descriptors, properties and values, based on column headers and the data types of cell entries. Abbreviations found in the tables are expanded using the Schwartz and Hearst algorithm. Finally, the content of QTL tables is semantically enriched with domain-specific ontologies (e.g. Crop Ontology, Plant Ontology and Trait Ontology) using the Apache Solr search platform, and the results are stored in a relational database and a text file. The performance of the QTM tool was assessed by precision and recall based on the information retrieved from two manually annotated corpora of open access articles, i.e. QTL mapping studies in tomato (Solanum lycopersicum) and in potato (S. tuberosum). In summary, QTM detected QTL statements in tomato with 74.53% precision and 92.56% recall, and in potato with 82.82% precision and 98.94% recall. QTM is a unique tool that aids in providing QTL information in machine-readable and semantically interoperable formats.
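
    The abbreviation-expansion step names the Schwartz and Hearst algorithm; below is a simplified sketch of its right-to-left matching idea, not QTM's actual implementation:

```python
import re

def best_long_form(short, candidate):
    """Simplified Schwartz & Hearst matching: walk the short form
    right-to-left, pairing each character with text in the candidate."""
    s, c = short.lower(), candidate.lower()
    si, ci = len(s) - 1, len(c) - 1
    while si >= 0:
        ch = s[si]
        if not ch.isalnum():
            si -= 1
            continue
        # the first short-form character must start a word
        while ci >= 0 and (c[ci] != ch or
                           (si == 0 and ci > 0 and c[ci - 1].isalnum())):
            ci -= 1
        if ci < 0:
            return None
        si -= 1
        ci -= 1
    return candidate[ci + 1:]

def expand_abbreviations(text):
    """Find 'long form (SF)' patterns and resolve each long form."""
    pairs = {}
    for match in re.finditer(r"\(([A-Za-z0-9]{2,10})\)", text):
        short = match.group(1)
        words = text[:match.start()].rstrip().split()
        # candidate window: at most min(|SF| + 5, |SF| * 2) preceding words
        window = " ".join(words[-min(len(short) + 5, len(short) * 2):])
        long_form = best_long_form(short, window)
        if long_form:
            pairs[short] = long_form.strip()
    return pairs

pairs = expand_abbreviations("A quantitative trait locus (QTL) is a genomic region.")
print(pairs)  # {'QTL': 'quantitative trait locus'}
```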

  16. Testing of a Shrouded, Short Mixing Stack Gas Eductor Model Using High Temperature Primary Flow.

    DTIC Science & Technology

    1982-10-01

    problem but of less significance than the heated surfaces of shipboard structure. Various types of electronic equipment and sensors carried by a combatant...here was to validate current procedures by comparison with previous data; it was not considered essential to reinstall these sensors or duplicate...Table XIX. Mixing Stack Temperature Data, Model B (thermocouple number, axial position, mixing stack temperature in °F): Uptake 180 850 950

  17. Analysis of Whole-Sky Imager Data to Determine the Validity of PCFLOS models

    DTIC Science & Technology

    1992-12-01

    included in the data sample. 3.1. Data arrangement for an r × c contingency table. 3.2. ARIMA models estimated for each...satellites. This model uses the multidimensional Boehm Sawtooth Wave Model to establish climatic probabilities through repetitive simulations of...analysis techniques to develop an ARIMA model for each direction at the Columbia and Kirtland sites. Then, the models can be compared and analyzed to

  18. Downward shortwave surface irradiance from 17 sites for the FIRE/SRB Wisconsin experiment

    NASA Technical Reports Server (NTRS)

    Whitlock, Charles H.; Hay, John E.; Robinson, David A.; Cox, Stephen K.; Wardle, David I.; Lecroy, Stuart R.

    1990-01-01

    A field experiment was conducted in Wisconsin during Oct. to Nov. 1986 for purposes of both intensive cirrus cloud measurements and SRB algorithm validation activities. The cirrus cloud measurements were part of the FIRE. Tables are presented which show data from 17 sites in the First ISCCP (International Satellite Cloud Climatology Project) Regional Experiment/Surface Radiation Budget (FIRE/SRB) Wisconsin experiment region. A discussion of intercomparison results and calibration inconsistencies is also included.

  19. Strategic Defense Initiative Demonstration/Validation Program Environmental Assessment. Ground-Based Surveillance and Tracking System (GSTS),

    DTIC Science & Technology

    1987-08-01

    conflicts between the facility and local standards, and to evaluate the probability of conflict resulting from any planned expansions. Visual Resources...on staffing and housing for the facility itself is contained in Table 2-7. Additional data on the socio-economic background of Ebeye, including...resources, noise, and socio-economics. As a result of that evaluation, consequences were assigned to one of three categories: insignificant, mitigable, or

  20. Spine surgeon's kinematics during discectomy, part II: operating table height and visualization methods, including microscope.

    PubMed

    Park, Jeong Yoon; Kim, Kyung Hyun; Kuh, Sung Uk; Chin, Dong Kyu; Kim, Keun Su; Cho, Yong Eun

    2014-05-01

    Surgeon spine angle during surgery was studied ergonomically and the kinematics of the surgeon's spine was related to musculoskeletal fatigue and pain. Spine angles varied depending on operating table height and visualization method, and in a previous paper we showed that the use of a loupe and a table height at the midpoint between the umbilicus and the sternum are optimal for reducing musculoskeletal loading. However, no studies have previously included a microscope as a possible visualization method. The objective of this study is to assess differences in surgeon spine angles depending on operating table height and visualization method, including microscope. We enrolled 18 experienced spine surgeons for this study, who each performed a discectomy using a spine surgery simulator. Three different methods were used to visualize the surgical field (naked eye, loupe, microscope) and three different operating table heights (anterior superior iliac spine, umbilicus, the midpoint between the umbilicus and the sternum) were studied. Whole spine angles were compared for three different views during the discectomy simulation: midline, ipsilateral, and contralateral. A 16-camera optoelectronic motion analysis system was used, and 16 markers were placed from the head to the pelvis. Lumbar lordosis, thoracic kyphosis, cervical lordosis, and occipital angle were compared between the different operating table heights and visualization methods as well as a natural standing position. Whole spine angles differed significantly depending on visualization method. All parameters were closer to natural standing values when discectomy was performed with a microscope, and there were no differences between the naked eye and the loupe. Whole spine angles were also found to differ from the natural standing position depending on operating table height, and became closer to natural standing position values as the operating table height increased, independent of the visualization method. 
When using a microscope, lumbar lordosis, thoracic kyphosis, and cervical lordosis showed no differences according to table heights above the umbilicus. This study suggests that the use of a microscope and a table height above the umbilicus are optimal for reducing surgeon musculoskeletal fatigue.

  1. PP087. Multicenter external validation and recalibration of a model for preconceptional prediction of recurrent early-onset preeclampsia.

    PubMed

    van Kuijk, Sander; Delahaije, Denise; Dirksen, Carmen; Scheepers, Hubertina C J; Spaanderman, Marc; Ganzevoort, W; Duvekot, Hans; Oudijk, M A; van Pampus, M G; Dadelszen, Peter von; Peeters, Louis L; Smiths, Luc

    2013-04-01

    In an earlier paper we reported on the development of a model aimed at the prediction of preeclampsia recurrence, based on variables obtained before the next pregnancy (fasting glucose, BMI, previous birth of a small-for-gestational-age infant, duration of the previous pregnancy, and the presence of hypertension). The aim here was to externally validate and recalibrate this prediction model for the risk of recurrence of early-onset preeclampsia. We collected data on the course and outcome of the next ongoing pregnancy in 229 women with a history of early-onset preeclampsia. Recurrence was defined as preeclampsia requiring delivery before 34 weeks. We computed the risk of recurrence and assessed model performance. In addition, we constructed a table comparing sensitivity, specificity, and predictive values for different suggested risk thresholds. Early-onset preeclampsia recurred in 6.6% of women. The model systematically underestimated recurrence risk. The model's discriminative ability was modest; the area under the receiver operating characteristic curve was 58.9% (95% CI: 45.1-72.7). Using relevant risk thresholds, the model created groups that were only moderately different in terms of their average risk of recurrent preeclampsia (Table 1). Compared with an AUC of 65% in the development cohort, the discriminative ability of the model was diminished. It had inadequate performance to classify women into clinically relevant risk groups. Copyright © 2013. Published by Elsevier B.V.
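
    The threshold-comparison table described above can be sketched as follows, with invented predicted risks and outcomes rather than the study cohort:

```python
def threshold_table(risks, outcomes, thresholds):
    """Sensitivity, specificity, PPV and NPV at candidate risk thresholds.

    risks: predicted recurrence probabilities; outcomes: 1 = recurrence
    before 34 weeks. The data below are invented, not the study cohort.
    """
    rows = []
    for t in thresholds:
        tp = sum(1 for r, y in zip(risks, outcomes) if r >= t and y)
        fp = sum(1 for r, y in zip(risks, outcomes) if r >= t and not y)
        fn = sum(1 for r, y in zip(risks, outcomes) if r < t and y)
        tn = sum(1 for r, y in zip(risks, outcomes) if r < t and not y)
        rows.append({
            "threshold": t,
            "sensitivity": tp / (tp + fn) if tp + fn else None,
            "specificity": tn / (tn + fp) if tn + fp else None,
            "ppv": tp / (tp + fp) if tp + fp else None,
            "npv": tn / (tn + fn) if tn + fn else None,
        })
    return rows

risks = [0.02, 0.05, 0.08, 0.12, 0.20, 0.30]
outcomes = [0, 0, 0, 1, 0, 1]
table = threshold_table(risks, outcomes, [0.05, 0.10, 0.15])
for row in table:
    print(row)
```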

  2. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers

    PubMed Central

    Dobie, Robert A; Wojcik, Nancy C

    2015-01-01

    Objectives The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999–2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Methods Regression analysis was used to derive new age-correction values using audiometric data from the 1999–2006 US NHANES. Using the NHANES median, better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20–75 years. Results The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20–75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61–75 years. Conclusions Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA. 
The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to include older workers. PMID:26169804
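
    The regression step (fitting median thresholds to simple polynomial equations in age) can be sketched with an ordinary least-squares quadratic fit; the median values below are invented placeholders, not NHANES data:

```python
def polyfit2(ages, values):
    """Least-squares quadratic fit value ~ c0 + c1*age + c2*age**2 via
    the normal equations (a toy stand-in for the regression analysis)."""
    s = [sum(a ** k for a in ages) for k in range(5)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = [sum(v * a ** k for a, v in zip(ages, values)) for k in range(3)]
    # Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [x - f * y for x, y in zip(A[r], A[i])]
            b[r] -= f * b[i]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    return c

def predict(c, age):
    return c[0] + c[1] * age + c[2] * age ** 2

ages = list(range(20, 76, 5))
# invented median better-ear thresholds in dB HL, loosely age-shaped;
# NOT the NHANES values
medians = [5, 5, 6, 7, 9, 11, 14, 17, 21, 25, 30, 36]
coef = polyfit2(ages, medians)
# age-correction value: predicted threshold shift relative to age 20
correction_60 = predict(coef, 60) - predict(coef, 20)
```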

  4. Chemical food composition: implications for atherosclerosis prevention.

    PubMed

    Scherr, Carlos; Ribeiro, Jorge Pinto

    2011-01-01

    To compare the fatty acid and cholesterol content of food acquired in Brazil with the composition found in the most frequently used reference tables in the country. The fatty acid and cholesterol content of 41 food items frequently consumed in the country, prepared in the various usual ways, was analyzed using specific methodology, and the information was compared with the tables adopted by Unicamp and UNIFESP. According to the Unicamp table, the cholesterol content found in parmesan cheese was 100.7 mg/100 g, while it was 68 mg/100 g in the UNIFESP table, that is, a 48% higher content in the former (p < 0.05). The table compiled in this study showed a cholesterol content 31% lower for yellow cheese (94 mg/100 g vs. 123 mg/100 g, p < 0.05). For whole milk, we found a 52% difference in cholesterol content, while the saturated fat content ranged from 1.4 g/100 g in the Unicamp table to 2.130 g/100 g in our study table (p < 0.05). For some food items, no statistically significant differences were found among the tables. However, when a 1,800-calorie diet was prescribed, the discrepancies among the tables and the lack of information resulted in clinically relevant differences in dietary recommendations. There are important differences between the fatty acid and cholesterol content formally analyzed and the content shown in commonly used tables, and this can compromise recommendations on preventing atherosclerosis. One possible explanation for the differences is that the UNIFESP table is American in origin.

  5. Soil moisture datasets at five sites in the central Sierra Nevada and northern Coast Ranges, California

    USGS Publications Warehouse

    Stern, Michelle A.; Anderson, Frank A.; Flint, Lorraine E.; Flint, Alan L.

    2018-05-03

    In situ soil moisture datasets are important inputs used to calibrate and validate watershed, regional, or statewide modeled and satellite-based soil moisture estimates. The soil moisture dataset presented in this report includes hourly time series of the following: soil temperature, volumetric water content, water potential, and total soil water content. Data were collected by the U.S. Geological Survey at five locations in California: three sites in the central Sierra Nevada and two sites in the northern Coast Ranges. This report provides a description of each of the study areas, procedures and equipment used, processing steps, and time series data from each site in the form of comma-separated values (.csv) tables.

  6. [Raman spectroscopy fluorescence background correction and its application in clustering analysis of medicines].

    PubMed

    Chen, Shan; Li, Xiao-ning; Liang, Yi-zeng; Zhang, Zhi-min; Liu, Zhao-xia; Zhang, Qi-ming; Ding, Li-xia; Ye, Fei

    2010-08-01

    During Raman spectroscopy analysis, organic molecules and contaminations can obscure or swamp Raman signals. The present study starts from Raman spectra of prednisone acetate tablets and glibenclamide tablets, acquired with a BWTek i-Raman spectrometer. The background is corrected with the R package baselineWavelet. Then principal component analysis and random forests are used to perform clustering analysis. Through analysis of the Raman spectra of the two medicines, the accuracy and validity of this background-correction algorithm are checked, and the influence of fluorescence background on Raman spectral clustering analysis is discussed. It is concluded that correcting the fluorescence background is important for further analysis, and an effective background-correction solution is provided for clustering and other analyses.
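
    As a generic illustration of fluorescence background correction, the sketch below uses a simple iterative peak-stripping estimate; this is a stand-in for the idea, not the baselineWavelet algorithm used in the study:

```python
def strip_baseline(spectrum, max_window=20):
    """Estimate a smooth fluorescence background by iterative peak
    stripping (a SNIP-like sketch, NOT the baselineWavelet method)."""
    b = list(spectrum)
    n = len(b)
    for m in range(1, max_window + 1):
        # clip each point to the mean of its m-distant neighbours
        b = [min(b[i], (b[i - m] + b[i + m]) / 2.0)
             if m <= i < n - m else b[i]
             for i in range(n)]
    return b

# toy spectrum: a sloping fluorescence background plus one sharp Raman peak
raw = [10 + 0.5 * i + (50 if i == 30 else 0) for i in range(60)]
baseline = strip_baseline(raw)
corrected = [r - b for r, b in zip(raw, baseline)]
# the sharp peak survives; the sloping background is removed
```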

  7. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is the specification of knowledge from the prose text of the guidelines. We describe a knowledge specification method based on a structured and systematic analysis of the text that allows detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent the algorithm, along with elementary messages of recommendation. Editing tools are also needed to facilitate validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allowed quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the source text of the guidelines, however, still needs improvement. The method used for computerization could help to define a framework usable at the initial step of guideline development, in order to produce guidelines ready for electronic implementation.

  8. Corrigendum to "Misconceptions impairing the validity of the stopping power tables in the SRIM library and suggestions for doing better in the future" [Nucl. Instr. Meth. B 380 (2016) 57-70

    NASA Astrophysics Data System (ADS)

    Wittmaack, Klaus

    2016-12-01

    In the Introduction of the above paper I made joint reference to four previous publications containing tables of electronic stopping cross sections. However, not all of the reports are similar in nature, a fact that I failed to clarify; the error also escaped the attention of the reviewer. The first three cited sets of tables were prepared in analogy to the SRIM library, using a rather limited number of available experimental data for extrapolation based on an approach that I showed to have no justification. This main subject of my paper is not repeated here. The fourth reference, ICRU Report 73 [1], is completely different in character. Its first part comprises a very valuable overview of stopping power theory and of methods to measure energy losses. The second part contains electronic stopping cross sections predicted by the binary-collision code PASS developed by Sigmund and Schinner [2]. The early application of the code [1] was limited to projectile atomic numbers Z1 ⩽ 18 (Ar) and reduced energies E/M1 ⩾ 25 keV/u. More recent work suggests that there is room for a wider range of applications of the PASS code [3].

  9. An Integrated Framework for Human-Robot Collaborative Manipulation.

    PubMed

    Sheng, Weihua; Thobbi, Anand; Gu, Ye

    2015-10-01

    This paper presents an integrated learning framework that enables humanoid robots to perform human-robot collaborative manipulation tasks. Specifically, a table-lifting task performed jointly by a human and a humanoid robot is chosen for validation purposes. The proposed framework is split into two phases: 1) phase I-learning to grasp the table and 2) phase II-learning to perform the manipulation task. An imitation learning approach is proposed for phase I. In phase II, the behavior of the robot is controlled by a combination of two types of controllers: 1) reactive and 2) proactive. The reactive controller lets the robot take a reactive control action to keep the table horizontal. The proactive controller lets the robot take proactive actions based on human motion prediction. A measure of confidence in the prediction is also generated by the motion predictor, and this confidence measure determines the leader/follower behavior of the robot. Hence, the robot can autonomously switch between the behaviors during the task. Finally, the performance of the human-robot team carrying out the collaborative manipulation task is experimentally evaluated on a platform consisting of a Nao humanoid robot and a Vicon motion capture system. Results show that the proposed framework can enable the robot to carry out the collaborative manipulation task successfully.

  10. Carbon cycling responses to a water table drawdown and decadal vegetation changes in a bog

    NASA Astrophysics Data System (ADS)

    Talbot, J.; Roulet, N. T.

    2009-12-01

    The quantity of carbon stored in peat depends on the imbalance between production and decomposition of organic matter. This imbalance is mainly controlled by the wetness of the peatland, usually described by the water table depth. However, long-term processes resulting from hydrological changes, such as vegetation succession, also play a major role in the biogeochemistry of peatlands. Previous studies have looked at the impact of a water table lowering on carbon fluxes in different types of peatlands. However, most of these studies were conducted within a time frame that did not allow the examination of vegetation changes due to the water table lowering. We conducted a study along a drainage gradient resulting from the digging of a drainage ditch 85 years ago in a portion of the Mer Bleue bog, located near Ottawa, Canada. According to water table reconstructions based on testate amoebae, the drainage dropped the water table by approximately 18 cm. On the upslope side of the ditch, the water table partly recovered and the vegetation changed only marginally. However, on the downslope side of the ditch, the water table stayed persistently lower and trees became established (Larix and Betula). The importance of Sphagnum decreased with a lower water table, and evergreen shrubs were replaced by deciduous shrubs. The water table drop and subsequent vegetation changes had combined and individual effects on the carbon functioning of the peatland. Methane fluxes decreased because of the water table lowering, but were not affected by vegetation changes, whereas respiration and net ecosystem productivity were affected by both. The carbon storage of the system increased because of an increase in plant biomass, but the long-term carbon storage as peat decreased. The inclusion of the feedback effect that vegetation has on the carbon functioning of a peatland when a disturbance occurs is crucial to simulate the long-term carbon balance of this ecosystem.

  11. Gender Differences in Coping among Elite Table Tennis Players

    ERIC Educational Resources Information Center

    Kirimoglu, Huseyin

    2011-01-01

    The current study aims to investigate the explanatory power of social support and coping in relation to a competitive sport event among male and female table tennis players. 246 university student table tennis players (120 men and 126 women) from different regions and parts of Turkey were invited to participate in a survey study that included the…

  12. Online Periodic Table: A Cautionary Note

    ERIC Educational Resources Information Center

    Izci, Kemal; Barrow, Lloyd H.; Thornhill, Erica

    2013-01-01

    The purpose of this study was (a) to evaluate ten online periodic table sources for their accuracy and (b) to compare the types of information and links provided to users. Few studies on online periodic tables have been reported (Diener and Moore 2011; Slocum and Moore in "J Chem Educ" 86(10):1167, 2009). Chemistry students'…

  13. Biomedical and Behavioral Research Scientists: Their Training and Supply. Volume 2: Statistical Tables.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Office of Scientific and Engineering Personnel.

    Volume Two of a three volume set of the Biomedical and Behavioral Research Scientists study presents tables of data which were required for the study's development by the National Research Council. Data from these tables were obtained from the Association of American Medical Colleges, the American Dental Association, the American Medical…

  14. Dietary intake of artificial sweeteners by the Belgian population.

    PubMed

    Huvaere, Kevin; Vandevijvere, Stefanie; Hasni, Moez; Vinkx, Christine; Van Loco, Joris

    2012-01-01

    This study investigated whether the Belgian population older than 15 years is at risk of exceeding ADI levels for acesulfame-K, saccharin, cyclamate, aspartame and sucralose through an assessment of usual dietary intake of artificial sweeteners and specific consumption of table-top sweeteners. A conservative Tier 2 approach, for which an extensive label survey was performed, showed that mean usual intake was significantly lower than the respective ADIs for all sweeteners. Even consumers with high intakes were not exposed to excessive levels, as relative intakes at the 95th percentile (p95) were 31% for acesulfame-K, 13% for aspartame, 30% for cyclamate, 17% for saccharin, and 16% for sucralose of the respective ADIs. Assessment of intake using a Tier 3 approach was preceded by optimisation and validation of an analytical method based on liquid chromatography with mass spectrometric detection. Concentrations of sweeteners in various food matrices and table-top sweeteners were determined and mean positive concentration values were included in the Tier 3 approach, leading to relative intakes at p95 of 17% for acesulfame-K, 5% for aspartame, 25% for cyclamate, 11% for saccharin, and 7% for sucralose of the corresponding ADIs. The contribution of table-top sweeteners to the total usual intake (<1% of ADI) was negligible. A comparison of observed intake for the total population with intake for diabetics (acesulfame-K: 3.55 versus 3.75; aspartame: 6.77 versus 6.53; cyclamate: 1.97 versus 2.06; saccharin: 1.14 versus 0.97; sucralose: 3.08 versus 3.03, expressed as mg kg(-1) bodyweight day(-1) at p95) showed that the latter group was not exposed to higher levels. It was concluded that the Belgian population is not at risk of exceeding the established ADIs for sweeteners.
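The exposure assessment reduces to expressing a usual intake as a percentage of the ADI. A minimal sketch of that calculation, using the p95 intakes quoted in the abstract but with illustrative ADI values assumed by me (the abstract does not list them, so the resulting percentages will not match the paper's exactly):

```python
# Hypothetical ADIs, mg per kg bodyweight per day (assumed, not from the paper).
ADI_MG_PER_KG_BW = {
    "acesulfame-K": 9.0,
    "aspartame": 40.0,
    "cyclamate": 7.0,
    "saccharin": 5.0,
    "sucralose": 15.0,
}

def percent_of_adi(intake_mg_per_kg_bw: float, sweetener: str) -> float:
    """Express a usual intake as a percentage of the sweetener's ADI."""
    return 100.0 * intake_mg_per_kg_bw / ADI_MG_PER_KG_BW[sweetener]

# p95 intakes for the total population, mg/kg bw/day (from the abstract).
p95_total = {"acesulfame-K": 3.55, "aspartame": 6.77, "cyclamate": 1.97,
             "saccharin": 1.14, "sucralose": 3.08}
for sweetener, intake in p95_total.items():
    print(f"{sweetener}: {percent_of_adi(intake, sweetener):.0f}% of ADI")
```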

  15. Comparison of the Lund and Browder table to computed tomography scan three-dimensional surface area measurement for a pediatric cohort.

    PubMed

    Rumpf, R Wolfgang; Stewart, William C L; Martinez, Stephen K; Gerrard, Chandra Y; Adolphi, Natalie L; Thakkar, Rajan; Coleman, Alan; Rajab, Adrian; Ray, William C; Fabia, Renata

    2018-01-01

    Treating burns effectively requires accurately assessing the percentage of the total body surface area (%TBSA) affected by burns. Current methods for estimating %TBSA, such as Lund and Browder (L&B) tables, rely on historic body statistics. An increasingly obese population has been blamed for increasing errors in %TBSA estimates. However, this assumption has not been experimentally validated. We hypothesized that errors in %TBSA estimates using L&B were due to differences in the physical proportions of today's children compared with children in the early 1940s when the chart was developed and that these differences would appear as body mass index (BMI)-associated systematic errors in the L&B values versus actual body surface areas. We measured the TBSA of human pediatric cadavers using computed tomography scans. Subjects ranged from 9 mo to 15 y in age. We chose outliers of the BMI distribution (from the 31st percentile at the low through the 99th percentile at the high). We examined surface area proportions corresponding to L&B regions. Measured regional proportions based on computed tomography scans were in reasonable agreement with L&B, even with subjects in the tails of the BMI range. The largest deviation was 3.4%, significantly less than the error seen in real-world %TBSA estimates. While today's population is more obese than those studied by L&B, their body region proportions scale surprisingly well. The primary error in %TBSA estimation is not due to changing physical proportions of today's children and may instead lie in the application of the L&B table. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Seismoelectric couplings in a poroelastic material containing two immiscible fluid phases

    NASA Astrophysics Data System (ADS)

    Jardani, A.; Revil, A.

    2015-08-01

    A new approach to seismoelectric imaging has recently been proposed in which seismic waves are focused in the subsurface to scan its heterogeneous nature and detect saturation fronts. Such imaging, however, requires complete modelling of the seismoelectric properties of porous media saturated by two immiscible fluid phases, one of which is usually electrically insulating (for instance water and oil). We combine an extension of Biot dynamic theory, valid for porous media containing two immiscible Newtonian fluids, with an extension of the electrokinetic theory based on the notion of effective volumetric charge densities dragged by the flow of each fluid phase. These effective charge densities can be related directly to the permeability and saturation of each fluid phase. The coupled partial differential equations are solved with the finite element method. We also derive analytically the transfer function connecting the macroscopic electrical field to the acceleration of the fast P wave (the coseismic electrical field) and study the influence of the water content on this coupling. We observe that the amplitude of the coseismic electrical disturbance is very sensitive to the water content, increasing with water saturation. We also investigate the seismoelectric conversions (interface effect) occurring at the water table. We show that the conversion response at the water table is identifiable only when the saturation contrast between the vadose and saturated zones is sharp enough. A relatively dry vadose zone provides the best condition for identifying the water table through seismoelectric measurements, because in this case the coseismic electrical disturbances are vanishingly small compared to the seismoelectric interface response.

  17. Minimalistic models of the vertical distribution of roots under stochastic hydrological forcing

    NASA Astrophysics Data System (ADS)

    Laio, Francesco

    2014-05-01

    The assessment of the vertical root profile can be useful for multiple purposes: the partition of water fluxes between evaporation and transpiration, the evaluation of root soil reinforcement for bioengineering applications, the influence of roots on biogeochemical and microbial processes in the soil, etc. In water-controlled ecosystems the shape of the root profile is mainly determined by the soil moisture availability at different depths. The long-term soil water balance in the root zone can be assessed by modeling the stochastic incoming and outgoing water fluxes, influenced by the stochastic rainfall pulses and/or by the water table fluctuations. Through an ecohydrological analysis one obtains that in water-controlled ecosystems the vertical root distribution is a decreasing function of depth, whose parameters depend on pedologic and climatic factors. The model can be extended to suitably account for the influence of water table fluctuations, when the water table is shallow enough to affect root development, in which case the vertical root distribution tends to assume a non-monotonic form. In order to evaluate the validity of the ecohydrological estimation of the root profile, we tested it on a case study in the north of Tuscany (Italy). We analyzed data from 17 landslide-prone sites: at each of these sites we assessed the pedologic and climatic descriptors necessary to apply the model, and we measured the mean rooting depth. The results show quite good agreement between observed and modeled mean root depths. The merit of this minimalistic approach to modeling the vertical root distribution lies in the fact that it allows a quantitative estimation of the main features of the vertical root distribution without resorting to time-consuming and costly measurement surveys.

  18. Ecology of testate amoebae in an Amazonian peatland and development of a transfer function for palaeohydrological reconstruction.

    PubMed

    Swindles, Graeme T; Reczuga, Monika; Lamentowicz, Mariusz; Raby, Cassandra L; Turner, T Edward; Charman, Dan J; Gallego-Sala, Angela; Valderrama, Elvis; Williams, Christopher; Draper, Frederick; Honorio Coronado, Euridice N; Roucoux, Katherine H; Baker, Tim; Mullan, Donal J

    2014-08-01

    Tropical peatlands represent globally important carbon sinks with a unique biodiversity and are currently threatened by climate change and human activities. It is now imperative that proxy methods are developed to understand the ecohydrological dynamics of these systems and for testing peatland development models. Testate amoebae have been used as environmental indicators in ecological and palaeoecological studies of peatlands, primarily in ombrotrophic Sphagnum-dominated peatlands in the mid- and high-latitudes. We present the first ecological analysis of testate amoebae in a tropical peatland, a nutrient-poor domed bog in western (Peruvian) Amazonia. Litter samples were collected from different hydrological microforms (hummock to pool) along a transect from the edge to the interior of the peatland. We recorded 47 taxa from 21 genera. The most common taxa are Cryptodifflugia oviformis, Euglypha rotunda type, Phryganella acropodia, Pseudodifflugia fulva type and Trinema lineare. One species found only in the southern hemisphere, Argynnia spicata, is present. Arcella spp., Centropyxis aculeata and Lesquereusia spiralis are indicators of pools containing standing water. Canonical correspondence analysis and non-metric multidimensional scaling illustrate that water table depth is a significant control on the distribution of testate amoebae, similar to the results from mid- and high-latitude peatlands. A transfer function model for water table based on weighted averaging partial least-squares (WAPLS) regression is presented and performs well under cross-validation (r²(apparent) = 0.76, RMSE = 4.29; r²(jack) = 0.68, RMSEP = 5.18). The transfer function was applied to a 1-m peat core, and sample-specific reconstruction errors were generated using bootstrapping. The reconstruction generally suggests near-surface water tables over the last 3,000 years, with a shift to drier conditions at c. cal. 1218-1273 AD.
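The weighted-averaging idea underlying a transfer function like this one can be sketched briefly. Note this is plain weighted averaging (WA), a simpler relative of the WAPLS method the study actually used, and the training data below are simulated, not the Amazonian dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 30
true_optima = np.array([2.0, 8.0, 15.0, 25.0, 40.0])  # cm below surface
depths = rng.uniform(0, 45, n_samples)                # observed water tables

# Simulated community data: each taxon is most abundant near its optimum.
abund = np.exp(-0.5 * ((depths[:, None] - true_optima[None, :]) / 8.0) ** 2)
abund /= abund.sum(axis=1, keepdims=True)             # relative abundances

# WA regression: estimate each taxon's optimum from the training set
# as the abundance-weighted mean of the observed water table depths.
optima_hat = (abund * depths[:, None]).sum(axis=0) / abund.sum(axis=0)

# WA calibration: reconstruct the water table for a sample as the
# abundance-weighted mean of the taxon optima.
recon = abund @ optima_hat

r2 = np.corrcoef(depths, recon)[0, 1] ** 2
print(f"apparent r^2 = {r2:.2f}")
```

The "apparent" r² computed on the training set is optimistic, which is why the study also reports cross-validated (jackknife) statistics.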

  19. Validation of administrative data used for the diagnosis of upper gastrointestinal events following nonsteroidal anti-inflammatory drug prescription.

    PubMed

    Abraham, N S; Cohen, D C; Rivers, B; Richardson, P

    2006-07-15

    To validate Veterans Affairs (VA) administrative data for the diagnosis of nonsteroidal anti-inflammatory drug (NSAID)-related upper gastrointestinal events (UGIE) and to develop a diagnostic algorithm. A retrospective study of veterans prescribed an NSAID, as identified from the national pharmacy database merged with in-patient and out-patient data, followed by primary chart abstraction. Contingency tables were constructed to allow comparison with a random sample of patients prescribed an NSAID, but without UGIE. Multivariable logistic regression analysis was used to derive a predictive algorithm. Once derived, the algorithm was validated in a separate cohort of veterans. Of 906 patients, 606 had a diagnostic code for UGIE; 300 were a random subsample of 11 744 patients (control). Only 161 had a confirmed UGIE. The positive predictive value (PPV) of diagnostic codes was poor, but improved from 27% to 51% with the addition of endoscopic procedural codes. The strongest predictors of UGIE were an in-patient ICD-9 code for gastric ulcer, duodenal ulcer and haemorrhage combined with upper endoscopy. This algorithm had a PPV of 73% when limited to patients ≥65 years (c-statistic 0.79). Validation of the algorithm revealed a PPV of 80% among patients with an overlapping NSAID prescription. NSAID-related UGIE can be assessed using VA administrative data. The optimal algorithm includes an in-patient ICD-9 code for gastric or duodenal ulcer and gastrointestinal bleeding combined with a procedural code for upper endoscopy.
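The headline PPV figure follows directly from the chart-review counts. A minimal sketch using the numbers in the abstract (161 confirmed events among the 606 patients carrying a diagnostic code):

```python
def ppv(true_positives: int, flagged: int) -> float:
    """Positive predictive value: fraction of algorithm-flagged cases
    that were confirmed as true events on chart review."""
    return true_positives / flagged

# 161 confirmed UGIE among 606 coded patients -> the "poor" 27% PPV
# reported for diagnostic codes alone.
print(f"code-only PPV: {ppv(161, 606):.0%}")
```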

  20. Analytical representation for ephemeris with short time-span - Application to the longitude of Titan

    NASA Astrophysics Data System (ADS)

    XI, Xiaojin; Vienne, Alain

    2017-06-01

    Ephemerides of the natural satellites are generally presented in the form of tables, or computed on line, as for example the best ones from JPL or IMCCE. Although analytical representations are valid over a very long time-span, they are less accurate than numerical ephemerides fitted to the most recent and best observations. In some analytical studies, however, it would be beneficial to have both advantages. We present here the case of the study of the rotation of Titan, for which we need a representation of the true longitude of Titan. Frequency analysis can be applied only partially to the numerical ephemerides because of their limited time-span. To complete it, we use the form of the analytical representation to obtain its numerical parameters. The method is presented and some results are given.

  1. Proton affinities of maingroup-element hydrides and noble gases: trends across the periodic table, structural effects, and DFT validation.

    PubMed

    Swart, Marcel; Rösler, Ernst; Bickelhaupt, F Matthias

    2006-10-01

    We have carried out an extensive exploration of the gas-phase basicity of archetypal neutral bases across the periodic system using the generalized gradient approximation (GGA) of the density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. First, we validate DFT as a reliable tool for computing proton affinities and related thermochemical quantities: BP86/QZ4P//BP86/TZ2P is shown to yield a mean absolute deviation of 2.0 kcal/mol for the proton affinity at 298 K with respect to experiment, and 1.2 kcal/mol with high-level ab initio benchmark data. The main purpose of this work is to provide the proton affinities (and corresponding entropies) at 298 K of the neutral bases constituted by all maingroup-element hydrides of groups 15-17 and the noble gases, that is, group 18, and periods 1-6. We have also studied the effect of step-wise methylation of the protophilic center of the second- and third-period bases. Copyright 2006 Wiley Periodicals, Inc.

  2. Cloth-covered chiropractic treatment tables as a source of allergens and pathogenic microbes.

    PubMed

    Evans, Marion W; Campbell, Alan; Husbands, Chris; Breshears, Jennell; Ndetan, Harrison; Rupert, Ronald

    2008-03-01

    Vinyl chiropractic tables have been found to harbor pathogenic bacteria, but wiping them with a simple disinfecting agent can significantly reduce bacterial contamination. The aim of this study was to assess the presence of microbes and other allergens or pathogens on cloth-covered chiropractic tables. Cloth-covered tables in a chiropractic college teaching clinic were selected. Samples were taken from the face piece and hand rests with RODAC plates containing nutrient agar, followed by confirmatory testing when indicated. Numerous bacterial strains were found, including Staphylococcus aureus and Propionibacterium. Allergen-producing molds, including Candida, were also found. Cloth-covered tables were shown to harbor pathogenic bacteria and allergens. The chiropractic profession should establish an infection control protocol relevant to treatment tables and discontinue the use of cloth-covered treatment tables in this process.

  3. Imaging Quaternary glacial deposits and basement topography using the transient electromagnetic method for modeling aquifer environments

    NASA Astrophysics Data System (ADS)

    Simard, Patrick Tremblay; Chesnaux, Romain; Rouleau, Alain; Daigneault, Réal; Cousineau, Pierre A.; Roy, Denis W.; Lambert, Mélanie; Poirier, Brigitte; Poignant-Molina, Léo

    2015-08-01

    Aquifer formations along the northern shore of the Saint-Lawrence River in Quebec (Canada) mainly consist of glacial and coastal deposits of variable thickness overlying Precambrian bedrock. These deposits are important because they provide the main water supply for many communities. As part of a continuing project aimed at developing an inventory of the groundwater resources in the Charlevoix and Haute-Côte-Nord (CHCN) regions of the province of Quebec in Canada, the central loop transient electromagnetic (TEM) method was used to map the principal hydrogeological environments in these regions. One-dimensional smooth inversion models of the TEM soundings were used to construct two-dimensional electrical resistivity sections, which provided images for hydrogeological validation. Electrical contour lines of aquifer environments were compared against available well logs and Quaternary surface maps in order to interpret the TEM soundings. A calibration table was compiled to represent common deposits and basement types, and was then applied throughout the CHCN region. This paper presents three case studies: one at the Forestville site, one at the Les Escoumins site and one at the Saint-Urbain site. These sites were selected as targets for geophysical surveys because of the general lack of local direct hydrogeological data for them.

  4. Installation Restoration Program Stage 3. Remedial Investigation/ Feasibility Study, Elmendorf AFB, Alaska. Volume 1. Sections 1-4, Text

    DTIC Science & Technology

    1990-05-01

    [Table-of-contents excerpt:] Dissolved inorganic constituents in groundwater (p. 4-16); Table 4.1.8, Cancer-causing potential of contamination detected at Elmendorf AFB (p. 4-18); Table 4.1.9…; Table 5-34, Example calculation of excess lifetime cancer risk due to exposure (p. 5-142); Table 5-35, Contamination and risks at Site IS-1 (p. 5-145); Table 5-36… [Text excerpt:] …sand. They occur on low terraces and in broad depressions. Slopes range from zero to 7 percent. The Purches series consists of loamy-skeletal, mixed…

  5. Use of 3×2 tables with an intention to diagnose approach to assess clinical performance of diagnostic tests: meta-analytical evaluation of coronary CT angiography studies

    PubMed Central

    Schuetz, Georg M; Schlattmann, Peter

    2012-01-01

    Objective To determine whether a 3×2 table, using an intention to diagnose approach, is better than the "classic" 2×2 table at handling transparent reporting and non-evaluable results when assessing the accuracy of a diagnostic test. Design Based on a systematic search for diagnostic accuracy studies of coronary computed tomography (CT) angiography, full texts of relevant studies were evaluated to determine whether an alternative 3×2 table could be calculated. To quantify an overall effect, we pooled diagnostic accuracy values according to a meta-analytical approach. Data sources Medline (via PubMed), Embase (via Ovid), and ISI Web of Science electronic databases. Eligibility criteria Prospective English or German language studies comparing coronary CT with conventional coronary angiography in all patients and providing sufficient data for a patient-level analysis. Results 120 studies (10 287 patients) were eligible. Studies varied greatly in their approaches to handling non-evaluable findings. We found 26 studies (including 2298 patients) that allowed us to calculate both 2×2 tables and 3×2 tables. Using a bivariate random effects model, we compared the 2×2 table with the 3×2 table, and found significant differences for pooled sensitivity (98.2 (95% confidence interval 96.7 to 99.1) v 92.7 (88.5 to 95.3)), area under the curve (0.99 (0.98 to 1.00) v 0.93 (0.91 to 0.95)), positive likelihood ratio (9.1 (6.2 to 13.3) v 4.4 (3.3 to 6.0)), and negative likelihood ratio (0.02 (0.01 to 0.04) v 0.09 (0.06 to 0.15); (P<0.05)). Conclusion Parameters of diagnostic performance significantly decrease if non-evaluable results are included in the analysis via a 3×2 table (intention to diagnose approach). This approach provides a more realistic picture of the clinical potential of diagnostic tests. PMID:23097549
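The direction of the effect can be sketched with hypothetical counts: under the intention-to-diagnose approach, diseased patients with non-evaluable scans stay in the denominator instead of being discarded, so sensitivity can only fall. The counts below are illustrative, not from the meta-analysis:

```python
def sensitivity_2x2(tp: int, fn: int) -> float:
    """Classic 2x2 table: non-evaluable results are excluded first."""
    return tp / (tp + fn)

def sensitivity_3x2(tp: int, fn: int, nonevaluable_diseased: int) -> float:
    """Intention to diagnose (3x2 table): non-evaluable diseased patients
    enter the denominator as not correctly identified."""
    return tp / (tp + fn + nonevaluable_diseased)

tp, fn, ne = 95, 2, 6   # hypothetical patient-level counts
print(f"2x2 sensitivity: {sensitivity_2x2(tp, fn):.3f}")
print(f"3x2 sensitivity: {sensitivity_3x2(tp, fn, ne):.3f}")
```

The same denominator shift explains why the pooled sensitivity drops from 98.2% to 92.7% when the 3×2 analysis is used.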

  6. An Army Illumination Model (AAIM)

    DTIC Science & Technology

    2008-11-01

    digitized for each of the three light types and coded into the model. [Figure 3 shows the spectra of the light sources in table 2, e.g. clear mercury.] Ei = Σ (λ = 300 to 1100) eiλ Δλ, (16) where eiλ is the radiant energy at a specific wavelength taken from the digitized spectra and Δλ is the bin width…to a factor less than 2. Validation was also done via comparison with results from Garstang (6). His figures 2 and 3 were digitized and compared

  7. Handwashing Practices among Hospital Patients: Knowledge and Perception of Ambulatory Patients and Nursing Personnel

    DTIC Science & Technology

    1989-01-01

    …depersonalized, and answers were indicated on a 5-point Likert scale; responses were considered objective. Validity and reliability were established for these…Table 7, Characteristics of Subsets of Nursing Personnel Participants, tabulates educational attainment (some college, undergraduate degree, graduate degree)…

  8. Identification of miRNA Signatures Associated with Epithelial Ovarian Cancer Chemoresistance with Further Biological and Functional Validation of Identified Key miRNAs

    DTIC Science & Technology

    2012-08-01

    separated on 12% SDS-PAGE gels and transferred to nitrocellulose membranes. After blocking with 5% non-fat milk (Labscientific, Inc) in TBS-Tween buffer…Raw mass spectrometric data were processed and analyzed for variations in the spectral counts of peptides between sample sets, and bioinformatics was…accomplished using Ingenuity Pathways Analysis (IPA). Results: The total numbers of proteins and peptides identified are listed in the table

  9. Validation Testing and Numerical Modeling of Advanced Armor Materials

    DTIC Science & Technology

    2012-11-01

    Rigid anvil: A ground steel high-hard plate was placed in front of a massive block of steel that was anchored to the gun table. After every shot, the…photovoltaic sensors; a 15.24-cm RHA steel block with a 1.27-cm thick high-hard steel anvil plate. The following instruments/equipment were used to perform…Range 167 was prepared using the 2.77-cm bore gas-operated gun, an RHA anvil block with a high-hard steel faceplate that was surface polished, and a

  10. Algorithms for Processing and Analysis of Ocean Color Satellite Data for Coastal Case 2 Waters. Chapter 16

    NASA Technical Reports Server (NTRS)

    Stumpf, Richard P.; Arnone, Robert A.; Gould, Richard W., Jr.; Ransibrahmanakul, Varis; Tester, Patricia A.

    2003-01-01

    SeaWiFS has the ability to enhance our understanding of many oceanographic processes. However, its utility in the coastal zone has been limited by the lack of valid bio-optical algorithms and by the difficulty of determining accurate water reflectances, particularly in the blue bands (412-490 nm), which have a significant impact on the effectiveness of all bio-optical algorithms. We have made advances in three areas: algorithm development (Table 16.1), field data collection, and data applications.

  11. Digital Systems Validation Handbook. Volume 2

    DTIC Science & Technology

    1989-02-01

    Table 7.2-3, Failure rates for major RDFCS components (unit failure rate): Pitch Angle Gyro, 303; Roll Angle Gyro, 303; Yaw Rate Gyro, 200…Airplane weight 314,500 lb; altitude 35 ft; angle of attack 10.91°; indicated air speed 168 kts; flap deployment 22°. Transition capability was added to go…various pieces of information into the form needed by the FCCs. For example, roll angle and pitch angle are converted to three-wire AC signals, properly

  12. Light Absorbing Particle (LAP) Measurements in the Lower Stratosphere

    NASA Technical Reports Server (NTRS)

    Baumgardner, D.; Raga, G. B.; Anderson, B.; Diskin, G.; Sachse, G.; Kok, G.

    2003-01-01

    This viewgraph presentation covers the capabilities and design of the Single Particle Soot Photometer (SP-2), and reviews its role on the Sage III Ozone Loss Validation Experiment (SOLVE II) field campaign during 2003. On SOLVE II the SP-2 was carried into the Arctic onboard a DC-8 aircraft, in order to determine the size distribution of light-absorbing and non light-absorbing particles in the stratosphere. Graphs and tables relate some of the results from SOLVE II.

  13. Identification and Analysis of Labor Productivity Components Based on ACHIEVE Model (Case Study: Staff of Kermanshah University of Medical Sciences)

    PubMed Central

    Ziapour, Arash; Khatony, Alireza; Kianipour, Neda; Jafary, Faranak

    2015-01-01

    Identification and analysis of the components of labor productivity based on the ACHIEVE model was performed among employees in different parts of Kermanshah University of Medical Sciences in 2014. This was a descriptive correlational study in which 270 working personnel from different administrative groups (contractual, fixed-term and regular) were selected from the university's 872 employees through stratified random sampling based on the Krejcie and Morgan sampling table. The survey tool was the ACHIEVE labor productivity questionnaire. The questionnaires were confirmed in terms of content and face validity, and their reliability was calculated using Cronbach’s alpha coefficient. The data were analyzed in SPSS-18 using descriptive and inferential statistics. The mean scores for the employees' labor productivity dimensions, namely environment (environmental fit), evaluation (training and performance feedback), validity (valid and legal exercise of personnel), incentive (motivation or desire), help (organizational support), clarity (role perception or understanding) and ability (knowledge and skills), and for total labor productivity were 4.10±0.630, 3.99±0.568, 3.97±0.607, 3.76±0.701, 3.63±0.746, 3.59±0.777, 3.49±0.882 and 26.54±4.347, respectively. The results indicated that the seven factors of environment, performance assessment, validity, motivation, organizational support, clarity, and ability were effective in increasing labor productivity. The analysis of the current status of university staff suggested that the two factors of environment and evaluation, which in the staff's view had the greatest impact on labor productivity, were in a favorable condition and deserve further attention from the authorities. PMID:25560364

  14. Identification and analysis of labor productivity components based on ACHIEVE model (case study: staff of Kermanshah University of Medical Sciences).

    PubMed

    Ziapour, Arash; Khatony, Alireza; Kianipour, Neda; Jafary, Faranak

    2014-12-15

    Identification and analysis of the components of labor productivity based on the ACHIEVE model was performed among employees in different parts of Kermanshah University of Medical Sciences in 2014. This was a descriptive correlational study in which 270 working personnel from different administrative groups (contractual, fixed-term and regular) were selected from the university's 872 employees through stratified random sampling based on the Krejcie and Morgan sampling table. The survey tool was the ACHIEVE labor productivity questionnaire. The questionnaires were confirmed in terms of content and face validity, and their reliability was calculated using Cronbach's alpha coefficient. The data were analyzed in SPSS-18 using descriptive and inferential statistics. The mean scores for the employees' labor productivity dimensions, namely environment (environmental fit), evaluation (training and performance feedback), validity (valid and legal exercise of personnel), incentive (motivation or desire), help (organizational support), clarity (role perception or understanding) and ability (knowledge and skills), and for total labor productivity were 4.10±0.630, 3.99±0.568, 3.97±0.607, 3.76±0.701, 3.63±0.746, 3.59±0.777, 3.49±0.882 and 26.54±4.347, respectively. The results indicated that the seven factors of environment, performance assessment, validity, motivation, organizational support, clarity, and ability were effective in increasing labor productivity. The analysis of the current status of university staff suggested that the two factors of environment and evaluation, which in the staff's view had the greatest impact on labor productivity, were in a favorable condition and deserve further attention from the authorities.
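
    Since both versions of this record rest on Cronbach's alpha for reliability, a brief illustration may help. This is a hedged sketch of the standard formula, not the authors' code, and the example item matrix is hypothetical:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 3 respondents x 2 perfectly consistent items
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

    Perfectly correlated items give alpha = 1; values above roughly 0.7-0.8 are conventionally read as acceptable internal consistency, in line with the coefficients reported here.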

  15. Formal Logic and Flowchart for Diagnosis Validity Verification and Inclusion in Clinical Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Sosa, M.; Grundel, L.; Simini, F.

    2016-04-01

    Logical reasoning is part of medical practice since its origins. Modern Medicine has included information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in Medical School prior to Clinical Internship, to foster medical practice. Two simple examples (Acute Myocardial Infarction and Diabetes Mellitus) are given in terms of formal logic expression and truth tables. Flowcharts of both diagnostic processes help understand the procedures and to validate them logically. The particularity of medical information is that it is often accompanied by “missing data” which suggests to adapt formal logic to a “three state” logic in the future. Medical Education must include formal logic to understand complex protocols and best practices, prone to mutual interactions.

  16. Cloth-covered chiropractic treatment tables as a source of allergens and pathogenic microbes☆

    PubMed Central

    Evans, Marion W.; Campbell, Alan; Husbands, Chris; Breshears, Jennell; Ndetan, Harrison; Rupert, Ronald

    2008-01-01

    Abstract Objective Vinyl chiropractic tables have been found to harbor pathogenic bacteria, but wiping with a simple disinfecting agent can significantly reduce bacterial contamination. The aim of this study was to assess the presence of microbes and other allergens or pathogens on cloth chiropractic tables. Methods Cloth-covered tables in a chiropractic college teaching clinic were selected. Samples were taken from the facial piece and hand rests with RODAC plates containing nutrient agar, followed by confirmatory testing when indicated. Results Numerous bacterial strains were found, including Staphylococcus aureus and Propionibacterium. Allergen-producing molds, including Candida, were also found. Conclusion Cloth tables were shown to harbor pathogenic bacteria and allergens. The chiropractic profession should establish an infection control protocol relevant to treatment tables and, as part of it, discard cloth-covered treatment tables. PMID:19674718

  17. Contingency Table Browser - prediction of early stage protein structure.

    PubMed

    Kalinowska, Barbara; Krzykalski, Artur; Roterman, Irena

    2015-01-01

    The Early Stage (ES) intermediate represents the starting structure in protein folding simulations based on the Fuzzy Oil Drop (FOD) model. The accuracy of FOD predictions is greatly dependent on the accuracy of the chosen intermediate. A suitable intermediate can be constructed using the sequence-structure relationship information contained in the so-called contingency table, which expresses the likelihood of encountering various structural motifs for each tetrapeptide fragment in the amino acid sequence. The limited accuracy with which such structures could previously be predicted provided the motivation for a more in-depth study of the contingency table itself. The Contingency Table Browser is a tool which can visualize, search and analyze the table. Our work presents possible applications of the Contingency Table Browser, among them the analysis of specific protein sequences from the point of view of their structural ambiguity.

  18. The Associations Between Clerkship Objective Structured Clinical Examination (OSCE) Grades and Subsequent Performance.

    PubMed

    Dong, Ting; Zahn, Christopher; Saguil, Aaron; Swygert, Kimberly A; Yoon, Michelle; Servey, Jessica; Durning, Steven

    2017-01-01

    Construct: We investigated the extent of the associations between medical students' clinical competency, measured by performance in Objective Structured Clinical Examinations (OSCEs) during Obstetrics/Gynecology and Family Medicine clerkships, and later performance in both undergraduate and graduate medical education. There is a relative dearth of studies on the correlations between undergraduate OSCE scores and future exam performance within either undergraduate or graduate medical education, and almost none linking these simulated encounters to eventual patient care. The studies that do correlate clerkship OSCE scores with future performance often have small sample sizes and/or include only one clerkship. Students in USU graduating classes of 2007 through 2011 participated in the study. We investigated correlations of clerkship OSCE grades with United States Medical Licensing Examination Step 2 Clinical Knowledge, Step 2 Clinical Skills, and Step 3 exam scores, as well as Postgraduate Year 1 program director's evaluation scores on Medical Expertise and Professionalism. We also conducted contingency table analysis to examine the associations of poor performance on clerkship OSCEs with failing Step 3 and with receiving poor program director ratings. The correlation coefficients between the clerkship OSCE grades and the outcomes were weak. The strongest correlations existed between the clerkship OSCE grades and the Step 2 CS Integrated Clinical Encounter component score, Step 2 Clinical Skills, and Step 3 scores. Contingency table associations between poor performance on both clerkships' OSCEs and poor Postgraduate Year 1 program director ratings were significant. The results of this study provide additional but limited validity evidence for the use of OSCEs during clinical clerkships, given their associations with subsequent performance measures.
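
    The contingency-table analysis mentioned above pairs poor clerkship OSCE performance with later outcomes. As a hedged sketch of how such an association is typically tested (the counts below are hypothetical, not the study's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table:
# rows: poor vs adequate clerkship OSCE performance
# cols: poor vs adequate PGY-1 program-director rating
table = [[18, 12],
         [10, 40]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

    A small p-value indicates that poor OSCE performance and poor later ratings are associated, mirroring the significant contingency-table result the abstract reports.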

  19. Risk assessment of groundwater level variability using variable Kriging methods

    NASA Astrophysics Data System (ADS)

    Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2015-04-01

    Assessment of the spatial variability of the water table level in aquifers provides useful information for optimal groundwater management. This information becomes more important in basins where the water table has fallen significantly. The spatial variability of the water table level in this work is estimated from hydraulic heads measured during the wet period of the hydrological year 2007-2008 in a sparsely monitored basin in Crete, Greece, which is of high socioeconomic and agricultural interest. Three Kriging-based methodologies are implemented in the Matlab environment to estimate the spatial variability of the water table level in the basin. The first methodology is based on the Ordinary Kriging approach; the second incorporates auxiliary information from a Digital Elevation Model by means of Residual Kriging; and the third calculates, by means of Indicator Kriging, the probability of the groundwater level falling below a predefined minimum value that could cause significant problems in groundwater resources availability. The Box-Cox methodology is applied to normalize both the data and the residuals for improved prediction results. In addition, various classical variogram models are applied to determine the spatial dependence of the measurements. The Matérn model proves to be optimal and, in combination with the Kriging methodologies, provides the most accurate cross-validation estimates. Groundwater level and probability maps are constructed to examine the spatial variability of the groundwater level in the basin and the risk, at given locations, of falling below the predefined minimum value set for the sustainability of the basin's groundwater resources.
    Acknowledgement: The work presented in this paper has been funded by the Greek State Scholarships Foundation (IKY), Fellowships of Excellence for Postdoctoral Studies (Siemens Program), 'A simulation-optimization model for assessing the best practices for the protection of surface water and groundwater in the coastal zone' (2013-2015). References: Varouchakis, E. A. and D. T. Hristopulos (2013). "Improvement of groundwater level prediction in sparsely gauged basins using physical laws and local geographic features as auxiliary variables." Advances in Water Resources 52: 34-49. Kitanidis, P. K. (1997). Introduction to Geostatistics. Cambridge University Press.
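
    For readers unfamiliar with the interpolation step, ordinary Kriging predicts a value as a weighted sum of observations, with weights obtained from a covariance model plus a Lagrange multiplier enforcing unbiasedness. A minimal numpy sketch using a hypothetical exponential covariance (the study itself found the Matérn variogram optimal), with made-up well data:

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_range=10.0):
    # Simple exponential covariance model (illustrative choice)
    return sill * np.exp(-h / corr_range)

def ordinary_kriging(xy_obs, z_obs, xy_pred):
    """Ordinary Kriging predictions at xy_pred from observations z_obs."""
    n = len(z_obs)
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=2)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = exp_cov(d)
    K[:n, n] = K[n, :n] = 1.0   # unbiasedness: weights sum to one
    K[n, n] = 0.0
    preds = []
    for p in np.atleast_2d(xy_pred):
        k = np.append(exp_cov(np.linalg.norm(xy_obs - p, axis=1)), 1.0)
        w = np.linalg.solve(K, k)        # n weights + Lagrange multiplier
        preds.append(w[:n] @ z_obs)
    return np.array(preds)

wells = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])   # hypothetical wells
heads = np.array([12.0, 14.0, 11.0])                     # water table levels
print(ordinary_kriging(wells, heads, np.array([[2.0, 2.0]])))
```

    Kriging is an exact interpolator: predicting at an observed well reproduces its measured head, which is why leave-one-out cross-validation (predict each well from the others) is a natural accuracy check.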

  20. Game Analysis, Validation, and Potential Application of EyeToy Play and Play 2 to Upper-Extremity Rehabilitation

    PubMed Central

    Caldwell, Michelle; Dickerhoof, Erica; Hall, Anastasia; Odakura, Bryan; Fanchiang, Hsin-Chen

    2014-01-01

    Objective. To describe and analyze the potential use of games in the commercially available EyeToy Play and EyeToy Play 2 in terms of required/targeted training skills and the feedback provided, for clinical application. Methods. A summary table including all games was created. Two movement experts naïve to the software validated the required/targeted training skills and feedback for 10 randomly selected games. Ten healthy school-aged children played to further validate the required/targeted training skills. Results. All but two skills (muscular and cardiovascular endurance) had excellent agreement between experts, and there was 100% agreement on feedback. Children's performance on the required/targeted training skills (number of unilateral and bilateral reaches, speed, muscular endurance, and cardiovascular endurance) differed significantly between games (P < .05). Conclusion. EyeToy Play games could be used to train children's arm function. However, careful evaluation of individual games is needed, since performance may not be consistent across players or match therapists' interpretations. PMID:25610652

  1. 2005 AG20/20 Annual Review

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney D.

    2005-01-01

    Topics covered include: Implementation and Validation of Sensor-Based Site-Specific Crop Management; Enhanced Management of Agricultural Perennial Systems (EMAPS) Using GIS and Remote Sensing; Validation and Application of Geospatial Information for Early Identification of Stress in Wheat; Adapting and Validating Precision Technologies for Cotton Production in the Mid-Southern United States - 2004 Progress Report; Development of a System to Automatically Geo-Rectify Images; Economics of Precision Agriculture Technologies in Cotton Production-AG 2020 Prescription Farming Automation Algorithms; Field Testing a Sensor-Based Applicator for Nitrogen and Phosphorus Application; Early Detection of Citrus Diseases Using Machine Vision and DGPS; Remote Sensing of Citrus Tree Stress Levels and Factors; Spectral-based Nitrogen Sensing for Citrus; Characterization of Tree Canopies; In-field Sensing of Shallow Water Tables and Hydromorphic Soils with an Electromagnetic Induction Profiler; Maintaining the Competitiveness of Tree Fruit Production Through Precision Agriculture; Modeling and Visualizing Terrain and Remote Sensing Data for Research and Education in Precision Agriculture; Thematic Soil Mapping and Crop-Based Strategies for Site-Specific Management; and Crop-Based Strategies for Site-Specific Management.

  2. Towards Usable E-Health. A Systematic Review of Usability Questionnaires.

    PubMed

    Sousa, Vanessa E C; Dunn Lopez, Karen

    2017-05-10

    The use of e-health can lead to several positive outcomes. However, the potential for e-health to improve healthcare is partially dependent on its ease of use. In order to determine the usability of any technology, rigorously developed and appropriate measures must be chosen. To identify psychometrically tested questionnaires that measure usability of e-health tools, and to appraise their generalizability, attribute coverage, and quality. We conducted a systematic review of studies that measured usability of e-health tools using four databases (Scopus, PubMed, CINAHL, and HAPI). Non-primary research, studies that did not report measures, studies with children or people with cognitive limitations, and studies about assistive devices or medical equipment were systematically excluded. Two authors independently extracted information including: questionnaire name, number of questions, scoring method, item generation, and psychometrics using a data extraction tool with pre-established categories and a quality appraisal scoring table. Using a broad search strategy, 5,558 potentially relevant papers were identified. After removing duplicates and applying exclusion criteria, 35 articles remained that used 15 unique questionnaires. From the 15 questionnaires, only 5 were general enough to be used across studies. Usability attributes covered by the questionnaires were: learnability (15), efficiency (12), and satisfaction (11). Memorability (1) was the least covered attribute. Quality appraisal showed that face/content (14) and construct (7) validity were the most frequent types of validity assessed. All questionnaires reported reliability measurement. Some questionnaires scored low in the quality appraisal for the following reasons: limited validity testing (7), small sample size (3), no reporting of user-centeredness (9) or feasibility estimates of time, effort, and expense (7). Existing questionnaires provide a foundation for research on e-health usability. 
However, future research is needed to broaden the coverage of the usability attributes and psychometric properties of the available questionnaires.

  3. Incorporating water table dynamics in climate modeling: 1. Water table observations and equilibrium water table simulations

    NASA Astrophysics Data System (ADS)

    Fan, Ying; Miguez-Macho, Gonzalo; Weaver, Christopher P.; Walko, Robert; Robock, Alan

    2007-05-01

    Soil moisture is a key participant in land-atmosphere interactions and an important determinant of terrestrial climate. In regions where the water table is shallow, soil moisture is coupled to the water table. This paper is the first of a two-part study to quantify this coupling and explore its implications in the context of climate modeling. We examine the observed water table depth in the lower 48 states of the United States in search of salient spatial and temporal features that are relevant to climate dynamics. As a means to interpolate and synthesize the scattered observations, we use a simple two-dimensional groundwater flow model to construct an equilibrium water table as a result of long-term climatic and geologic forcing. Model simulations suggest that the water table depth exhibits spatial organization at watershed, regional, and continental scales, which may have implications for the spatial organization of soil moisture at similar scales. The observations suggest that water table depth varies at diurnal, event, seasonal, and interannual scales, which may have implications for soil moisture memory at these scales.

  4. Six simple questions to detect malnutrition or malnutrition risk in elderly women

    PubMed Central

    Gutiérrez-Gómez, Tranquilina; Cortés, Ernesto; Peñarrieta-de Córdova, Isabel; Gil-Guillén, Vicente Francisco; Ferrer-Diego, Rosa María

    2015-01-01

    Of the numerous instruments available to detect nutritional risk, the most widely used is the Mini Nutritional Assessment (MNA), but it takes 15–20 min to complete and its systematic administration in primary care units is not feasible in practice. We developed a tool to evaluate malnutrition risk that can be completed more rapidly using just clinical variables. Between 2008 and 2013, we conducted a cross-sectional study of 418 women aged ≥60 years from Mexico. Our outcome was a positive MNA, and our secondary variables were: physical activity, diabetes mellitus, hypertension, educational level, dentition, psychological problems, living arrangements, history of falls, age and the number of tablets taken daily. The sample was divided randomly into two groups: construction and validation. Construction: a risk table was constructed to estimate the likelihood of the outcome, and risk groups were formed. Validation: the area under the ROC curve (AUC) was calculated and we compared the expected and the observed outcomes. The following risk factors were identified: physical activity, hypertension, diabetes, dentition, psychological problems and living with the family. The AUC was 0.77 (95% CI [0.68–0.86], p < 0.001). No differences were found between the expected and the observed outcomes (p = 0.902). This study presents a new malnutrition screening test for use in elderly women. The test is based on six very simple, quick and easy-to-evaluate questions, enabling the MNA to be reserved for confirmation. However, it should be used with caution until validation studies have been performed in other geographical areas. PMID:26500824
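
    The reported AUC of 0.77 has a useful rank interpretation: it is the probability that a randomly chosen woman with the outcome receives a higher risk score than one without. A hedged pure-numpy sketch of this Mann-Whitney computation (not the authors' code; the scores below are hypothetical):

```python
import numpy as np

def roc_auc(y_true, scores):
    """Area under the ROC curve via the Mann-Whitney statistic."""
    y = np.asarray(y_true, dtype=bool)
    pos = np.asarray(scores, dtype=float)[y]     # scores of cases
    neg = np.asarray(scores, dtype=float)[~y]    # scores of non-cases
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()  # ties count half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical risk scores: perfect separation yields AUC = 1.0
print(roc_auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))
```

    An AUC of 0.5 corresponds to chance-level discrimination, so the study's 0.77 sits in the moderate range typical of brief screening tools.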

  5. The clinical effectiveness and cost-effectiveness of testing for cytochrome P450 polymorphisms in patients with schizophrenia treated with antipsychotics: a systematic review and economic evaluation.

    PubMed

    Fleeman, N; McLeod, C; Bagust, A; Beale, S; Boland, A; Dundar, Y; Jorgensen, A; Payne, K; Pirmohamed, M; Pushpakom, S; Walley, T; de Warren-Penny, P; Dickson, R

    2010-01-01

    To determine whether testing for cytochrome P450 (CYP) polymorphisms in adults entering antipsychotic treatment for schizophrenia leads to improvement in outcomes, is useful in medical, personal or public health decision-making, and is a cost-effective use of health-care resources. The following electronic databases were searched for relevant published literature: Cochrane Controlled Trials Register, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effectiveness, EMBASE, Health Technology Assessment database, ISI Web of Knowledge, MEDLINE, PsycINFO, NHS Economic Evaluation Database, Health Economic Evaluation Database, Cost-effectiveness Analysis (CEA) Registry and the Centre for Health Economics website. In addition, publicly available information on various genotyping tests was sought from the internet and advisory panel members. A systematic review of analytical validity, clinical validity and clinical utility of CYP testing was undertaken. Data were extracted into structured tables and narratively discussed, and meta-analysis was undertaken when possible. A review of economic evaluations of CYP testing in psychiatry and a review of economic models related to schizophrenia were also carried out. For analytical validity, 46 studies of a range of different genotyping tests for 11 different CYP polymorphisms (most commonly CYP2D6) were included. Sensitivity and specificity were high (99-100%). For clinical validity, 51 studies were found. In patients tested for CYP2D6, an association between genotype and tardive dyskinesia (including Abnormal Involuntary Movement Scale scores) was found. The only other significant finding linked the CYP2D6 genotype to parkinsonism. One small unpublished study met the inclusion criteria for clinical utility. 
One economic evaluation assessing the costs and benefits of CYP testing for prescribing antidepressants and 28 economic models of schizophrenia were identified; none was suitable for developing a model to examine the cost-effectiveness of CYP testing. Tests for determining genotypes appear to be accurate although not all aspects of analytical validity were reported. Given the absence of convincing evidence from clinical validity studies, the lack of clinical utility and economic studies, and the unsuitability of published schizophrenia models, no model was developed; instead key features and data requirements for economic modelling are presented. Recommendations for future research cover both aspects of research quality and data that will be required to inform the development of future economic models.

  6. Relationship between pure Schistosoma haematobium infection in Upper Egypt and irrigation systems. Part 1: methods of study.

    PubMed

    Hammam, H M; Allam, F A; Hassanein, F; El-Garby, M T

    1975-01-01

    Four villages in Assiut Governorate were studied. They were matched for availability and time of introduction of medical services, population size and socioeconomic status. One village had a basin system of irrigation. The other three villages had perennial irrigation introduced at different dates. A sketch map of each village was made showing the location of every house and the irrigation channels. Total coverage was intended in Gezirat El-Maabda (with basin irrigation) and Nazza Karar (with recently introduced perennial irrigation). In El-Ghorayeb and Garf Sarhan (with older systems of perennial irrigation), systematic random samples were studied. The study included a full, double-checked clinical examination of urine and stool samples and a social study. Data on educational level and on activities that bring the individual into contact with canal water were recorded. Tables showing the age and sex distribution of the total population and of the population studied in each village are presented and demonstrate the validity of the samples taken from the population.

  7. Estimated depth to the water table and estimated rate of recharge in outcrops of the Chicot and Evangeline aquifers near Houston, Texas

    USGS Publications Warehouse

    Noble, J.E.; Bush, P.W.; Kasmarek, M.C.; Barbie, D.L.

    1996-01-01

    In 1989, the U.S. Geological Survey, in cooperation with the Harris-Galveston Coastal Subsidence District, began a field study to determine the depth to the water table and to estimate the rate of recharge in outcrops of the Chicot and Evangeline aquifers near Houston, Texas. The study area comprises about 2,000 square miles of outcrops of the Chicot and Evangeline aquifers in northwest Harris County, Montgomery County, and southern Walker County. Because of the scarcity of measurable water-table wells, depth to the water table below land surface was estimated using a surface geophysical technique, seismic refraction. The water table in the study area generally ranges from about 10 to 30 feet below land surface and typically is deeper in areas of relatively high land-surface altitude than in areas of relatively low land-surface altitude. The water table has demonstrated no long-term trends since ground-water development began, with the probable exception of the water table in the Katy area, where the water table is more than 75 feet deep, probably due to ground-water pumpage from deeper zones. An estimated rate of recharge in the aquifer outcrops was computed using the interface method, with environmental tritium as a ground-water tracer. The estimated average total recharge rate in the study area is 6 inches per year. This rate is an upper bound on the average recharge rate during the 37-year period 1953-90 because it is based on the deepest penetration (about 80 feet) of postnuclear-testing tritium concentrations. The rate, which represents one of several components of a complex regional hydrologic budget, is considered reasonable but is not definitive because of uncertainty regarding the assumptions and parameters used in its computation.

  8. Does a perceptuomotor skills assessment have added value to detect talent for table tennis in primary school children?

    PubMed

    Faber, Irene R; Pion, Johan; Munivrana, Goran; Faber, Niels R; Nijhuis-Van der Sanden, Maria W G

    2017-04-18

    Talent detection intends to support lifelong sports participation, reduce dropouts and stimulate sports at the elite level. For this purpose it is important to reveal the specific profile which directs children to the sports that connect to their strengths and preferences. This study evaluated a perceptuomotor skills assessment as part of talent detection for table tennis, a sport in which perceptuomotor skills are considered essential to cope with the difficult technical aspects. Primary school children (n = 121) and gifted young table tennis players (n = 146) were assessed using the Dutch perceptuomotor skills assessment measuring "ball control" and "gross motor function". A discriminant function analysis confirmed the added value by identifying primary school children fitting the table tennis perceptuomotor profile of the young gifted table tennis players (28%). General linear model analyses for the assessment's individual test items showed that the table tennis players outperformed their primary school peers on all "ball control" items (P < 0.001). In conclusion, the assessment appears to be of added value for talent detection in table tennis at this young age. Longitudinal studies need to reveal the predictive value for sports participation and elite sports.

  9. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory

    USGS Publications Warehouse

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-01-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which requires only prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is generally valid only for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method, provided the number of uncertain variables is less than eight.
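
    The two-point estimate method compared against Monte Carlo above can be sketched compactly: each uncertain input is evaluated at its mean plus and minus one standard deviation, and the model is run at all 2**n sign combinations. A hedged illustration assuming independent, symmetrically distributed inputs (hypothetical, not the authors' implementation):

```python
import itertools
import numpy as np

def two_point_estimate(f, means, stds):
    """Rosenblueth-style two-point estimate of the mean and std of f."""
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        vals.append(f(*x))
    vals = np.array(vals)                 # 2**n deterministic evaluations
    return vals.mean(), vals.std(ddof=0)  # equal weights at each point

# Hypothetical linear head response to two uncertain aquifer parameters
mean, std = two_point_estimate(lambda t, k: 2.0 * t + 3.0 * k,
                               means=[1.0, 2.0], stds=[0.1, 0.2])
print(mean, std)
```

    The 2**n cost explains the abstract's caveat: with fewer than about eight uncertain variables the method is far cheaper than Monte Carlo, but the evaluation count doubles with every added variable.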

  10. An Efficient Deterministic-Probabilistic Approach to Modeling Regional Groundwater Flow: 1. Theory

    NASA Astrophysics Data System (ADS)

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-07-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which requires only prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is generally valid only for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method, provided the number of uncertain variables is less than eight.

  11. Assessment of the Broadleaf Crops Leaf Area Index Product from the Terra MODIS Instrument

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Hu, Jiannan; Huang, Dong; Yang, Wenze; Zhang, Ping; Shabanov, Nikolay V.; Knyazikhin, Yuri; Nemani, Ramakrishna R.; Myneni, Ranga B.

    2005-01-01

    The first significant processing of Terra MODIS data, called Collection 3, covered the period from November 2000 to December 2002. The Collection 3 leaf area index (LAI) and fraction of photosynthetically active radiation absorbed by vegetation (FPAR) products for broadleaf crops exhibited three anomalies: (a) high LAI values during the peak growing season, (b) differences in LAI seasonality between the radiative-transfer-based main algorithm and the vegetation-index-based back-up algorithm, and (c) too few retrievals from the main algorithm during the summer period when the crops are at full flush. The cause of these anomalies is a mismatch between reflectances modeled by the algorithm and MODIS measurements. Therefore, the Look-Up Tables accompanying the algorithm were revised and implemented in Collection 4 processing. The main algorithm with the revised Look-Up Tables generated retrievals for over 80% of the pixels with valid data. Retrievals from the back-up algorithm, although few, should be used with caution as they are generated from surface reflectances with high uncertainties.

  12. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    PubMed

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One-hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits of agreement plot, histogram of residues, receiver operating characteristic (ROC) curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986*D_V-M + 0.018*(mass) + 0.014*(age) - 1.008. ROC curves had better discrimination for D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than for D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P < .001. Better accuracy was obtained when locating the C7SP by use of a multivariate model that incorporates palpation and personal information. Copyright © 2016. Published by Elsevier Inc.
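
    The published regression equation can be written directly as a function. The patient values below are illustrative only, and the units (distances in cm, mass in kg, age in years) are assumed from the paper's context.

```python
def predict_dv_c7(dv_m_cm, mass, age):
    """Published model: D_V-C7 = 0.986*D_V-M + 0.018*mass + 0.014*age - 1.008.
    Assumed units: distances in cm, mass in kg, age in years."""
    return 0.986 * dv_m_cm + 0.018 * mass + 0.014 * age - 1.008

# Hypothetical participant: vertex-marker distance 24 cm, 70 kg, 40 years.
dv_c7 = predict_dv_c7(24.0, 70.0, 40.0)
```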

  13. Erratum: ``Spectroscopic Survey of M Dwarfs within 100 Parsecs of the Sun'' (AJ, 130, 1871 [2005])

    NASA Astrophysics Data System (ADS)

    Bochanski, John J.; Hawley, Suzanne L.; Reid, I. Neill; Covey, Kevin R.; West, Andrew A.; Tinney, C. G.; Gizis, John E.

    2006-06-01

    In Table 2 of the recent paper titled ``Spectroscopic Survey of M Dwarfs within 100 Parsecs of the Sun'' by Bochanski et al., the authors presented UVW space velocities, proper motions, radial velocities, and distances to the 574 M dwarfs within their sample. The UVW motions were then examined as a function of vertical distance from the Galactic plane, with a discussion on the significance of the results and their application to dynamic heating models. The authors have discovered an error in the calculation of the UVW motions. During the preparation of the manuscript, the computed space motions were not accurately recorded for a given star, resulting in sporadic errors throughout Table 2 and the subsequent analysis. In addition, the authors want to explicitly state that the UVW motions, corrected to the local standard of rest, are in a right-handed system, with a positive U-velocity in the direction of the Galactic center. The new space velocities for the M dwarfs within this sample affect Tables 2 and 4-6 and Figures 8 and 9. The new values are included below, but the authors stress that the original conclusions presented in § 6 of the original paper remain valid. In the new version of Figure 9, the general decrease in velocity dispersion of the broad component (circles) with distance from the plane is preserved, along with the mostly constant dispersion of the narrow velocity dispersion component (squares). For completeness, a new illustrative demonstration of our kinematic analysis is shown, along with updated versions of Tables 4-6, which present the details of the kinematic analysis for UVW. The authors sincerely regret any confusion introduced by this error and wish to thank Francesca Figueras for helpful discussion.

  14. Groundwater Controls on Vegetation Composition and Patterning in Mountain Meadows

    NASA Astrophysics Data System (ADS)

    Loheide, S. P.; Lowry, C.; Moore, C. E.; Lundquist, J. D.

    2010-12-01

    Mountain meadows are groundwater dependent ecosystems that are hotspots of biodiversity and productivity in the Sierra Nevada of California. Meadow vegetation relies on shallow groundwater during the region’s dry summer growing season. Vegetation composition in this environment is influenced both by 1) oxygen stress that occurs when portions of the root zone are saturated and anaerobic conditions are created that limit root respiration and 2) water stress that occurs when the water table drops and water-limited conditions are created in the root zone. A watershed model that explicitly accounts for snowmelt processes was linked to a fine resolution groundwater flow model of Tuolumne Meadows in Yosemite National Park, CA to simulate spatially distributed water table dynamics. This linked hydrologic model was calibrated to observations from a well observation network for 2006-2008, and validated using data from 2009. A vegetation survey was also conducted at the site in which the three dominant species were identified at more than 200 plots distributed across the meadow. Nonparametric multiplicative regression was performed to create and select the best models for predicting vegetation dominance based on simulated hydrologic regime. The hydrologic niche of three vegetation types representing wet, moist, and dry meadow vegetation communities was best described using both 1) a sum exceedance value calculated as the integral of water table position above a threshold of oxygen stress and 2) a sum deceedance value calculated as the integral of water table position below a threshold of water stress. This linked hydrologic and vegetative modeling framework advances our ability to predict the propagation of human-induced climatic and land-use/-cover changes through the hydrologic system to the ecosystem.
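
    The two hydrologic-niche predictors named above, sum exceedance and sum deceedance, are simple integrals of water-table position relative to stress thresholds. A minimal sketch, with a hypothetical daily series and threshold depths (not values from the study):

```python
def sum_exceedance(levels, threshold, dt=1.0):
    """Integral of water-table position above an oxygen-stress threshold
    (rectangle-rule approximation over a regularly sampled series)."""
    return sum(max(z - threshold, 0.0) * dt for z in levels)

def sum_deceedance(levels, threshold, dt=1.0):
    """Integral of water-table position below a water-stress threshold."""
    return sum(max(threshold - z, 0.0) * dt for z in levels)

# Hypothetical water-table elevations relative to the ground surface
# (m; negative = below ground), sampled at a daily time step.
levels = [-0.1, -0.3, -0.8, -1.2, -0.9, -0.4]
sev = sum_exceedance(levels, threshold=-0.3)   # oxygen (saturation) stress
sdv = sum_deceedance(levels, threshold=-1.0)   # water (drought) stress
```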

  15. Self-Selection of Frequency Tables with Bilateral Mismatches in an Acoustic Simulation of a Cochlear Implant

    PubMed Central

    Fitzgerald, Matthew B.; Prosolovich, Ksenia; Tan, Chin-Tuan; Glassman, E. Katelyn; Svirsky, Mario A.

    2017-01-01

    Background Many recipients of bilateral cochlear implants (CIs) may have differences in electrode insertion depth. Previous reports indicate that when a bilateral mismatch is imposed, performance on tests of speech understanding or sound localization becomes worse. If recipients of bilateral CIs cannot adjust to a difference in insertion depth, adjustments to the frequency table may be necessary to maximize bilateral performance. Purpose The purpose of this study was to examine the feasibility of using real-time manipulations of the frequency table to offset any decrements in performance resulting from a bilateral mismatch. Research Design A simulation of a CI was used because it allows for explicit control of the size of a bilateral mismatch. Such control is not available with users of CIs. Study Sample A total of 31 normal-hearing young adults participated in this study. Data Collection and Analysis Using a CI simulation, four bilateral mismatch conditions (0, 0.75, 1.5, and 3 mm) were created. In the left ear, the analysis filters and noise bands of the CI simulation were the same. In the right ear, the noise bands were shifted higher in frequency to simulate a bilateral mismatch. Then, listeners selected a frequency table in the right ear that was perceived as maximizing bilateral speech intelligibility. Word-recognition scores were then assessed for each bilateral mismatch condition. Listeners were tested both with a standard frequency table, which preserved the bilateral mismatch, and with their self-selected frequency table. Results Consistent with previous reports, bilateral mismatches of 1.5 and 3 mm yielded decrements in word recognition when the standard table was used in both ears. However, when listeners used the self-selected frequency table, performance was the same regardless of the size of the bilateral mismatch. 
Conclusions Self-selection of a frequency table appears to be a feasible method for ameliorating the negative effects of a bilateral mismatch. These data may have implications for recipients of bilateral CIs who cannot adapt to a bilateral mismatch, because they suggest that (1) such individuals may benefit from modification of the frequency table in one ear and (2) self-selection of a “most intelligible” frequency table may be a useful tool for determining how the frequency table should be altered to optimize speech recognition. PMID:28534729
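
    The insertion-depth mismatches quoted in millimetres map to frequency shifts through the standard Greenwood place-frequency function for the human cochlea. The sketch below is an illustration of why a 3 mm basal shift is acoustically large; it is not the paper's vocoder implementation, and the 20 mm reference place is an assumed example.

```python
def greenwood_hz(x_mm):
    """Greenwood place-frequency map for the human cochlea: characteristic
    frequency (Hz) at x_mm millimetres from the apex."""
    return 165.4 * (10.0 ** (0.06 * x_mm) - 0.88)

# A nominal cochlear place 20 mm from the apex versus the same signal
# delivered 3 mm more basally (the largest mismatch in the study).
f_ref = greenwood_hz(20.0)      # about 2.5 kHz
f_shifted = greenwood_hz(23.0)  # about 3.8 kHz
ratio = f_shifted / f_ref       # more than half an octave upward
```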

  16. Estimated Depth to Ground Water and Configuration of the Water Table in the Portland, Oregon Area

    USGS Publications Warehouse

    Snyder, Daniel T.

    2008-01-01

    Reliable information on the configuration of the water table in the Portland metropolitan area is needed to address concerns about various water-resource issues, especially with regard to potential effects from stormwater injection systems such as UIC (underground injection control) systems that are either existing or planned. To help address these concerns, this report presents the estimated depth-to-water and water-table elevation maps for the Portland area, along with estimates of the relative uncertainty of the maps and seasonal water-table fluctuations. The method of analysis used to determine the water-table configuration in the Portland area relied on water-level data from shallow wells and surface-water features that are representative of the water table. However, the largest source of available well data is water-level measurements in reports filed by well constructors at the time of new well installation, but these data frequently were not representative of static water-level conditions. Depth-to-water measurements reported in well-construction records generally were shallower than measurements by the U.S. Geological Survey (USGS) in the same or nearby wells, although many depth-to-water measurements were substantially deeper than USGS measurements. Magnitudes of differences in depth-to-water measurements reported in well records and those measured by the USGS in the same or nearby wells ranged from -119 to 156 feet with a mean of the absolute value of the differences of 36 feet. One possible cause for the differences is that water levels in many wells reported in well records were not at equilibrium at the time of measurement. As a result, the analysis of the water-table configuration relied on water levels measured during the current study or used in previous USGS investigations in the Portland area. 
Because of the scarcity of well data in some areas, the locations of select surface-water features including major rivers, streams, lakes, wetlands, and springs representative of where the water table is at land surface were used to augment the analysis. Ground-water and surface-water data were combined for use in interpolation of the water-table configuration. Interpolation of the two representations typically used to define water-table position - depth to the water table below land surface and elevation of the water table above a datum - can produce substantially different results and may represent the end members of a spectrum of possible interpolations largely determined by the quantity of recharge and the hydraulic properties of the aquifer. Datasets of depth-to-water and water-table elevation for the current study were interpolated independently based on kriging as the method of interpolation with parameters determined through the use of semivariograms developed individually for each dataset. Resulting interpolations were then combined to create a single, averaged representation of the water-table configuration. Kriging analysis also was used to develop a map of relative uncertainty associated with the values of the water-table position. Accuracy of the depth-to-water and water-table elevation maps is dependent on various factors and assumptions pertaining to the data, the method of interpolation, and the hydrogeologic conditions of the surficial aquifers in the study area. Although the water-table configuration maps generally are representative of the conditions in the study area, the actual position of the water-table may differ from the estimated position at site-specific locations, and short-term, seasonal, and long-term variations in the differences also can be expected. The relative uncertainty map addresses some but not all possible errors associated with the analysis of the water-table configuration and does not depict all sources of uncertainty. 
Depth to water greater than 300 feet in the Portland area is limited to parts of the Tualatin Mountains, the foothills of the Cascade Range, and muc
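
    The report's combination step, converting the kriged depth-to-water field to an elevation and averaging it with the independently kriged water-table-elevation field, can be sketched on a toy grid. All grid values below are hypothetical; the kriging itself is assumed to have been done already.

```python
# Hypothetical 2x2 gridded inputs (ft): land-surface elevation and the
# two independently interpolated end-member fields.
land_surface = [[100.0, 102.0], [104.0, 106.0]]
kriged_depth = [[10.0, 11.0], [12.0, 13.0]]    # depth to water below land surface
kriged_elev  = [[91.0, 90.0], [93.0, 92.0]]    # water-table elevation above datum

# Convert the depth field to an elevation, then average the two
# representations into a single water-table surface.
water_table = [
    [0.5 * ((ls - d) + e) for ls, d, e in zip(ls_row, d_row, e_row)]
    for ls_row, d_row, e_row in zip(land_surface, kriged_depth, kriged_elev)
]
# The final depth-to-water map is recovered from the averaged surface.
depth_to_water = [
    [ls - wt for ls, wt in zip(ls_row, wt_row)]
    for ls_row, wt_row in zip(land_surface, water_table)
]
```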

  17. Possibilities and Limits of Validating Spatially Modelled Nitrate Inputs into Groundwater Using the N2/Ar Method

    NASA Astrophysics Data System (ADS)

    Eschenbach, Wolfram; Budziak, Dörte; Elbracht, Jörg; Höper, Heinrich; Krienen, Lisa; Kunkel, Ralf; Meyer, Knut; Well, Reinhard; Wendland, Frank

    2018-06-01

    Valid models for estimating nitrate emissions from agriculture to groundwater are an indispensable forecasting tool. A major challenge for model validation is the spatial and temporal inconsistency between data from groundwater monitoring points and modelled nitrate inputs into groundwater, and the fact that many existing groundwater monitoring wells cannot be used for validation. With the help of the N2/Ar method, groundwater monitoring wells in areas with chemically reduced (denitrifying) groundwater can now be used for model validation. For this purpose, 484 groundwater monitoring wells were sampled in Lower Saxony. For the first time, modelled potential nitrate concentrations in groundwater recharge (from the DENUZ model) were compared with nitrate input concentrations calculated using the N2/Ar method. The results show good agreement between the two methods for glacial outwash plains and moraine deposits. Although nitrate degradation processes in groundwater and soil merge seamlessly in areas with a shallow groundwater table, the DENUZ model calculates denitrification only in the soil zone. In such areas the DENUZ model therefore predicts 27% higher nitrate emissions into the groundwater than the N2/Ar method. To account for the high temporal and spatial variability of nitrate emissions into groundwater, a large number of groundwater monitoring points must be investigated for model validation.

  18. Stopping force and straggling of 0.6-4.7 MeV H, He and Li ions in the polyhydroxybutyrate foil

    NASA Astrophysics Data System (ADS)

    Hsu, J. Y.; Yu, Y. C.; Chen, K. M.

    2010-06-01

    Stopping force and straggling of 0.6-3.5 MeV 1H ions, 2.0-4.7 MeV 4He ions and 1.4-4.4 MeV 7Li ions in the polyhydroxybutyrate (PHB) foil were measured by means of a transmission technique. The measured stopping forces are in good agreement with the SRIM 2008 calculation and the ICRU Report tables, except in the lower energy region. The obtained energy loss straggling deviates from Bohr's value by as much as 23.6% for the energies under study. The validity of Bragg's rule has also been demonstrated for the stopping force and straggling of 1H, 4He and 7Li ions in the PHB foil.
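
    Bragg's additivity rule mentioned above treats a compound's stopping cross-section as the stoichiometry-weighted sum of its elemental cross-sections. The sketch below uses the PHB monomer (C4H6O2) with placeholder elemental values for illustration only; they are not SRIM or ICRU numbers.

```python
def bragg_stopping(composition, elemental_eps):
    """Bragg's additivity rule: stopping cross-section of a compound as the
    stoichiometry-weighted sum of elemental stopping cross-sections.
    The result is per molecule; divide by total atoms for a per-atom figure."""
    return sum(n * elemental_eps[el] for el, n in composition.items())

# Polyhydroxybutyrate monomer C4H6O2; elemental values below are
# placeholder numbers (nominal units: 1e-15 eV cm^2/atom).
phb = {"C": 4, "H": 6, "O": 2}
eps = {"C": 5.0, "H": 1.2, "O": 6.1}
s_molecule = bragg_stopping(phb, eps)
```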

  19. Data analysis report on ATS-F COMSAT millimeter wave propagation experiment, part 1. [effects of hydrometeors on ground to satellite communication

    NASA Technical Reports Server (NTRS)

    Hyde, G.

    1976-01-01

    The 13/18 GHz COMSAT Propagation Experiment (CPE) was performed to measure attenuation caused by hydrometeors along slant paths from transmitting terminals on the ground to the ATS-6 satellite. The effectiveness of site diversity in overcoming this impairment was also studied. Problems encountered in assembling a valid data base of rain induced attenuation data for statistical analysis are considered. The procedures used to obtain the various statistics are then outlined. The graphs and tables of statistical data for the 15 dual frequency (13 and 18 GHz) site diversity locations are discussed. Cumulative rain rate statistics for the Fayetteville and Boston sites based on point rainfall data collected are presented along with extrapolations of the attenuation and point rainfall data.

  20. Administrative database research has unique characteristics that can risk biased results.

    PubMed

    van Walraven, Carl; Austin, Peter

    2012-02-01

    The provision of health care frequently creates digitized data--such as physician service claims, medication prescription records, and hospitalization abstracts--that can be used to conduct studies termed "administrative database research." While most guidelines for assessing the validity of observational studies apply to administrative database research, the unique data source and analytical opportunities for these studies create risks that can make them uninterpretable or bias their results. Nonsystematic review. The risks of uninterpretable or biased results can be minimized by: providing a robust description of the data tables used, focusing on both why and how they were created; measuring and reporting the accuracy of diagnostic and procedural codes used; distinguishing between clinical significance and statistical significance; properly accounting for any time-dependent nature of variables; and analyzing clustered data properly to explore the influence of clustering on study outcomes. This article reviewed these five issues as they pertain to administrative database research to help maximize the utility of these studies for both readers and writers. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Investigation on Suitability of Natural Fibre as Replacement Material for Table Tennis Blade

    NASA Astrophysics Data System (ADS)

    Arifin, A. M. T.; Fahrul Hassan, M.; Ismail, A. E.; Zulafif Rahim, M.; Rasidi Ibrahim, M.; Haq, R. H. Abdul; Rahman, M. N. A.; Yunos, M. Z.; Amin, M. H. M.

    2017-08-01

    This paper presents an investigation of the suitability of natural fibre as a replacement material for table tennis blades, motivated by its low cost, light weight and environmental friendliness. Natural fibres are now often used to replace conventional materials in manufacturing sectors such as automotive and construction. The objective of this study is to investigate and evaluate the suitability of natural fibre materials to replace wood as the structure of a table tennis blade. The mechanical properties of the different natural fibre materials were examined and correlated with the characteristics of a table tennis blade. The natural fibres selected for the study are kenaf (Hibiscus cannabinus), jute, hemp, sisal (Agave sisalana) and ramie. A further comparison was made of the corresponding properties of each type of natural fibre using Quality Function Deployment (QFD) and the Theory of Inventive Problem Solving (TRIZ). TRIZ was used to determine the most appropriate solution for producing a table tennis blade. The results showed that the most appropriate natural fibre for producing a table tennis blade is kenaf. The selection of a suitable natural fibre for the main structure of a table tennis blade is based on the characteristics needed for good blade performance, such as energy absorption, light weight, strength and hardness. The study therefore shows an opportunity to replace existing materials with a higher-strength, lower-cost alternative that is environmentally friendly.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilkenny, J.; Richau, G.; Sangster, C.

    A major goal of the Stockpile Stewardship Program (SSP) is to deliver validated numerical models, benchmarked against experiments that address relevant and important issues and provide data that stress the codes and our understanding. DOE/NNSA has made significant investments in major facilities and high-performance computing to successfully execute the SSP. The more information obtained about the physical state of the plasmas produced, the more stringent the test of theories, models, and codes can be, leading to increased confidence in our predictive capability. To fully exploit the world-leading capabilities of the ICF program, a multi-year program to develop and deploy advanced diagnostics has been developed by the expert scientific community. To formalize these activities, NNSA’s Acting Director for the Inertial Confinement Fusion Program directed the formation and duties of the National Diagnostics Working Group (NDWG) in a Memorandum of 11/3/16 (Appendix A). The NDWG identified eight transformational diagnostics, shown in Table 1, that will provide unprecedented information from experiments in support of the SSP at NIF, Z, and OMEGA. Table 1 shows how the missions of the SSP experiments, including materials, complex hydrodynamics, radiation flow and effects, and thermonuclear burn and boost, will produce new observables, which will be measured using a variety of largely new diagnostic technologies used in the eight transformational diagnostics. The data provided by these diagnostics will validate and improve the physics contained within the SSP’s simulations and both uncover and quantify important phenomena that lie beyond our present understanding.

  3. Research applications for an Object and Action Naming Battery to assess naming skills in adult Spanish-English bilingual speakers.

    PubMed

    Edmonds, Lisa A; Donovan, Neila J

    2014-06-01

    Virtually no valid materials are available to evaluate confrontation naming in Spanish-English bilingual adults in the U.S. In a recent study, a large group of young Spanish-English bilingual adults were evaluated on An Object and Action Naming Battery (Edmonds & Donovan in Journal of Speech, Language, and Hearing Research 55:359-381, 2012). Rasch analyses of the responses resulted in evidence for the content and construct validity of the retained items. However, the scope of that study did not allow for extensive examination of individual item characteristics, group analyses of participants, or the provision of testing and scoring materials or raw data, thereby limiting the ability of researchers to administer the test to Spanish-English bilinguals and to score the items with confidence. In this study, we present the in-depth information described above on the basis of further analyses, including (1) online searchable spreadsheets with extensive empirical (e.g., accuracy and name agreeability) and psycholinguistic item statistics; (2) answer sheets and instructions for scoring and interpreting the responses to the Rasch items; (3) tables of alternative correct responses for English and Spanish; (4) ability strata determined for all naming conditions (English and Spanish nouns and verbs); and (5) comparisons of accuracy across proficiency groups (i.e., Spanish dominant, English dominant, and balanced). These data indicate that the Rasch items from An Object and Action Naming Battery are valid and sensitive for the evaluation of naming in young Spanish-English bilingual adults. Additional information based on participant responses for all of the items on the battery can provide researchers with valuable information to aid in stimulus development and response interpretation for experimental studies in this population.

  4. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032
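
    The data-reduction step described above, ranking observations within subjects and discretizing the ranks before cross-tabulation, can be sketched as follows. The expression values, number of bins, and choice of two components are hypothetical illustrations, not the paper's algorithm in full.

```python
def rank_within_subject(values):
    """Rank observations within one subject (1 = smallest value)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r + 1
    return ranks

def discretize(rank, n, bins=3):
    """Map a rank in 1..n into one of `bins` roughly equal categories."""
    return min((rank - 1) * bins // n, bins - 1)

# Hypothetical expression values for three genes across four subjects;
# ranks are formed within each subject, then two gene/network components
# are cross-tabulated into a contingency table.
subjects = [[2.0, 5.0, 1.0], [3.0, 0.5, 4.0], [1.0, 2.0, 3.0], [9.0, 8.0, 7.0]]
table = [[0] * 3 for _ in range(3)]
for obs in subjects:
    ranks = rank_within_subject(obs)
    a = discretize(ranks[0], len(obs))   # component 1 (gene 1)
    b = discretize(ranks[1], len(obs))   # component 2 (gene 2)
    table[a][b] += 1
```

    Because only within-subject ranks enter the table, between-subject normalization of the raw measurements is unnecessary, which is the normalization advantage the abstract notes.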

  5. A computational study of entropy generation in magnetohydrodynamic flow and heat transfer over an unsteady stretching permeable sheet

    NASA Astrophysics Data System (ADS)

    Saeed Butt, Adnan; Ali, Asif

    2014-01-01

    The present article aims to investigate entropy effects in magnetohydrodynamic flow and heat transfer over an unsteady permeable stretching surface. The time-dependent partial differential equations are converted into non-linear ordinary differential equations by suitable similarity transformations. The solutions of these equations are computed analytically by the Homotopy Analysis Method (HAM) and numerically by a MATLAB built-in routine. Comparison of the obtained results is made with the existing literature under limiting cases to validate our study. The effects of the unsteadiness parameter, magnetic field parameter, suction/injection parameter, Prandtl number, group parameter and Reynolds number on flow and heat transfer characteristics are examined and analysed with the aid of graphs and tables. Moreover, the effects of these parameters on the entropy generation number and Bejan number are also shown graphically. It is found that unsteadiness and the presence of a magnetic field augment entropy production.

  6. XML at the ADC: Steps to a Next Generation Data Archive

    NASA Astrophysics Data System (ADS)

    Shaya, E.; Blackwell, J.; Gass, J.; Oliversen, N.; Schneider, G.; Thomas, B.; Cheung, C.; White, R. A.

    1999-05-01

    The eXtensible Markup Language (XML) is a document markup language that allows users to specify their own tags, to create hierarchical structures to qualify their data, and to support automatic checking of documents for structural validity. It is being intensively supported by nearly every major corporate software developer. Under the funds of a NASA AISRP proposal, the Astronomical Data Center (ADC, http://adc.gsfc.nasa.gov) is developing an infrastructure for importation, enhancement, and distribution of data and metadata using XML as the document markup language. We discuss the preliminary Document Type Definition (DTD, at http://adc.gsfc.nasa.gov/xml) which specifies the elements and their attributes in our metadata documents. This attempts to define both the metadata of an astronomical catalog and the `header' information of an astronomical table. In addition, we give an overview of the planned flow of data through automated pipelines from authors and journal presses into our XML archive and retrieval through the web via the XML-QL Query Language and eXtensible Style Language (XSL) scripts. When completed, the catalogs and journal tables at the ADC will be tightly hyperlinked to enhance data discovery. In addition one will be able to search on fragmentary information. For instance, one could query for a table by entering that the second author is so-and-so or that the third author is at such-and-such institution.

  7. A comparative study of visual reaction time in table tennis players and healthy controls.

    PubMed

    Bhabhor, Mahesh K; Vidja, Kalpesh; Bhanderi, Priti; Dodhia, Shital; Kathrotia, Rajesh; Joshi, Varsha

    2013-01-01

    Visual reaction time is the time required to respond to a visual stimulus. The present study was conducted to measure visual reaction time in 209 subjects: 50 table tennis (TT) players and 159 healthy controls. Visual reaction time was measured with the direct RT computerized software in healthy controls and table tennis players. Simple visual reaction time was measured: during testing, the visual stimulus was presented eighteen times and the average reaction time was taken as the final reaction time. The study shows that table tennis players had faster reaction times than healthy controls. On multivariate analysis, it was found that TT players had reaction times 74.121 sec faster (95% CI 49.4 to 98.8 sec) than non-TT players of the same age and BMI, and that playing TT has a more profound influence on visual reaction time than BMI. Our study concluded that persons involved in sports have faster reaction times than controls. These results support the view that playing table tennis benefits eye-hand reaction time and improves concentration and alertness.

  8. The Communicability of Graphical Alternatives to Tabular Displays of Statistical Simulation Studies

    PubMed Central

    Cook, Alex R.; Teo, Shanice W. L.

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form. PMID:22132184

  10. Quantification of the Effect of Shuttling on Computed Tomography Perfusion Parameters by Investigation of Aortic Inputs on Different Table Positions From Shuttle-Mode Scans of Lung and Liver Tumors.

    PubMed

    Ghosh, Payel; Chandler, Adam G; Hobbs, Brian P; Sun, Jia; Rong, John; Hong, David; Subbiah, Vivek; Janku, Filip; Naing, Aung; Hwu, Wen-Jen; Ng, Chaan S

    The aim of this study was to quantify the effect of shuttling on computed tomography perfusion (CTp) parameters derived from shuttle-mode body CT images using aortic inputs from different table positions. Axial shuttle-mode CT scans were acquired from 6 patients (10 phases, 2 nonoverlapping table positions 1.4 seconds apart) after contrast agent administration. Artifacts resulting from the shuttling motion were corrected with nonrigid registration before computing CTp maps from 4 aortic levels chosen from the most superior and inferior slices of each table position scan. The effect of shuttling on CTp parameters was estimated by mean differences in mappings obtained from aortic inputs in different table positions. Shuttling effect was also quantified using 95% limits of agreement of CTp parameter differences within-table and between-table aortic positions from the interaortic mean CTp values. Blood flow, permeability surface, and hepatic arterial fraction differences were insignificant (P > 0.05) for both within-table and between-table comparisons. The 95% limits of agreement for within-table blood volume (BV) value deviations obtained from lung tumor regions were less than 4.7% (P = 0.18) compared with less than 12.2% (P = 0.003) for between-table BV value deviations. The 95% limits of agreement of within-table deviations for liver tumor regions were less than 1.9% (P = 0.55) for BV and less than 3.2% (P = 0.23) for mean transit time, whereas between-table BV and mean transit time deviations were less than 11.7% (P < 0.01) and less than 14.6% (P < 0.01), respectively. Values for normal liver tissue regions were concordant. Computed tomography perfusion parameters acquired from aortic levels within-table positions generally yielded higher agreement than mappings obtained from aortic levels between-table positions indicating differences due to shuttling effect.
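
    The 95% limits of agreement quoted above follow the standard Bland-Altman construction (bias ± 1.96 × SD of the paired differences). A minimal sketch with invented blood-volume values, not the study's data:

```python
import numpy as np

# Invented paired CTp blood-volume values from two aortic input positions.
a = np.array([5.1, 4.8, 5.5, 5.0, 4.9, 5.3, 5.2, 4.7])  # position 1
b = np.array([5.0, 4.9, 5.4, 5.1, 4.8, 5.2, 5.3, 4.6])  # position 2

diff = a - b
bias = diff.mean()
# 95% limits of agreement: bias +/- 1.96 * SD of the differences.
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
```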

  11. Community-wide validation of geospace model local K-index predictions to support model transition to operations

    NASA Astrophysics Data System (ADS)

    Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.

    2016-07-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
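
    The contingency-table analysis named above can be made concrete with a small sketch. The counts are invented, and the Heidke skill score is picked here only as one representative metric; the validation used a broader suite of skill scores and distribution metrics.

```python
# Invented yes/no contingency counts for a disturbance-threshold forecast.
hits, misses, false_alarms, correct_neg = 42, 8, 11, 139
n = hits + misses + false_alarms + correct_neg

pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false alarm ratio

# Heidke skill score: fraction of correct forecasts beyond chance agreement.
expected = ((hits + misses) * (hits + false_alarms)
            + (correct_neg + misses) * (correct_neg + false_alarms)) / n
hss = (hits + correct_neg - expected) / (n - expected)
```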

  12. Community-Wide Validation of Geospace Model Local K-Index Predictions to Support Model Transition to Operations

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Rastaetter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; et al.

    2016-01-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.

  13. A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking.

    PubMed

    Alessi, Lauren J; Warmus, Holly R; Schaffner, Erin K; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M

    2018-03-01

    Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children's hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods are organized in a plan-do-study-act cycle. During the "plan" phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the "do" phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool, and a regimented response to high-risk patients. Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions, EHR definitions identified 1,433 (23.3%) with sepsis between July and December 2016, of which 159 (11.1%) had septic shock. Hospital mortality for all sepsis patients was 2.2% and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45-8.2% and mortality range of 8.2-25% (Table 2).1-5 Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach, and proving useful for tracking performance improvement.
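
    The electronic definitions quoted above (sepsis: blood culture and antibiotic within 24 hours; septic shock: sepsis plus a vasoactive medication) translate directly into a computable rule. The event structure and field names below are invented for illustration; only the time-window logic comes from the abstract.

```python
from datetime import datetime, timedelta

def classify(events):
    """Apply the sepsis / septic-shock definitions to a list of EHR events.

    Each event is a dict with invented keys "type" and "time".
    """
    cultures = [e["time"] for e in events if e["type"] == "blood_culture"]
    abx = [e["time"] for e in events if e["type"] == "antibiotic"]
    vaso = any(e["type"] == "vasoactive" for e in events)
    # Sepsis: any blood culture and antibiotic within 24 hours of each other.
    sepsis = any(abs(c - a) <= timedelta(hours=24) for c in cultures for a in abx)
    return {"sepsis": sepsis, "septic_shock": sepsis and vaso}

t0 = datetime(2016, 7, 1, 8, 0)
admission = [
    {"type": "blood_culture", "time": t0},
    {"type": "antibiotic", "time": t0 + timedelta(hours=3)},
    {"type": "vasoactive", "time": t0 + timedelta(hours=5)},
]
result = classify(admission)
```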

  14. Analyzing thematic maps and mapping for accuracy

    USGS Publications Warehouse

    Rosenfield, G.H.

    1982-01-01

    Two problems exist when testing the accuracy of thematic maps and mapping: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for either individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated approach would be to analyze the entire classification error matrices using the methods of discrete multivariate analysis or of multivariate analysis of variance.
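
    The error-matrix arithmetic described above is straightforward to compute. The matrix below is invented: rows are the interpretation, columns the verification, as in the text.

```python
import numpy as np

# Invented 3-class classification error matrix (rows: interpretation,
# columns: verification). Diagonal = correct classifications.
m = np.array([[50, 3, 2],
              [4, 45, 6],
              [1, 2, 37]])

overall_accuracy = np.trace(m) / m.sum()
# Off-diagonal remainder of each row: errors of commission.
commission_error = 1 - np.diag(m) / m.sum(axis=1)
# Off-diagonal remainder of each column: errors of omission.
omission_error = 1 - np.diag(m) / m.sum(axis=0)
```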

  15. Seismic velocities to characterize the soil-aquifer continuum on the Orgeval experimental basin (France)

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Ludovic, B.; Dhemaied, A.; Flipo, N.; Guérin, R.; Mouhri, A.; Faycal, R.; Vitale, Q.

    2013-12-01

    Among geophysical methods applied to hydrogeology, seismic prospecting is frequently confined to the characterization of aquifer geometry. The combined study of pressure- (P) and shear- (SH) wave velocities (respectively Vp and Vs) can, however, provide information about aquifer parameters, as is commonly done for most fluids in hydrocarbon exploration. This approach has recently been proposed in sandy aquifers with the estimation of the Vp/Vs ratio. In order to address such issues in more complex aquifer systems (e.g. unconsolidated, heterogeneous or low-permeability media) we carried out P- and SH-wave seismic surveys on the Orgeval experimental basin (70 km east of Paris, France). This basin drains a multi-layer aquifer system monitored by a network of piezometers. The upper part of the aquifer system is characterized by tabular layers well delineated all over the basin thanks to Electrical Resistivity Tomography (ERT), Time Domain ElectroMagnetic (TDEM) soundings and wells. But the lateral variability of the intrinsic properties in each layer raises questions regarding the hydrodynamics of the upper aquifer and the validity of interpolations between piezometers. A simple interpretation of P- and SH-wave first arrivals for tabular models provides 1D velocity structures in very good agreement with the stratification anticipated from ERT and nearby geological logs. Vp/Vs ratios show a strong contrast at a depth consistent with the observed water table level, reinforcing the assumption of a free upper aquifer in the area. Similar experiments have to be conducted under different hydrological conditions to validate these observations. Anticipating the need to propose lateral applications of the method, we additionally performed tomographic inversions of the recorded data to retrieve 2D Vp and Vs models. If interpreted independently, both models fail to depict the stratification of the medium and the water table level cannot be straightforwardly identified. 
However, the computation of Vp/Vs ratios and derived parameters helps enhance lithological contrasts.
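
    The Vp/Vs contrast at the water table can be sketched numerically: saturation raises Vp sharply while leaving Vs almost unchanged, so the ratio jumps below the water table. The velocities below are invented round numbers, not the Orgeval data; the Poisson's-ratio formula is the standard elastic relation for a velocity ratio r = Vp/Vs.

```python
def poisson_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities: (r^2 - 2) / (2(r^2 - 1))."""
    r2 = (vp / vs) ** 2
    return (r2 - 2) / (2 * (r2 - 1))

# Invented velocities (m/s) for layers above and below the water table.
unsaturated = {"vp": 400.0, "vs": 220.0}
saturated = {"vp": 1600.0, "vs": 250.0}

ratio_unsat = unsaturated["vp"] / unsaturated["vs"]
ratio_sat = saturated["vp"] / saturated["vs"]
```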

  16. Use of existing patient-reported outcome (PRO) instruments and their modification: the ISPOR Good Research Practices for Evaluating and Documenting Content Validity for the Use of Existing Instruments and Their Modification PRO Task Force Report.

    PubMed

    Rothman, Margaret; Burke, Laurie; Erickson, Pennifer; Leidy, Nancy Kline; Patrick, Donald L; Petrie, Charles D

    2009-01-01

    Patient-reported outcome (PRO) instruments are used to evaluate the effect of medical products on how patients feel or function. This article presents the results of an ISPOR task force convened to address good clinical research practices for the use of existing or modified PRO instruments to support medical product labeling claims. The focus of the article is on content validity, with specific reference to existing or modified PRO instruments, because of the importance of content validity in selecting or modifying an existing PRO instrument and the lack of consensus in the research community regarding best practices for establishing and documenting this measurement property. Topics addressed in the article include: definition and general description of content validity; PRO concept identification as the important first step in establishing content validity; instrument identification and the initial review process; key issues in qualitative methodology; and potential threats to content validity, with three case examples used to illustrate types of threats and how they might be resolved. A table of steps used to identify and evaluate an existing PRO instrument is provided, and figures are used to illustrate the meaning of content validity in relationship to instrument development and evaluation. RESULTS & RECOMMENDATIONS: Four important threats to content validity are identified: unclear conceptual match between the PRO instrument and the intended claim, lack of direct patient input into PRO item content from the target population in which the claim is desired, no evidence that the most relevant and important item content is contained in the instrument, and lack of documentation to support modifications to the PRO instrument. In some cases, threats to content validity identified in a specific application may be reduced through additional, well-documented qualitative studies that specifically address the issue of concern.
Published evidence of the content validity of a PRO instrument for an intended application is often limited. Such evidence is, however, important in evaluating the adequacy of a PRO instrument for the intended application. This article provides an overview of key issues involved in assessing and documenting content validity as it relates to using existing instruments in the drug approval process.

  17. Mobility Analyses of Standard- and High-Mobility Tactical Support Vehicles (HIMO Study)

    DTIC Science & Technology

    1976-02-01

    APPENDIX G: Participants in Scenario Exercises. List of Tables: Table 1, Summary of Vehicle Characteristics. The remainder of the snippet is soil-classification table residue: organic silts and clays (plastic), peat (non-plastic); LL = liquid limit, PI = plasticity index; drainage potential classified by occurrence of the water table (Class 0: water table occurs at ...).

  18. Utilization of Facial Image Analysis Technology for Blink Detection: A Validation Study.

    PubMed

    Kitazawa, Momoko; Yoshimura, Michitaka; Liang, Kuo-Ching; Wada, Satoshi; Mimura, Masaru; Tsubota, Kazuo; Kishimoto, Taishiro

    2018-06-25

    The assessment of anterior eye diseases and the understanding of psychological functions of blinking can benefit greatly from a validated blinking detection technology. In this work, we proposed an algorithm based on facial recognition built on current video processing technologies to automatically filter and analyze blinking movements. We compared electrooculography (EOG), the gold standard of blinking measurement, with manual video tape recording counting (mVTRc) and our proposed automated video tape recording analysis (aVTRa) in both static and dynamic conditions to validate our aVTRa method. We measured blinking in both the static condition, where the subject sat still with the chin fixed on a table, and the dynamic condition, where the subject's face was not fixed and natural communication was taking place between the subject and interviewer. Concordance of blinks between measurement methods was defined as a difference of less than 50 ms in eye-opening and eye-closing times. The subjects were seven healthy Japanese volunteers (three male, four female) without significant eye disease, with a mean age of 31.4±7.2 years. The concordance of EOG vs. aVTRa, EOG vs. mVTRc, and aVTRa vs. mVTRc (average±SD) was found to be 92.2±10.8%, 85.0±16.5%, and 99.6±1.0% in static conditions and 32.6±31.0%, 28.0±24.2%, and 98.5±2.7% in dynamic conditions, respectively. In static conditions, we found a high blink concordance rate between the proposed aVTRa and EOG, and confirmed the validity of aVTRa in both static and dynamic conditions.
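
    The 50 ms concordance criterion can be sketched as a greedy one-to-one matching of blink onset times between two methods. The timestamps below are invented; only the tolerance comes from the abstract.

```python
def concordance(ref, test, tol_ms=50):
    """Percent of reference blinks matched one-to-one within tol_ms."""
    matched, used = 0, set()
    for r in ref:
        for i, t in enumerate(test):
            if i not in used and abs(r - t) < tol_ms:
                matched, used = matched + 1, used | {i}
                break
    return 100.0 * matched / len(ref)

# Invented blink onsets (ms): EOG as reference, aVTRa as test method.
eog = [100, 900, 1700, 2600, 3500]
avtra = [110, 940, 1760, 3490]

rate = concordance(eog, avtra)
```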

  19. Development of Assessment Instrument of Critical Thinking in Physics at Senior High School

    NASA Astrophysics Data System (ADS)

    Sugiarti, T.; Kaniawati, I.; Aviyanti, L.

    2017-02-01

    The result of a preliminary study shows that physics assessment in schools did not train students' critical thinking skills; the assessment instruments measured only low-level cognitive aspects. Ideally, critical thinking skill would be trained through assessment activities. This study aims to determine the characteristics and quality of a critical thinking skill instrument. It employs a descriptive-qualitative method with research and development as the research design. The participants were 35 students in the limited trial and 188 students in the wider trial, drawn from three high-level public senior high schools in Ciamis. The data were collected through expert validation, tests and interviews. The results indicate that the assessment instrument of critical thinking skill is characteristically open-ended. The instrument covers the indicators of analyzing arguments, deduction and induction, and displays information in the form of scenarios, text, graphics and tables. In addition, data processing with the Anates V4 program shows a reliability of 0.67 (interpreted as high) and a validity of 0.47 (interpreted as sufficient). Thus, the open-ended essay assessment instrument of critical thinking skill meets the test-quality criteria, so it can be used as an instrument for assessing critical thinking skill.

  20. The National Cancer Institute diet history questionnaire: validation of pyramid food servings.

    PubMed

    Millen, Amy E; Midthune, Douglas; Thompson, Frances E; Kipnis, Victor; Subar, Amy F

    2006-02-01

    The performance of the National Cancer Institute's food frequency questionnaire, the Diet History Questionnaire (DHQ), in estimating servings of 30 US Department of Agriculture Food Guide Pyramid food groups was evaluated in the Eating at America's Table Study (1997-1998), a nationally representative sample of men and women aged 20-79 years. Participants who completed four nonconsecutive, telephone-administered 24-hour dietary recalls (n = 1,301) were mailed a DHQ; 965 respondents completed both the 24-hour dietary recalls and the DHQ. The US Department of Agriculture's Pyramid Servings Database was used to estimate intakes of pyramid servings for both diet assessment tools. The correlation (rho) between DHQ-reported intake and true intake and the attenuation factor (lambda) were estimated using a measurement error model with repeat 24-hour dietary recalls as the reference instrument. Correlations for energy-adjusted pyramid servings of foods ranged from 0.43 (other starchy vegetables) to 0.84 (milk) among women and from 0.42 (eggs) to 0.80 (total dairy food) among men. The mean rho and lambda after energy adjustment were 0.62 and 0.60 for women and 0.63 and 0.66 for men, respectively. This food frequency questionnaire validation study of foods measured in pyramid servings allowed for a measure of food intake consistent with national dietary guidance.
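
    A deliberately simplified sketch of the attenuation factor: with repeat recalls as the reference instrument, lambda can be estimated as the slope from regressing the recall mean on the FFQ report. The study's measurement-error model is richer (recall error structure, energy adjustment); all numbers below are synthetic.

```python
import numpy as np

# Synthetic intake data: n participants, k repeat 24-hour recalls each.
rng = np.random.default_rng(1)
n, k = 500, 4
true = rng.normal(2.0, 0.5, n)                 # "true" servings/day
ffq = 0.3 + 0.8 * true + rng.normal(0, 0.4, n)  # FFQ report with error
recalls = true[:, None] + rng.normal(0, 0.6, (n, k))
recall_mean = recalls.mean(axis=1)

# Attenuation factor: slope of recall mean on FFQ (cov/var form).
lam = np.cov(ffq, recall_mean)[0, 1] / np.var(ffq, ddof=1)
# Simple (not deattenuated) correlation between FFQ and the reference mean.
rho = np.corrcoef(ffq, recall_mean)[0, 1]
```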

  1. Radiation exposure during in-situ pinning of slipped capital femoral epiphysis hips: does the patient positioning matter?

    PubMed

    Mohammed, Riazuddin; Johnson, Karl; Bache, Ed

    2010-07-01

    Multiple radiographic images may be necessary during the standard procedure of in-situ pinning of slipped capital femoral epiphysis (SCFE) hips. This procedure can be performed with the patient positioned on a fracture table or a radiolucent table. Our study examines differences in the amount and duration of radiation exposure for in-situ pinning of SCFE performed using a traction table or a radiolucent table. Cumulative radiation exposure for sixteen hips pinned on a radiolucent table (thirteen patients) was compared with that for 35 hips pinned on a fracture table (33 patients) during the same time period. Cumulative radiation dose was measured as dose area product in Gy·cm², and the duration of exposure was measured in minutes. Appropriate statistical tests were used to test the significance of any differences. The mean cumulative radiation dose for hips pinned on the radiolucent table was significantly lower than for those pinned on the fracture table (P<0.05). The mean duration of radiation exposure on either table was not significantly different. Lateral projections may increase radiation doses compared with anteroposterior projections because of the higher exposure parameters needed for side imaging. Our results showing decreased exposure doses on the radiolucent table are probably explained by the ease of obtaining a frog-leg lateral position, and thereby the ease of lateral imaging. In-situ pinning of SCFE hips on a radiolucent table has the additional advantage that the radiation dose during the procedure is significantly lower than when the procedure is performed on a fracture table.

  2. Online tables of contents for books: effect on usage*

    PubMed Central

    Morris, Ruth C.

    2001-01-01

    Objectives: To explore whether the presence of online tables of contents (TOC) in an online catalog affects circulation (checkouts and inhouse usage). Two major questions were posed: (1) did the presence of online tables of contents for books increase use, and, (2) if it did, what factors might cause the increase? Method: A randomized and stratified design was used in tracking usage of 3,957 book titles that were previously divided into two groups: one with TOC and one without TOC. Stratification was done for year of imprint, location, subject, previous use, circulating or non-circulating status, and presence of TOC. The use was tracked by the online catalog statistics in the InnoPac online catalog for fourteen months. Results: The study found that tables of contents do increase usage. It also showed a correlation in the size of the effect based on the currency of the titles. In general, even after adjusting for all of the variables (publication date, location, circulation status, subject, and previous use), the odds of a title being used increased by 45% if the titles had online tables of contents, a statistically significant impact at the 0.05 level. Conclusions: This case-control study presents new information about the impact on circulation and inhouse use when tables of contents for books are added to the online catalog record. The study helps to establish the positive role of tables of contents in online catalogs. The research establishes TOC as a major parameter that can be successfully studied using quantitative methods. The study also provides information professionals with some guidance on when enhancement of TOC is likely to be most effective in increasing the use of existing collections. PMID:11209798

  3. Online tables of contents for books: effect on usage.

    PubMed

    Morris, R C

    2001-01-01

    To explore whether the presence of online tables of contents (TOC) in an online catalog affects circulation (checkouts and inhouse usage). Two major questions were posed: (1) did the presence of online tables of contents for books increase use, and, (2) if it did, what factors might cause the increase? A randomized and stratified design was used in tracking usage of 3,957 book titles that were previously divided into two groups: one with TOC and one without TOC. Stratification was done for year of imprint, location, subject, previous use, circulating or non-circulating status, and presence of TOC. The use was tracked by the online catalog statistics in the InnoPac online catalog for fourteen months. The study found that tables of contents do increase usage. It also showed a correlation in the size of the effect based on the currency of the titles. In general, even after adjusting for all of the variables (publication date, location, circulation status, subject, and previous use), the odds of a title being used increased by 45% if the titles had online tables of contents, a statistically significant impact at the 0.05 level. This case-control study presents new information about the impact on circulation and inhouse use when tables of contents for books are added to the online catalog record. The study helps to establish the positive role of tables of contents in online catalogs. The research establishes TOC as a major parameter that can be successfully studied using quantitative methods. The study also provides information professionals with some guidance on when enhancement of TOC is likely to be most effective in increasing the use of existing collections.
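
    The reported 45% increase in the odds of use can be expressed as an odds ratio from a 2×2 table. The counts below are invented to yield roughly that effect size; the study's estimate additionally adjusted for publication date, location, circulation status, subject, and previous use.

```python
# Invented 2x2 counts: titles used vs. unused, with vs. without online TOC.
used_toc, unused_toc = 580, 1400     # titles with TOC
used_no, unused_no = 440, 1537       # titles without TOC

odds_toc = used_toc / unused_toc
odds_no = used_no / unused_no
odds_ratio = odds_toc / odds_no      # ~1.45 -> ~45% higher odds of use
```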

  4. A time motion study in the immunization clinic of a tertiary care hospital of Kolkata, West Bengal.

    PubMed

    Chattopadhyay, Amitabha; Ghosh, Ritu; Maji, Sucharita; Ray, Tapobroto Guha; Lahiri, Saibendu Kumar

    2012-01-01

    A time and motion study is used to determine the amount of time required for a specific activity, work function, or mechanical process. Few such studies have been reported in the outpatient departments of institutions, and studies based exclusively on the immunization clinic of an institute are a rarity. This was an observational cross-sectional study done in the immunization clinic of R.G. Kar Medical College, Kolkata, over a period of 1 month (September 2010). The study population included mothers/caregivers attending the immunization clinic with their children. The total sample was 482. Pre-synchronized stopwatches were used to record service delivery time at the different activity points. Median time was the same for both the initial registration table and the nutrition and health education table (120 seconds), but the vaccination and post-vaccination advice table took the highest percentage of overall time (46.3%). Maximum time spent on the vaccination and post-vaccination advice table was on Monday (538.1 s), and the nutritional assessment and health assessment table took maximum time on Friday (217.1 s). Time taken in the first half of an immunization session was greater at most of the tables. The goal of achieving universal immunization against vaccine-preventable diseases requires a multifaceted, collated response from many stakeholders; efficient functioning of immunization clinics is therefore required to achieve the prescribed goals. This study is an initial effort to examine the utilization of time at a health care unit and invites much more in-depth analysis in the future.

  5. Women in the Civil Engineer Corps.

    DTIC Science & Technology

    1986-01-01

    Snippet contains questionnaire residue (e.g., "Have you experienced sexual harassment on the job? often / occasionally / rarely / never") and table-of-contents entries: Professional Assignments; Sexual Harassment/Discrimination; Related Studies and Literature; IV. Research Methodology; Table 13: Question 16, Work Harder; Table 14: Question 17, Measure Up; Table 15: Question 18, Sexual Harassment.

  6. Transport of E. coli in a sandy soil as impacted by depth to water table.

    PubMed

    Stall, Christopher; Amoozegar, Aziz; Lindbo, David; Graves, Alexandria; Rashash, Diana

    2014-01-01

    Septic systems are considered a source of groundwater contamination. In the study described in this article, the fate of microbes applied to a sandy loam soil from the North Carolina coastal plain was studied as impacted by depth to the water table. Soil materials were packed to a depth of 65 cm in 17 columns (15-cm diameter), and a water table was established at 30, 45, and 60 cm depths using five replications. Each day, 200 mL of an artificial septic tank effluent inoculated with E. coli were applied to the top of each column, a 100-mL sample was collected at the water table level and analyzed for E. coli, and 100 mL was drained from the bottom to maintain the water table. Two columns were used as controls and received 200 mL/day of sterilized effluent. Neither 30 nor 45 cm of unsaturated soil was adequate to attenuate bacterial contamination, while 60 cm of separation appeared to be sufficient. Little bacterial contamination moved with the water table when it was lowered from 30 to 60 cm.

  7. Misconceptions impairing the validity of the stopping power tables in the SRIM library and suggestions for doing better in the future

    NASA Astrophysics Data System (ADS)

    Wittmaack, Klaus

    2016-08-01

    The popular SRIM library delivers data and simulation codes for describing the slowing down of energetic atoms in matter. This study explored the validity of the tables containing cross sections for electronic and nuclear stopping together with the respective range parameters. The electronic stopping cross sections Se were produced, much like in previous and subsequent attempts of other groups, by bringing together the limited number of available experimental results in the form of ratios, r(Z1, He, υ) = Se(Z1, Z2, υ)/Se(He, Z2, υ). Z1 and Z2 denote the atomic numbers of projectiles (velocity υ) and target atoms, respectively. Reference data for He impact are available in reasonable volume only for about 15% of all solid targets; missing information has to be derived by interpolation. The concept of data evaluation is based on the assumption that the Bethe-Bloch (BB) theory, developed for bare projectiles, can serve as a guide at all velocities. For the purpose in question, the theory features an indispensable property: Z1 and Z2 appear as straight pre-factors, not in coupled form, Se,BB(υ) ∝ Z1^2 Z2 L(Z2, υ), with L(Z2, υ) representing the stopping number. Therefore, Z2 and L(Z2, υ) cancel out when taking ratios for fixed Z1 so that data for arbitrary Z2 may be combined in one set of r-values to determine the best fit rfit(Z1, He, υ). The stopping cross sections of interest are then derived as Se(Z1, Z2, υ) = rfit(Z1, He, υ) Se(He, Z2, υ). At low energies these results were often refined, Se ⇒ Se,f, to account for experimental data that did not comply with the anticipated Z2-independent trend. Major findings and suggestions are as follows: (i) At high velocities, υ > 4Z1^(2/3) υ0 (Bohr velocity υ0), the SRIM predictions (have to) rely primarily on BB theory. (ii) At intermediate velocities, i.e., around the Bragg peak, SRIM appears to produce reasonable results (ca. ±15%). 
    (iii) Below the Bragg peak the tabulated data often deviate strongly and inconsistently from the predictions of Lindhard-Scharff (LS) theory; they also exhibit various forms of exotic velocity dependence. These deviations are primarily due to the fact that the range of validity of BB theory is artificially extended to velocities at which the 'effective-charge' concept is assumed to be applicable. Coupled Z1,2 scaling as in theories of LS or Firsov would be much more appropriate. Overall, the electronic stopping cross sections by SRIM are of unpredictable value and often strongly misleading below 1 MeV/u. (iv) Another consequence of the tight link to the Z1,2 dependence of BB theory is that only 2 × 92 master sets of electronic stopping cross sections were required to generate all conceivable 89 × 92 tables from Se,f-ratios for elemental targets (the tables for H, He and Li projectiles are derived separately). The information contained in the SRIM library at large thus exhibits a highly redundant character. (v) The nuclear stopping cross sections Sn mirror the predictions of the universal potential due to Ziegler, Biersack and Littmark, which differ from alternative suggestions typically by less than 15%. With this uncertainty, range distributions may be calculated with the TRIM program of SRIM, but only at energies where Sn dominates so that uncertainties in Se play a minor role. (vi) As a side aspect, an example is presented illustrating the efforts required to identify incorrect experimental data, notably when respected authors are accountable. (vii) Other approaches to establish stopping power tables are shown to be subject to the same problems as SRIM. It is recommended to add a warning to all these tables, informing users at which energies the data are likely to lack reliability. 
    (viii) The currently unacceptable quality of Se,f-data below 1 MeV/u could be improved significantly in the future if the user-friendly TRIM(SRIM) code were modified to allow simulations with a free choice of nuclear and electronic stopping cross sections. Researchers would thus be enabled to identify optimum input parameters for reproducing measured range distributions at energies at which Sn and Se are often of similar magnitude, so that their contributions to the total energy loss are difficult to quantify without simulations.
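The ratio-based construction of Se described in this record can be sketched as follows. This is a toy illustration with made-up numbers: the fitted ratio function and the He reference table here are hypothetical stand-ins, not SRIM data.

```python
# Sketch of the ratio-based construction of electronic stopping cross
# sections: Se(Z1, Z2, v) = rfit(Z1, He, v) * Se(He, Z2, v).
# All numbers below are illustrative placeholders, not SRIM data.

def se_from_ratio(rfit, se_he, z1, z2, v):
    """Derive Se(Z1, Z2, v) from the fitted ratio and the He reference table.

    rfit  -- callable rfit(z1, v) giving Se(Z1,Z2,v)/Se(He,Z2,v); by
             construction the ratio is assumed independent of Z2
    se_he -- dict mapping (z2, v) to the He-impact reference Se(He, Z2, v)
    """
    return rfit(z1, v) * se_he[(z2, v)]

# Toy fitted ratio: for bare projectiles Bethe-Bloch gives r ~ (Z1/2)^2,
# since Se,BB scales as Z1^2 and He has Z1 = 2.
toy_rfit = lambda z1, v: (z1 / 2.0) ** 2

# Toy He-impact reference values (arbitrary units).
toy_se_he = {(13, 1.0): 4.0, (29, 1.0): 6.0}

# Carbon (Z1 = 6) in aluminium (Z2 = 13): ratio 9, so Se = 9 * 4.0 = 36.0
print(se_from_ratio(toy_rfit, toy_se_he, z1=6, z2=13, v=1.0))
```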

  8. Ada Compiler Validation Summary Report: Certificate Number: 890409W1. 10063, TeleSoft, TeleGen2 Ada Development System for VAX to E68K, Version 3.23, VAXserver 3602 Host and Motorola MVME117 (MC68010) Target

    DTIC Science & Technology

    1989-04-09

    AVF Control Number: AVF-VSP-267.0539 69-01-25-TEL Ada COMPILER VALIDATION SUMMARY REPORT: Certificate Number...pragma images controls the creation and allocation of the image table for a specified enumeration type. The default is Deferred, which saves space in the

  9. LACIE performance predictor final operational capability program description, volume 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.

  10. An Exploration of the Nightstand and Over-the-Bed Table in an Inpatient Rehabilitation Hospital.

    PubMed

    Healy, Stan; Manganelli, Joe; Rosopa, Patrick J; Brooks, Johnell O

    2015-01-01

    This study seeks to determine where patients in a rehabilitation hospital keep the greatest percentage of their belongings, that is, in/on the nightstand or on the over-the-bed table. This study provides an inventory of patient items located on the over-the-bed table and in/on the nightstand. Understanding the functions of furnishings within the patient room is key for future preparation for designing a next-generation over-the-bed table or for redesigning a more useful nightstand. The contents on the top of the nightstand; the contents in the top, middle, and bottom drawers of the nightstand; items next to the nightstand; and the contents on the over-the-bed table within patient rooms were inventoried and placed into categories using similar, patient item categories as the Brooks et al. (2011) study, which examined the contents of the nightstand and the over-the-bed table in assisted living and skilled nursing facilities. Overall, patients in a rehabilitation hospital had a greater percentage of their belongings on the top of the nightstand as compared to their belongings located in all three combined drawers of the nightstand. Overall, patients had a greater percentage of their belongings located on the over-the-bed table as compared to their belongings located on the nightstand. Tabletop surface area was used extensively in patient rooms at a rehabilitation hospital, but nightstand drawers were underutilized. © The Author(s) 2015.

  11. Unraveling past impacts of climate change and land management on historic peatland development using proxy-based reconstruction, monitoring data and process modeling.

    PubMed

    Heinemeyer, Andreas; Swindles, Graeme T

    2018-05-08

    Peatlands represent globally significant soil carbon stores that have been accumulating for millennia under water-logged conditions. However, deepening water-table depths (WTD) from climate change or human-induced drainage could stimulate decomposition resulting in peatlands turning from carbon sinks to carbon sources. Contemporary WTD ranges of testate amoebae (TA) are commonly used to predict past WTD in peatlands using quantitative transfer function models. Here we present, for the first time, a study comparing TA-based WTD reconstructions to instrumentally monitored WTD and hydrological model predictions using the MILLENNIA peatland model to examine past peatland responses to climate change and land management. Although there was very good agreement between monitored and modeled WTD, TA-reconstructed water table was consistently deeper. Predictions from a larger European TA transfer function data set were wetter, but the overall directional fit to observed WTD was better for a TA transfer function based on data from northern England. We applied a regression-based offset correction to the reconstructed WTD for the validation period (1931-2010). We then predicted WTD using available climate records as MILLENNIA model input and compared the offset-corrected TA reconstruction to MILLENNIA WTD predictions over an extended period (1750-1931) with available climate reconstructions. Although the comparison revealed striking similarities in predicted overall WTD patterns, particularly for a recent drier period (1965-1995), there were clear periods when TA-based WTD predictions underestimated (i.e. drier during 1830-1930) and overestimated (i.e. wetter during 1760-1830) past WTD compared to MILLENNIA model predictions. Importantly, simulated grouse moor management scenarios may explain the drier TA WTD predictions, resulting in considerable model predicted carbon losses and reduced methane emissions, mainly due to drainage. 
This study demonstrates the value of a site-specific and combined data-model validation step toward using TA-derived moisture conditions to understand past climate-driven peatland development and carbon budgets alongside modeling likely management impacts. © 2018 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
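A regression-based offset correction of the kind mentioned above can be sketched as follows. The data and function names are hypothetical illustrations, not values or code from the study; the authors' actual correction may differ in detail.

```python
# Sketch of a regression-based offset correction between proxy-reconstructed
# and monitored water-table depths (WTD). All data are hypothetical.

def fit_offset(reconstructed, observed):
    """Fit observed ~ a + b * reconstructed by ordinary least squares
    over the validation period; returns intercept a and slope b."""
    n = len(reconstructed)
    mx = sum(reconstructed) / n
    my = sum(observed) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(reconstructed, observed))
    sxx = sum((x - mx) ** 2 for x in reconstructed)
    b = sxy / sxx
    a = my - b * mx
    return a, b

def apply_offset(a, b, reconstructed):
    """Apply the calibration to (e.g. pre-instrumental) reconstructions."""
    return [a + b * x for x in reconstructed]

# Hypothetical validation-period data: the TA reconstruction is
# consistently ~5 cm deeper than the monitored WTD.
ta_wtd  = [15.0, 20.0, 25.0, 30.0]   # reconstructed depth (cm)
obs_wtd = [10.0, 15.0, 20.0, 25.0]   # monitored depth (cm)
a, b = fit_offset(ta_wtd, obs_wtd)
print(apply_offset(a, b, [40.0]))    # corrected value for an older sample
```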

  12. IntelliTable: Inclusively-Designed Furniture with Robotic Capabilities.

    PubMed

    Prescott, Tony J; Conran, Sebastian; Mitchinson, Ben; Cudd, Peter

    2017-01-01

    IntelliTable is a new proof-of-principle assistive technology system with robotic capabilities in the form of an elegant universal cantilever table able to move around by itself, or under user control. We describe the design and current capabilities of the table and the human-centered design methodology used in its development and initial evaluation. The IntelliTable study has delivered a robotic platform programmed by a smartphone that can navigate around a typical home or care environment, avoiding obstacles, and positioning itself at the user's command. It can also be configured to navigate itself to pre-ordained positions within an environment using ceiling tracking, responsive optical guidance and object-based sonar navigation.

  13. Analysis of the Articulated Total Body (ATB) and Mathematical Dynamics Model (MADYMO) Software Suites for Modeling Anthropomorphic Test Devices (ATDs) in Blast Environments

    DTIC Science & Technology

    2013-05-01

    REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 Public reporting burden for this collection of information is estimated to average 1 ...valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS. 1 . REPORT DATE (DD-MM-YYYY) May 2013 2. REPORT TYPE Final 3. DATES...area code) (410) 278-7386 Standard Form 298 (Rev. 8/98) Prescribed by ANSI Std. Z39.18 iii Contents List of Figures v List of Tables v 1

  14. Environmental Assessment for Lowering Base Altitude of Military Operations Areas

    DTIC Science & Technology

    2010-11-01

    collection of information if it does not display a currently valid OMB control number. 1. REPORT DATE NOV 2010 2. REPORT TYPE 3. DATES COVERED 00-00...single T-38C overflight at 8,700 feet AGL (i.e., the current base altitude of the MOAs) is 63 dBA, while the noise from the aircraft at 6,700 feet AGL... Current Bird Census for Salt Plains National Wildlife Refuge: May 6, 2010 ....... 3-24  Table 3-13 Washita Wildlife Refuge 2008 Christmas Bird Count

  15. Unique Genomic Alterations in Prostate Cancers in African American Men

    DTIC Science & Technology

    2015-12-01

    with MNX1 mRNA in AA PCa by Pearson Product Moment. Correlation coefficient and p-value are shown. 13 Subtask 6: Validation of key gene...1 SF 298……………………………………………………………………………..……2 Table of Contents…………………………………………………………..…….... 3 Introduction ...

  16. Randolph AFB, San Antonio, Texas. Revised Uniform Summary of Surface Weather Observations (RUSSWO). Parts A-F.

    DTIC Science & Technology

    1982-02-08

    is printed in any year-month block when the extreme value Is based on an in- complete month (at least one day missing for the month). When a month has...means, standard deviations, and total number of valid observations for each month and annual (all months). An asterisk (*) is printed n each data block...becomes the extreme or monthly total in any of these tables it is printed as "TRACE." Continued on Reverse Side Values ’or means and standard

  17. Characterization and Modeling of High Power Microwave Effects in CMOS Microelectronics

    DTIC Science & Technology

    2010-01-01

    margin measurement 28 Any voltage above the line marked VIH is considered a valid logic high on the input of the gate. VIH and VIL are defined...can handle any voltage noise level at the input up to VIL without changing state. The region in between VIL and VIH is considered an invalid logic...29 Table 2.2: Intrinsic device characteristics derived from SPECTRE simulations   VIH  (V)  VIL (V)  High Noise Margin  (V)  Low Noise Margin (V

  18. Conference Proceedings on Validation of Computational Fluid Dynamics. Volume 1. Symposium Papers and Round Table Discussion Held in Lisbon, Portugal on 2-5 May 1988

    DTIC Science & Technology

    1988-05-01

    the representation of the shock, the non -conservative difference scheme in the original method being replaced by a ’ quasi - conservative’ operator 3...domain. In order to simulate the experimentally observed pressure distribution at the exit a formulation of the non -reflecting pressure condition Is used...and Experimental Aero- dynamics: Wing Surface Generator Code, Control Surface and Boundary Conditions". DFVLR IB 221-87 A 01, 1987. [11] Kordulla, W.(ed

  19. Validation of Biomarkers Predictive of Recurrence Following Prostatectomy

    DTIC Science & Technology

    2012-04-01

    NOTCH3 , ETV1, BID, SIM2, ANXA1, miR-519d, and miR-647) that could be used to separate patients with and without biochemical recurrence (p < 0.001), as...cases with known outcome. The characteristics of these cases are given in Table 1. Since five of our candidate biomarkers (RAD23B, SIM2s, Notch3 ...TMA sections were stained with antibodies to RAD23B, SIM2S, Notch3 , BID, and FBP1. Intensity of the various immunohistochemical stains were scored

  20. Analysis of the Army’s Installation Support Modules with the Private Sector’s Open Information Systems

    DTIC Science & Technology

    1993-04-09

    according to Bryman, it is possible to use a small sample and still maintain a high degree of validity as long as the sampling size represents a ...Analysis of the Army’s 48 References Alperin, Jeffry A. & St. Germain, Robert A., Jr. (1989). Open Systems: Ready for Lift-Off? Best Review, 90, 82. Bryman ...Analysis of the Army’s iii TABLE OF CONTENTS Title Page

  1. Objective Analysis of Oceanic Data for Coast Guard Trajectory Models Phase II

    DTIC Science & Technology

    1997-12-01

    as outliers depends on the desired probability of false alarm, Pfa values, which is the probability of marking a valid point as an outlier. Table 2-2...constructed to minimize the mean-squared prediction error of the grid point estimate under the constraint that the estimate is unbiased . The...prediction error, e= Zl(S) _oizl(Si)+oC1iZz(S) (2.44) subject to the constraints of unbiasedness , • c/1 = 1,and (2.45) i SCC12 = 0. (2.46) Denoting

  2. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the photon energy distribution of emitted photons and dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emissions of x rays from the tube to the table with attenuation of photons through a beam-shaping filter for each MDCT device based on the experiment results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified with previously reported results by realistic NUBAS phantoms and radiation dose measurement using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable estimates of dose for MDCT devices for typical Japanese adults.

  3. Current Pressure Transducer Application of Model-based Prognostics Using Steady State Conditions

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.

    2014-01-01

    Prognostics is the process of predicting a system's future states, health degradation/wear, and remaining useful life (RUL). This information plays an important role in preventing failure, reducing downtime, scheduling maintenance, and improving system utility. Prognostics relies heavily on wear estimation. In some components, the sensors used to estimate wear may not be fast enough to capture brief transient states that are indicative of wear. For this reason it is beneficial to be capable of detecting and estimating the extent of component wear using steady-state measurements. This paper details a method for estimating component wear using steady-state measurements, describes how this is used to predict future states, and presents a case study of a current/pressure (I/P) Transducer. I/P Transducer nominal and off-nominal behaviors are characterized using a physics-based model, and validated against expected and observed component behavior. This model is used to map observed steady-state responses to corresponding fault parameter values in the form of a lookup table. This method was chosen because of its fast, efficient nature, and its ability to be applied to both linear and non-linear systems. Using measurements of the steady state output, and the lookup table, wear is estimated. A regression is used to estimate the wear propagation parameter and characterize the damage progression function, which are used to predict future states and the remaining useful life of the system.
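The lookup-table wear estimation and wear-propagation regression described above can be sketched as follows. The fault-parameter values and steady-state outputs are hypothetical; none of these numbers come from the I/P transducer case study.

```python
# Sketch of the lookup-table approach: observed steady-state outputs are
# mapped to fault-parameter values, then a regression over the estimated
# wear history gives the wear-propagation rate used for prediction.
# All numbers are hypothetical, not from the I/P transducer case study.

def estimate_fault(lookup, steady_output):
    """Pick the fault-parameter value whose tabulated steady-state
    response is closest to the measured one (nearest-neighbour lookup)."""
    return min(lookup, key=lambda fault: abs(lookup[fault] - steady_output))

def wear_rate(times, wear):
    """Least-squares slope of wear vs. time: the wear-propagation parameter."""
    n = len(times)
    mt, mw = sum(times) / n, sum(wear) / n
    sxy = sum((t - mt) * (w - mw) for t, w in zip(times, wear))
    sxx = sum((t - mt) ** 2 for t in times)
    return sxy / sxx

# Hypothetical table: fault parameter (wear) -> steady-state output (psi).
table = {0.0: 50.0, 0.1: 48.0, 0.2: 46.0, 0.3: 44.0}

# Estimate wear from three successive steady-state measurements, then fit
# the propagation rate; the rate extrapolates wear to predict future states.
wear_history = [estimate_fault(table, y) for y in (49.9, 47.8, 46.1)]
rate = wear_rate([0.0, 1.0, 2.0], wear_history)
print(wear_history, rate)
```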

  4. Global Estimates of Dietary Intake of Docosahexaenoic Acid and Arachidonic Acid in Developing and Developed Countries.

    PubMed

    Forsyth, Stewart; Gautier, Sheila; Salem, Norman

    2016-01-01

    For international recommendations on docosahexaenoic acid (DHA) and arachidonic acid (ARA) dietary intake to be valid, there needs to be a greater understanding of dietary patterns across both the developed and developing world. The aim of this investigation was to provide a global overview of dietary intake of DHA and ARA. Food balance sheets from the Food and Agriculture Organisation Statistics Division and fatty acid composition data from Australian food composition tables in Nutrient Tables 2010 were utilised to generate median per capita intake estimates for DHA and ARA in 175 countries worldwide. Estimated dietary intake per capita for DHA and ARA in 47 developed and 128 developing countries demonstrated that 48% of the 175 countries have an ARA intake of <150 mg/day and 64% have a dietary DHA intake of <200 mg/day. There was a direct relationship between dietary ARA and DHA intake and the per capita gross national income of the country. Regional analysis showed the lowest ARA and DHA dietary intake in Sub-Saharan Africa and Central and Southern Asian populations. This study demonstrates there are many populations worldwide that have ARA and DHA intake that do not reflect current international recommendations, and the public health consequences of this global inadequacy need to be urgently considered. © 2016 S. Karger AG, Basel.

  5. Guide to Academic Research Career Development

    PubMed Central

    Smith, Richard J.; Graboyes, Evan M.; Paniello, Randal C.; Paul Gubbels, Samuel

    2016-01-01

    Objectives/Hypothesis Development of an academic career easily follows a clinical course for which there are multiple role models; however, development of an academic research career involves few role models, and rarely do instructional guides reach out to the new faculty. The purpose of this article is to present the cumulative experiences of previously and currently funded authors to serve as a guide to young as well as older faculty for developing their research careers. Study Design Cumulative experiences of research‐dedicated faculty. Methods This article is the result of lessons learned from developing a Triological Society National Physician‐Scientist Program and Network, as well as the cumulative experiences of the authors. Results Table I illustrates key elements in developing a serious research career. Table II records the career courses of five surgeon‐scientists, highlighting the continued theme focus with theme‐specific publications and progressive grants. These cumulative experiences have face validity but have not been objectively tested. The value added is a composite of 50 years of experiences from authors committed to research career development for themselves and others. Conclusion Crucial elements in developing a research career are a desire for and commitment to high‐quality research, a focus on an overall theme of progressive hypothesis‐driven investigations, research guidance, a willingness to spend the time required, and an ability to learn from and withstand failure. Level of Evidence 5. PMID:28894799

  6. Simple SNP-based minimal marker genotyping for Humulus lupulus L. identification and variety validation.

    PubMed

    Henning, John A; Coggins, Jamie; Peterson, Matthew

    2015-10-06

    Hop is an economically important crop for the Pacific Northwest USA as well as other regions of the world. It is a perennial crop with a rhizomatous or clonal propagation system for varietal distribution. A big concern for growers as well as brewers is variety purity, and questions are regularly posed to public agencies concerning the availability of genotype testing. Current means for genotyping are based upon 25 microsatellites that provide relatively accurate genotyping but cannot always differentiate sister-lines. In addition, numerous PCR runs (25) are required to complete this process, and only a few laboratories exist that perform this service. A genotyping protocol based upon SNPs would enable rapid, accurate genotyping that can be assayed at any laboratory facility set up for SNP-based genotyping. The results of this study arose from a larger project designed for whole genome association studies upon the USDA-ARS hop germplasm collection consisting of approximately 116 distinct hop varieties and germplasm (female lines) from around the world. The original dataset that arose from partial sequencing of 121 genotypes resulted in the identification of 374,829 SNPs using the TASSEL-UNEAK pipeline. After filtering out genotypes with more than 50% missing data (5 genotypes) and SNP markers with more than 20% missing data, 32,206 highly filtered SNP markers across 116 genotypes were identified and considered for this study. Minor allele frequency (MAF) was calculated for each SNP and ranked from most informative to least informative. Only those markers without missing data across genotypes as well as 60% or less heterozygous gamete calls were considered for further analysis. Genetic distances among individuals in the study were calculated using the marker with the highest MAF value, then by using a combination of the two markers with highest MAF values and so on. 
    This process was reiterated until a set of markers was identified that allowed for all genotypes in the study to be genetically differentiated from each other. Next, we compared genetic matrices calculated from the minimal marker sets (Table 2; 6-, 7-, 8-, 10- and 12-marker set matrices) with that of a matrix calculated from a set of markers with no missing data across all 116 samples (1006 SNP markers). The minimum number of markers required to meet both specifications was a set of 7 markers (Table 3). These seven SNPs were then aligned with a genome assembly, and DNA sequences both upstream and downstream were used to identify primer sequences that can be used to develop seven amplicons for high resolution melting curve PCR detection or other SNP-based PCR detection methods. This study identifies a set of 7 SNP markers that may prove useful for the identification and validation of hop varieties and accessions. Variety validation of unknown samples assumes that the variety under question has been included a priori in a discovery panel. These results are based upon in silico studies, and the markers need to be validated using different SNP marker technology upon a differential set of hop genotypes. The marker sequence data and suggested primer sets provide potential means to fingerprint hop varieties in most genetic laboratories utilizing SNP-marker technology.
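The iterative marker selection described above can be sketched as a greedy loop over MAF-ranked markers. This is a simplified illustration with toy genotype calls, not the study's hop SNP data or pipeline code.

```python
# Greedy sketch of minimal-marker selection: markers are taken in order of
# decreasing minor allele frequency (MAF) until every genotype has a unique
# marker fingerprint. Genotype calls here are toy values, not hop SNPs.

def minimal_marker_set(genotypes, markers_by_maf):
    """genotypes: dict name -> dict marker -> allele call ('A', 'B', ...).
    markers_by_maf: marker names sorted from highest to lowest MAF.
    Returns the shortest MAF-ranked prefix that separates all genotypes."""
    chosen = []
    for marker in markers_by_maf:
        chosen.append(marker)
        fingerprints = {tuple(g[m] for m in chosen)
                        for g in genotypes.values()}
        if len(fingerprints) == len(genotypes):   # all lines distinguishable
            return chosen
    raise ValueError("markers cannot separate all genotypes")

toy = {
    "var1": {"snp1": "A", "snp2": "A", "snp3": "B"},
    "var2": {"snp1": "A", "snp2": "B", "snp3": "B"},
    "var3": {"snp1": "B", "snp2": "A", "snp3": "A"},
}
# snp1 alone leaves var1/var2 identical; adding snp2 separates all three.
print(minimal_marker_set(toy, ["snp1", "snp2", "snp3"]))
```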

  7. The Physiological Demands of Table Tennis: A Review

    PubMed Central

    Kondrič, Miran; Zagatto, Alessandro Moura; Sekulić, Damir

    2013-01-01

    Although table tennis has a tradition lasting more than 100 years, relatively little is known about players’ physiological requirements – especially during competition. In this review we discuss research studies that have led to our current understanding of how the body functions during table tennis training and competition and how this is altered by training. Match and practice analysis of the table tennis game indicates that during intense practice and competition it is predominantly the anaerobic alactic system that is called into play, while the endurance system is relied on to recover the anaerobic stores used during such effort. It is thus important for coaches to keep in mind that, while the anaerobic alactic system is the most energetic system used during periods of exertion in a table tennis game, a strong capacity for endurance is what helps a player recover quicker for the following match and the next day of competition. This paper provides a review of specific studies that relate to competitive table tennis, and highlights the need for training and research programs tailored to table tennis. Key Points Match and practice analysis of the table tennis game indicates that during intense practice and competition it is predominantly the anaerobic alactic system that is called into play. The endurance system is relied on to recover the anaerobic stores used during hard practice and competition effort. It is important for coaches to keep in mind that, while the anaerobic alactic system is the most energetic system used during periods of exertion in a table tennis game, a strong capacity for endurance is what helps a player recover quicker for the following match and the next day of competition. PMID:24149139

  8. Online Periodic Table: A Cautionary Note

    NASA Astrophysics Data System (ADS)

    Izci, Kemal; Barrow, Lloyd H.; Thornhill, Erica

    2013-08-01

    The purpose of this study was (a) to evaluate ten online periodic table sources for their accuracy and (b) to compare the types of information and links provided to users. Limited studies have been reported on online periodic tables (Diener and Moore 2011; Slocum and Moore in J Chem Educ 86(10):1167, 2009). Chemistry students' understanding of the periodic table is vital for their success in chemistry, and the online periodic table has the potential to advance learners' understanding of chemical elements and fundamental chemistry concepts (Brito et al. in J Res Sci Teach 42(1):84-111, 2005). The ten sites were compared for accuracy of data with the Handbook of Chemistry and Physics (HCP, Haynes in CRC handbook of chemistry and physics: a ready-reference book of chemical and physical data. CRC Press, Boca Raton 2012). The 10 sites are the most visited periodic table Web sites available. Four different elements, carbon, gold, argon, and plutonium, were selected for comparison, and 11 different attributes for each element were identified for evaluating accuracy. A wide variation of accuracy was found among the 10 periodic table sources. Chemicool was the most accurate information provider with 66.67 % accuracy when compared to the HCP. The 22 types of information provided by these sites, including meaning of name and use in industry and society, were also compared. WebElements, "Chemicool", "Periodic Table Live", and "the Photographic Periodic Table of the Elements" were the most comprehensive information providers, covering 86.36 % of the information types among the 10 Web sites. "WebElements" provides the most links among the 10 Web sites. It was concluded that if an individual teacher or student desires only raw physical data for an element, the Internet might not be the best choice.
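The accuracy comparison described in this record can be sketched as a simple match count against a reference, scored as a percentage. The data below are toy stand-ins, not values from the study or the CRC handbook tables it used.

```python
# Sketch of the accuracy comparison: each site's (element, attribute)
# entries are checked against a reference handbook and scored as the
# percentage of matching entries. Values here are toy stand-ins.

def accuracy(site_data, reference):
    """Percent of (element, attribute) entries matching the reference."""
    matches = sum(1 for key, value in reference.items()
                  if site_data.get(key) == value)
    return 100.0 * matches / len(reference)

# Toy reference: (element, attribute) -> value (units implied per attribute).
reference = {
    ("carbon", "atomic mass"): 12.011,
    ("gold", "density"): 19.3,
    ("argon", "boiling point"): -185.8,
    ("plutonium", "melting point"): 639.4,
}
site = dict(reference)
site[("gold", "density")] = 19.0   # one discrepant entry
print(accuracy(site, reference))   # 3 of 4 entries agree -> 75.0
```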

  9. Natural Resources Research Program: Summary of the 1986-87 Campground Receipt Study

    DTIC Science & Technology

    1991-03-01

    can sum to more than 100 percent because parties can use multiple pieces of equipment. B16 Table BI5 1987 Shenango Lake (Shenango Rec Area) User...table represent actual camping use. ** Percent of camping parties. t Percent of camping permits. B17 Table Bl6 1987 Shenango Lake (Shenango Rec Area

  10. Risk Factors for Sexual Violence in the Military: An Analysis of Sexual Assault and Sexual Harassment Incidents and Reporting

    DTIC Science & Technology

    2017-03-01

    53 ix LIST OF TABLES Table 1. Descriptive Statistics for Control Variables by... Statistics for Control Variables by Gender (Random Subsample with Complete Survey) ............................................................30 Table...empirical analysis. Chapter IV describes the summary statistics and results. Finally, Chapter V offers concluding thoughts, study limitations, and

  11. Beginning Subbaccalaureate Students' Labor Market Experiences: Six Years Later in 2009. Web Tables. NCES 2012-273

    ERIC Educational Resources Information Center

    Ifill, Nicole; Radford, Alexandria Walton

    2012-01-01

    This set of Web Tables presents descriptive statistics on the spring 2009 labor market experiences of subbaccalaureate students who first entered postsecondary education in 2003-04. The Web Tables use data from the nationally representative 2004/09 Beginning Post-secondary Students Longitudinal Study (BPS:04/09), which followed a cohort of…

  12. A Process Improvement Study on a Military System of Clinics to Manage Patient Demand and Resource Utilization Using Discrete-Event Simulation, Sensitivity Analysis, and Cost-Benefit Analysis

    DTIC Science & Technology

    2015-03-12

    26 Table 3: Optometry Clinic Frequency Count... Optometry Clinic Frequency Count.................................................................. 86 Table 22: Probability Distribution Summary Table...Clinic, the Audiology Clinic, and the Optometry Clinic. Methodology Overview The overarching research goal is to identify feasible solutions to

  13. Voice-Related Patient-Reported Outcome Measures: A Systematic Review of Instrument Development and Validation

    PubMed Central

    Daniero, James J.; Hovis, Kristen L.; Sathe, Nila; Jacobson, Barbara; Penson, David F.; Feurer, Irene D.; McPheeters, Melissa L.

    2017-01-01

    Purpose The purpose of this study was to perform a comprehensive systematic review of the literature on voice-related patient-reported outcome (PRO) measures in adults and to evaluate each instrument for the presence of important measurement properties. Method MEDLINE, the Cumulative Index of Nursing and Allied Health Literature, and the Health and Psychosocial Instrument databases were searched using relevant vocabulary terms and key terms related to PRO measures and voice. Inclusion and exclusion criteria were developed in consultation with an expert panel. Three independent investigators assessed study methodology using criteria developed a priori. Measurement properties were examined and entered into evidence tables. Results A total of 3,744 studies assessing voice-related constructs were identified. This list was narrowed to 32 PRO measures on the basis of predetermined inclusion and exclusion criteria. Questionnaire measurement properties varied widely. Important thematic deficiencies were apparent: (a) lack of patient involvement in the item development process, (b) lack of robust construct validity, and (c) lack of clear interpretability and scaling. Conclusions PRO measures are a principal means of evaluating treatment effectiveness in voice-related conditions. Despite their prominence, available PRO measures have disparate methodological rigor. Care must be taken to understand the psychometric and measurement properties and the applicability of PRO measures before advocating for their use in clinical or research applications. PMID:28030869

  14. Suggested Format for Acute Toxicity Studies

    EPA Pesticide Factsheets

    This document suggests the format for final reports on pesticide studies (right column of the tables in the document) and provides instructions for the creation of PDF Version 1.3 electronic submission documents (left column of the tables).

  15. Validity of Self-reported Sleep Bruxism among Myofascial Temporomandibular Disorder Patients and Controls

    PubMed Central

    Raphael, Karen G.; Janal, Malvin N.; Sirois, David A.; Dubrovsky, Boris; Klausner, Jack J.; Krieger, Ana C.; Lavigne, Gilles J.

    2015-01-01

    Sleep bruxism (SB), primarily involving rhythmic grinding of the teeth during sleep, has been advanced as a causal or maintenance factor for a variety of orofacial problems, including temporomandibular disorders (TMD). Since laboratory polysomnographic (PSG) assessment is extremely expensive and time-consuming, most research testing this belief has relied on patient self-report of SB. The current case-control study examined the accuracy of those self-reports relative to laboratory-based PSG assessment of SB in a large sample of women suffering from chronic myofascial TMD (n=124) and a demographically matched control group without TMD (n=46). A clinical research coordinator administered a structured questionnaire to assess self-reported SB. Participants then spent two consecutive nights in a sleep laboratory. Audiovisual and electromyographic data from the second night were scored to assess whether participants met criteria for presence of 2 or more (2+) rhythmic masticatory muscle activity episodes accompanied by grinding sounds, moderate SB, or severe SB, using previously validated research scoring standards. Contingency tables were constructed to assess positive and negative predictive values, sensitivity and specificity, and 95% confidence intervals surrounding the point estimates. Results showed that self-report significantly predicted 2+ grinding sounds during sleep for TMD cases. However, self-reported SB failed to significantly predict presence or absence of either moderate or severe SB as assessed by PSG, for both cases and controls. These data show that self-report of tooth grinding awareness is highly unlikely to be a valid indicator of true SB. Studies relying on self-report to assess SB must be viewed with extreme caution. PMID:26010126

  16. Validity of early parathyroid hormone assay as a diagnostic tool for sub-total thyroidectomy related hypocalcaemia.

    PubMed

    Riaz, Umbreen; Shah, Syed Aslam; Zahoor, Imran; Riaz, Arsalan; Zubair, Muhammad

    2014-07-01

    To determine the validity of an early (one hour postoperative) parathyroid hormone (PTH) assay (≤10 pg/ml), with the serum ionic calcium level as the gold standard, for predicting sub-total thyroidectomy-related hypocalcaemia, and to calculate the sensitivity and specificity of latent signs of tetany. Cross-sectional validation study. Department of General Surgery, Pakistan Institute of Medical Sciences, Islamabad, from August 2008 to August 2010. Patients undergoing sub-total thyroidectomy were included by convenience sampling. The PTH assay was performed 1 hour after sub-total thyroidectomy. Serum calcium levels were measured at 24 and 48 hours, on the 5th day, and 2 weeks after surgery. Cases that developed hypocalcaemia were followed up for 6 months with monthly calcium level estimation to identify cases of permanent hypocalcaemia. Symptoms and signs of hypocalcaemia manifesting in the patients were recorded. Data were analyzed with SPSS version 10; 2 × 2 tables were used to calculate the sensitivity and specificity of PTH in detecting post-thyroidectomy hypocalcaemia. Of the 110 patients included in the study, 16.36% (n=18) developed hypocalcaemia, including 1.81% (n=2) cases of permanent hypoparathyroidism. The sensitivity of the one-hour postoperative PTH assay as a predictive tool for post-thyroidectomy hypocalcaemia was 94.4% and its specificity was 83.6%, with a 53% positive predictive value and a 98.7% negative predictive value. A PTH assay one hour after sub-total thyroidectomy can be helpful in predicting post sub-total thyroidectomy hypocalcaemia. Moreover, it can be useful for the safe discharge of day-care thyroidectomy patients.
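The accuracy figures above follow from a standard 2 × 2 contingency table. A minimal sketch (the cell counts below are a hypothetical reconstruction chosen to be consistent with the reported percentages; the paper's exact counts are not given in the abstract):

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard accuracy measures from a 2 x 2 diagnostic contingency table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among the diseased
        "specificity": tn / (tn + fp),   # true negatives among the healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts consistent with the reported figures (18 of 110
# hypocalcaemic; sensitivity 94.4%, specificity ~84%, PPV 53%, NPV 98.7%).
m = diagnostic_metrics(tp=17, fn=1, fp=15, tn=77)
```

Note that with a low-prevalence outcome (18/110 here), even good sensitivity and specificity yield a modest PPV, which is why the abstract's 53% PPV coexists with a 98.7% NPV.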

  17. The diagnostic efficiency of the extended German Brøset Violence Checklist to assess the risk of violence.

    PubMed

    Rechenmacher, Josef; Müller, Gerhard; Abderhalden, Christoph; Schulc, Eva

    2014-01-01

    Preventing aggression and violence by patients is part of the challenge of psychiatric inpatient care. This requires systematic risk assessment and preventive measures matched to the assessed risk. The extended Brøset Violence Checklist (BVC-CH) is an instrument for the short-term assessment of the risk of physical attacks toward medical staff and other patients. Until now, the instrument had only been validated in the context of its development phase. The aim of this study was to investigate how valid the BVC-CH scale is for adult psychiatry in acute inpatient care facilities. In a prospective cohort study, 232 consecutively admitted patients were assessed using the BVC-CH. The calculation of the predictive values was based on a contingency table. The discriminatory power of the instrument and the determination of the cutoff point were assessed using receiver operating characteristic (ROC) curve analysis. Physical attacks were registered with the Staff Observation of Aggression Scale-Revised (SOAS-R). At a cutoff point of ≥7, the sensitivity was 58.8% and the specificity was 96.8%; at a cutoff point of ≥6, the sensitivity was 64.7% and the specificity was 95.1%. The area under the ROC curve (AUC) was 0.93. Overall, the BVC-CH is a valid instrument for the short-term prediction of physical attacks. Further research on the BVC-CH is recommended, in particular regarding the cutoff point.
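The cutoff comparison described above is typically made by scanning candidate cutoffs along the ROC curve. One common selection rule, used here purely as an illustration (the study does not state its criterion), is to maximise Youden's J; the scores and outcomes below are toy data, not the study's:

```python
def best_cutoff(scores, labels):
    """Scan every observed score as a candidate cutoff (score >= c flags
    'at risk') and return the one maximising Youden's J = sens + spec - 1."""
    best_c, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y)
        fn = sum(1 for s, y in zip(scores, labels) if s < c and y)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and not y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= c and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec - 1 > best_j:
            best_c, best_j = c, sens + spec - 1
    return best_c, best_j

# Toy checklist sum-scores with violent (True) / non-violent (False) outcomes.
scores = [1, 2, 2, 3, 5, 6, 6, 7, 8, 9]
labels = [False, False, False, False, False, True, False, True, True, True]
cutoff, j = best_cutoff(scores, labels)
```

As in the abstract, neighbouring cutoffs trade sensitivity against specificity, which is why the choice between ≥6 and ≥7 merits further study.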

  18. Using connected objects in clinical research.

    PubMed

    Dhainaut, Jean-François; Huot, Laure; Pomar, Valérie Bouchara; Dubray, Claude

    2018-02-01

    Connected objects (CO), whether medical devices or not, are used in clinical research for data collection, a specific activity (communication, diagnosis, effector, etc.), or several functions combined. Their validation should be based on three approaches: technical and clinical reliability, data protection, and cybersecurity. Consequently, the round table recommends that the typology of COs, their uses and limitations, be known and shared by all, particularly for implementing precise specifications. COs are used in clinical research during observational studies (assessment of the device itself or data collection), randomized studies where only one group has a CO (assessment of its impact on patient follow-up or management), or randomized studies where both groups have a CO, which is then used as a tool to help with assessment. The benefits of using COs in clinical research include improved collection and quality of data, improved patient compliance and pharmacovigilance, easier implementation of e-cohorts, and a more representative balance of patients. The societal limits and risks identified relate to the sometimes intrusive nature of certain collected parameters and the possible misuse of data. On this last point, the round table recommends anticipation: securing transmission methods, qualifying data hosts, and assessing the object's vulnerability. For this, a risk analysis appears necessary for each project. It is also necessary to accurately document the data flow, in order to inform both patients and healthcare professionals and to ensure adequate security. Anticipating regulatory changes and involving users starting from the study design stage are also recommended. Copyright © 2018 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.

  19. Applying the transtheoretical model to tobacco cessation and prevention: a review of literature.

    PubMed

    Spencer, Leslie; Pagell, Francie; Hallion, Maria Elena; Adams, Troy B

    2002-01-01

    To comprehensively review all published, peer-reviewed research on the Transtheoretical Model (TTM) and tobacco cessation and prevention by exploring the validity of its constructs, the evidence for use of interventions based on the TTM, the description of populations using TTM constructs, and the identification of areas for further research. The three research questions answered were: "How is the validity of the TTM as applied to tobacco supported by research?" "How does the TTM describe special populations regarding tobacco use?" "What is the nature of evidence supporting the use of stage-matched tobacco interventions?" Computer Database search (PsychInfo, Medline, Current Contents, ERIC, CINAHL-Allied Health, and Pro-Quest Nursing) and manual journal search. INCLUSION/EXCLUSION CRITERIA: All English, original, research articles on the TTM as it relates to tobacco use published in peer-reviewed journals prior to March 1, 2001, were included. Commentaries, editorials, and books were not included. Articles were categorized as TTM construct validation, population descriptions using TTM constructs, or intervention evaluation using TTM constructs. Summary tables including study design, research rating, purpose, methods, findings, and implications were created. Articles were further divided into groups according to their purpose. Considering both the findings and research quality of each, the three research questions were addressed. The 148 articles reviewed included 54 validation studies, 73 population studies, and 37 interventions (some articles fit two categories). Overall, the evidence in support of the TTM as applied to tobacco use was strong, with supportive studies being more numerous and of a better design than nonsupportive studies. Using established criteria, we rated the construct validity of the entire body of literature as good; however, notable concerns exist about the staging construct. 
A majority of stage-matched intervention studies provided positive results and were of a better quality than those not supportive of stage-matched interventions; thus, we rated the body of literature using stage-matched tobacco interventions as acceptable and the body of literature using non-stage-matched interventions as suggestive. Population studies indicated that TTM constructs are applicable to a wide variety of general and special populations both in and outside of the United States, although a few exceptions exist. Evidence for the validity of the TTM as it applies to tobacco use is strong and growing; however, it is not conclusive. Eight different staging mechanisms were identified, raising the question of which are most valid and reliable. Interventions tailored to a smoker's stage were successful more often than nontailored interventions in promoting forward stage movement. Stage distribution is well-documented for U.S. populations; however, more research is needed for non-U.S. populations, for special populations, and on other TTM constructs.

  20. A revision of the genus Muricea Lamouroux, 1821 (Anthozoa, Octocorallia) in the eastern Pacific. Part II

    PubMed Central

    Breedy, Odalisca; Guzman, Hector M.

    2016-01-01

    The species of the genus Muricea were mainly described from 1846 to 1870; after that, very few contributions were published. Although the highest richness of Muricea species is in the eastern Pacific shallow waters, a comprehensive systematic study of the genus does not exist. Recently we started a taxonomic review of the genus in order to validate the status of four species previously included in the genus Eumuricea. Herein we present the second part of the Muricea revision, dealing with the species group characterised by shelf-like calyces instead of tubular-like calyces (the Muricea squarrosa-group). Original type material was morphologically analysed and illustrated using optical and scanning electron microscopy. Comparative character tables are provided for the genus. The taxonomic status of each species was analysed and established by designating lectotypes or, alternatively, by recognising a holotype by monotypy. We conclude that the genus Muricea comprises 20 valid species, including the previous four in the Muricea squarrosa-group. We propose 10 lectotypes, a new combination and three more species groups for the genus based on morphology: the Muricea fruticosa-group, Muricea plantaginea-group and Muricea austera-group. PMID:27199581

  1. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part II. Homogeneous Lambertian and anisotropic surfaces.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F

    2007-07-10

    This is the second part of the validation effort of the recently developed vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), primarily used for the calculation of look-up tables in the Moderate Resolution Imaging Spectroradiometer (MODIS) atmospheric correction algorithm. The 6SV1 code was tested against a Monte Carlo code and Coulson's tabulated values for molecular and aerosol atmospheres bounded by different Lambertian and anisotropic surfaces. The code was also tested in scalar mode against the scalar code SHARM to resolve the previous 6S accuracy issues in the case of an anisotropic surface. All test cases were characterized by good agreement between the 6SV1 and the other codes: The overall relative error did not exceed 0.8%. The study also showed that ignoring the effects of radiation polarization in the atmosphere led to large errors in the simulated top-of-atmosphere reflectances: The maximum observed error was approximately 7.2% for both Lambertian and anisotropic surfaces.

  2. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part II. Homogeneous Lambertian and anisotropic surfaces

    NASA Astrophysics Data System (ADS)

    Kotchenova, Svetlana Y.; Vermote, Eric F.

    2007-07-01

    This is the second part of the validation effort of the recently developed vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), primarily used for the calculation of look-up tables in the Moderate Resolution Imaging Spectroradiometer (MODIS) atmospheric correction algorithm. The 6SV1 code was tested against a Monte Carlo code and Coulson's tabulated values for molecular and aerosol atmospheres bounded by different Lambertian and anisotropic surfaces. The code was also tested in scalar mode against the scalar code SHARM to resolve the previous 6S accuracy issues in the case of an anisotropic surface. All test cases were characterized by good agreement between the 6SV1 and the other codes: The overall relative error did not exceed 0.8%. The study also showed that ignoring the effects of radiation polarization in the atmosphere led to large errors in the simulated top-of-atmosphere reflectances: The maximum observed error was approximately 7.2% for both Lambertian and anisotropic surfaces.

  3. User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Coleman, Kayla; Hooper, Russell W.

    2016-10-04

    In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers. More specifically, the CASL VUQ Strategy [33] prescribes the use of Predictive Capability Maturity Model (PCMM) assessments [37]. PCMM is an expert elicitation tool designed to characterize and communicate completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. Exercising a computational model with the methods in Dakota will yield, in part, evidence for a predictive capability maturity model (PCMM) assessment. Table 1.1 summarizes some key predictive maturity related activities (see details in [33]), with examples of how Dakota fits in. This manual offers CASL partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.

  4. A Study to Determine The Best Way, Within Existing Constraints, to Minimize Lost Training Time Due to Recruit Sick Call at the Marine Corps Recruit Depot, Parris Island, South Carolina

    DTIC Science & Technology

    1988-07-01

    …Assistants. Chicago, IL: American Medical Association, 1985; Nichols, A.W., "Physician Extenders, the Law, and the Future," Journal of Family… List of Tables: Table 1, Distance of Subpopulations from Branch Medical Clinic, MCRD, Parris Island; Table 2, Total Recruit Sick Call, June 1987 to May 1988; Table 3, Number of Recruit Sick Call Visits Per Day at the Branch Medical Clinic, MCRD

  5. Decennial Life Tables for the White Population of the United States, 1790-1900.

    PubMed

    Hacker, J David

    2010-04-01

    This article constructs new life tables for the white population of the United States in each decade between 1790 and 1900. Drawing from several recent studies, it suggests best estimates of life expectancy at age 20 for each decade. These estimates are fitted to new standards derived from the 1900-02 rural and 1900-02 overall DRA life tables using a two-parameter logit model with fixed slope. The resulting decennial life tables more accurately represent sex-and age-specific mortality rates while capturing known mortality trends.
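The two-parameter logit model with fixed slope described above belongs to the family of Brass-style logit relational life-table models. A hedged sketch of the mechanics, with a hypothetical standard survivorship column (not the DRA standards the article actually uses):

```python
import math

def logit(p):
    """Half-logit used in Brass's relational system: Y = 0.5 * ln((1-p)/p)."""
    return 0.5 * math.log((1 - p) / p)

def expit(y):
    """Inverse of the half-logit above."""
    return 1 / (1 + math.exp(2 * y))

def fitted_survival(l_std, alpha, beta):
    """Logit relational model: Y(x) = alpha + beta * Y_std(x), applied to
    the survival column l(x) of a standard life table."""
    return [expit(alpha + beta * logit(l)) for l in l_std]

# Hypothetical standard survivorship at ages 20, 40, 60, 80.
l_std = [0.90, 0.80, 0.60, 0.25]
# alpha < 0 shifts the whole schedule toward lighter mortality; beta is the
# slope, which the article holds fixed so that a single estimate (e.g. of
# life expectancy at age 20) pins down the whole table.
l_fit = fitted_survival(l_std, alpha=-0.10, beta=1.0)
```

Holding the slope fixed is what lets one anchor estimate per decade determine the full age schedule of mortality.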

  6. Comparison between the standard and a new alternative format of the Summary-of-Findings tables in Cochrane review users: study protocol for a randomized controlled trial.

    PubMed

    Carrasco-Labra, Alonso; Brignardello-Petersen, Romina; Santesso, Nancy; Neumann, Ignacio; Mustafa, Reem A; Mbuagbaw, Lawrence; Ikobaltzeta, Itziar Etxeandia; De Stio, Catherine; McCullagh, Lauren J; Alonso-Coello, Pablo; Meerpohl, Joerg J; Vandvik, Per Olav; Brozek, Jan L; Akl, Elie A; Bossuyt, Patrick; Churchill, Rachel; Glenton, Claire; Rosenbaum, Sarah; Tugwell, Peter; Welch, Vivian; Guyatt, Gordon; Schünemann, Holger

    2015-04-16

    Systematic reviews represent one of the most important tools for knowledge translation but users often struggle with understanding and interpreting their results. GRADE Summary-of-Findings tables have been developed to display results of systematic reviews in a concise and transparent manner. The current format of the Summary-of-Findings tables for presenting risks and quality of evidence improves understanding and assists users with finding key information from the systematic review. However, it has been suggested that additional methods to present risks and display results in the Summary-of-Findings tables are needed. We will conduct a non-inferiority parallel-armed randomized controlled trial to determine whether an alternative format to present risks and display Summary-of-Findings tables is not inferior compared to the current standard format. We will measure participant understanding, accessibility of the information, satisfaction, and preference for both formats. We will invite systematic review users to participate (that is, clinicians, guideline developers, and researchers). The data collection process will be undertaken using the online 'Survey Monkey' system. For the primary outcome, understanding, non-inferiority of the alternative format (Table A) to the current standard format (Table C) of Summary-of-Findings tables will be claimed if the upper limit of a 1-sided 95% confidence interval (for the difference in the proportion of participants answering a given question correctly) excluded a difference in favor of the current format of more than 10%. This study represents an effort to provide systematic reviewers with additional options to display review results using Summary-of-Findings tables. In this way, review authors will have a variety of methods to present risks and more flexibility to choose the most appropriate table features to display (that is, optional columns, risk expressions, complementary methods to display continuous outcomes, and so on).
Trial registration: NCT02022631 (21 December 2013).
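The stated decision rule (the upper limit of a one-sided 95% CI for the difference in proportions must exclude a difference of more than 10% in favor of the current format) can be sketched with a normal approximation; the counts below are hypothetical and the trial's exact analysis may differ:

```python
import math

def noninferiority_upper_bound(p_std, n_std, p_alt, n_alt, z=1.645):
    """Upper limit of a one-sided 95% CI (normal approximation) for
    (p_std - p_alt): how much better the standard format might be."""
    diff = p_std - p_alt
    se = math.sqrt(p_std * (1 - p_std) / n_std + p_alt * (1 - p_alt) / n_alt)
    return diff + z * se

# Hypothetical results: 80% correct answers with the standard table (n=300)
# vs. 78% with the alternative (n=300); margin of 10 percentage points.
ub = noninferiority_upper_bound(0.80, 300, 0.78, 300)
non_inferior = ub < 0.10
```

With these illustrative numbers the upper bound is about 7.5 percentage points, so non-inferiority would be claimed; with small samples the same observed difference would fail, which is why the margin and sample size are fixed in advance.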

  7. A study on the influence of tides on the water table conditions of the shallow coastal aquifers

    NASA Astrophysics Data System (ADS)

    Singaraja, C.; Chidambaram, S.; Jacob, Noble

    2018-03-01

    Tidal variation and the water level in an aquifer are important features of the coastal environment, and this study attempts to find the relationship between water table fluctuation and tides in shallow coastal aquifers. The study was conducted by selecting three coastal sites and monitoring the water level at 2-h intervals over 24 h of observation, during full moon and new moon periods along the Cuddalore coastal region in the southern part of Tamil Nadu, India. The study shows the relationship between tidal variation, water table fluctuations, dissolved oxygen, and electrical conductivity. An attempt has also been made to approximate the rate of flow of water. However, the differences are site specific, and the angle of inclination of the water table shows a significant relation to the mean sea level, with respect to the distance of the point of observation from the sea and its elevation above mean sea level.

  8. Anthropometric evaluation and recommendation for primary schools classroom furniture design in Perlis

    NASA Astrophysics Data System (ADS)

    Shan, Lim Shaiu; Jing, Ewe Hui; Effendi, M. S. M.; Rosli, Muhamad Farizuan

    2017-09-01

    This study was carried out with the objective of obtaining anthropometric data for primary school children from Year 1 to Year 6 and evaluating the children's anthropometry against the current dimensions of classroom furniture (i.e. chair and table). In addition, this study proposed design dimensions for improved classroom furniture based on the children's anthropometric data. A total of 390 children selected from 13 primary schools in Perlis, Malaysia participated in this study. Eleven anthropometric measurements were taken: stature (St), popliteal height (PH), knee height (KH), thigh thickness (TT), buttock popliteal length (BPL), hip breadth (HB), sitting shoulder height (SSH), sitting elbow height (SEH), forearm-hand length (FHL), height of lumbar point (HLP) and buttock clearance (BC). In addition, 7 dimensions of the current classroom chair were measured: seat height (SH), seat depth (SD), seat width (SW), upper edge of backrest (UEB), lower edge of backrest (LEB), S point (SP) and overall chair height (OCH). Another 5 dimensions of the existing classroom table were measured: table height (TH), table depth (TD), table width (TW), under table height (UH) and seat to table clearance (STC). All measurements were performed using a metal measuring tape. The anthropometric data were analyzed with Microsoft Excel 2013, and several equations relating the anthropometric data to furniture dimensions were applied. The new design dimensions for classroom furniture proposed in this paper, based on the collected anthropometric data, can serve as a guideline for classroom furniture design. The implementation of these data may help to improve the comfort, safety, suitability and performance of children in the classroom.

  9. A review of dentists' use of digital radiography and caries diagnosis with digital systems.

    PubMed

    Wenzel, A

    2006-09-01

    To investigate the evidence for (1) dentists' use of digital radiography and (2) the outcome of caries diagnosis with digital systems. A literature search with the software search package PubMed was used to get internet-based access to Medline through the website www.ncbi.nlm.nih.gov/pubmed. The search was limited to the years 1999-2005 since most papers dealing with the diagnostic value of digital radiography systems published before 1999 will hold little interest for today's users due to changes in the systems. The search strategies resulted in 123 articles (Table 1, #4 and #5). Original research articles (not reviews) were selected by the following inclusion criteria: (1) questionnaire studies on the use of direct digital intraoral radiography systems (not digitized film), (2) studies which used human teeth and natural caries lesions, and further in laboratory studies, the sectioned tooth was the gold standard for validating the presence or depth of a lesion. The search resulted in 42 articles fulfilling the above criteria, which could be grouped into three types of studies: (a) questionnaire studies, (b) clinical (in vivo) studies, and (c) laboratory (in vitro) studies. Nine questionnaire studies, five clinical studies and 28 laboratory studies were found. These studies and their results are summarized in Tables 2-5. The number of studies was limited, and some of the digital systems were evaluated in only one or two studies. A conclusive judgment may therefore not be possible for the majority of the digital systems selected for this review. There is a continuous need for the evaluation of new digital intraoral radiography systems that appear on the market, first and foremost for their image quality and diagnostic accuracy, but certainly also for their performance in the clinic, a clear deficiency observed after the literature search for the present review.

  10. Automatic layout of structured hierarchical reports.

    PubMed

    Bakke, Eirik; Karger, David R; Miller, Robert C

    2013-12-01

    Domain-specific database applications tend to contain a sizable number of table-, form-, and report-style views that must each be designed and maintained by a software developer. A significant part of this job is the necessary tweaking of low-level presentation details such as label placements, text field dimensions, list or table styles, and so on. In this paper, we present a horizontally constrained layout management algorithm that automates the display of structured hierarchical data using the traditional visual idioms of hand-designed database UIs: tables, multi-column forms, and outline-style indented lists. We compare our system with pure outline and nested table layouts with respect to space efficiency and readability, the latter with an online user study on 27 subjects. Our layouts are 3.9 and 1.6 times more compact on average than outline layouts and horizontally unconstrained table layouts, respectively, and are as readable as table layouts even for large datasets.

  11. Studies, Summary Tables, and Data Related to the Advancing Sustainable Materials Management Report

    EPA Pesticide Factsheets

    This webpage provides further information about how EPA measures data for the annual Advancing Sustainable Materials Management Report. Researchers can use the tables and studies to better understand how waste is managed in America.

  12. Double hashing technique in closed hashing search process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra

    2017-09-01

    The search process is used in many activities, performed both online and offline, and many algorithms can be used to carry it out; one of them is the hash search algorithm. The search process in this study uses a double hashing technique, in which the data are arranged into tables of equal length and then searched. The results of this study indicate that searching with the double hashing technique is faster than the usual search techniques: dividing the values between a main table and an overflow table is expected to make the search faster than when the data are stacked in a single table, and data collisions can be avoided.
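A textbook form of the technique described above, sketched in Python (the abstract's main-table/overflow-table split is approximated here by a single open-addressed table, which is the standard presentation of double hashing):

```python
class DoubleHashTable:
    """Closed hashing (open addressing) with double hashing: the i-th probe
    for a key is (h1(key) + i * h2(key)) mod size, with h2 never zero."""

    def __init__(self, size=11):          # a prime size keeps probes well spread
        self.size = size
        self.slots = [None] * size

    def _h1(self, key):
        return hash(key) % self.size

    def _h2(self, key):
        return 1 + (hash(key) % (self.size - 1))

    def insert(self, key):
        h1, h2 = self._h1(key), self._h2(key)
        for i in range(self.size):
            j = (h1 + i * h2) % self.size
            if self.slots[j] is None or self.slots[j] == key:
                self.slots[j] = key
                return j                  # slot where the key ended up
        raise OverflowError("table full")

    def search(self, key):
        h1, h2 = self._h1(key), self._h2(key)
        for i in range(self.size):
            j = (h1 + i * h2) % self.size
            if self.slots[j] is None:
                return None               # reached an empty slot: absent
            if self.slots[j] == key:
                return j
        return None

# All four keys collide at h1(k) = k % 11 == 5, exercising the probe sequence.
table = DoubleHashTable()
slots = [table.insert(k) for k in [5, 16, 27, 38]]
```

Because the second hash varies per key, colliding keys follow different probe sequences, which is what avoids the clustering that slows linear probing.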

  13. Dynamics of water-table fluctuations in an upland between two prairie-pothole wetlands in North Dakota

    USGS Publications Warehouse

    Rosenberry, Donald O.; Winter, Thomas C.

    1997-01-01

    Data from a string of instrumented wells located on an upland of 55 m width between two wetlands in central North Dakota, USA, indicated frequent changes in water-table configuration following wet and dry periods during 5 years of investigation. A seasonal wetland is situated about 1.5 m higher than a nearby semipermanent wetland, suggesting an average ground water-table gradient of 0.02. However, water had the potential to flow as ground water from the upper to the lower wetland during only a few instances. A water-table trough adjacent to the lower semipermanent wetland was the most common water-table configuration during the first 4 years of the study, but it is likely that severe drought during those years contributed to the longevity and extent of the water-table trough. Water-table mounds that formed in response to rainfall events caused reversals of direction of flow that frequently modified the more dominant water-table trough during the severe drought. Rapid and large water-table rise to near land surface in response to intense rainfall was aided by the thick capillary fringe. One of the wettest summers on record ended the severe drought during the last year of the study, and caused a larger-scale water-table mound to form between the two wetlands. The mound was short in duration because it was overwhelmed by rising stage of the higher seasonal wetland which spilled into the lower wetland. Evapotranspiration was responsible for generating the water-table trough that formed between the two wetlands. Estimation of evapotranspiration based on diurnal fluctuations in wells yielded rates that averaged 3–5 mm day−1. On many occasions water levels in wells closer to the semipermanent wetland indicated a direction of flow that was different from the direction indicated by water levels in wells farther from the wetland. 
Misinterpretation of direction and magnitude of gradients between ground water and wetlands could result from poorly placed or too few observation wells, and also from infrequent measurement of water levels in wells.
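Estimating evapotranspiration from diurnal water-table fluctuations in wells is classically done with the White (1932) method. Whether the authors used exactly this variant is an assumption; the numbers below are purely illustrative:

```python
def white_method_et(specific_yield, night_rise_mm_per_h, net_fall_mm):
    """White (1932) method: ET = Sy * (24 * r + s), where r is the recharge
    rate inferred from the night-time water-table rise (when plants are not
    transpiring) and s is the net fall of the water table over 24 h."""
    return specific_yield * (24 * night_rise_mm_per_h + net_fall_mm)

# Illustrative numbers only: Sy = 0.2, night-time rise 0.5 mm/h, net daily
# fall 8 mm -> about 4 mm/day, within the 3-5 mm/day range reported above.
et = white_method_et(0.2, 0.5, 8.0)
```

The method works because recharge continues around the clock while transpiration is concentrated in daylight, so the night-time rise isolates the recharge term.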

  14. Sensitivity tests on the rates of the excited states of positron decays during the rapid proton capture process of the one-zone X-ray burst model

    NASA Astrophysics Data System (ADS)

    Lau, Rita

    2018-02-01

    In this paper, we investigate the sensitivities of positron decays on a one-zone model of type-I X-ray bursts. Most existing studies have multiplied or divided entire beta decay rates (electron captures and beta decay rates) by 10. Instead of using the standard Fuller & Fowler (FFNU) rates, we used the most recently developed weak library rates [1], which include rates from Langanke et al.'s table (the LMP table) (2000) [2], Langanke et al.'s table (the LMSH table) (2003) [3], and Oda et al.'s table (1994) [4] (all shell model rates). We then compared these table rates with the old FFNU rates [5] to study differences within the final abundances. Both positron decays and electron capture rates were included in the tables. We also used pn-QRPA rates [6,7] to study the differences within the final abundances. Many of the positron rates from the nuclei's ground states and initial excited energy states along the rapid proton capture (rp) process have been measured in existing studies. However, because temperature affects the rates of excited states, these studies should have also acknowledged the half-lives of the nuclei's excited states. Thus, instead of multiplying or dividing entire rates by 10, we studied how the half-lives of sensitive nuclei in excited states affected the abundances by dividing the half-lives of the ground states by 10, which allowed us to set the half-lives of the excited states. Interestingly, we found that the peak of the final abundance shifted when we modified the rates from the excited states of the 105Sn positron decay rates. Furthermore, the abundance of 80Zr also changed due to usage of pn-QRPA rates instead of weak library rates (the shell model rates).

  15. Primary School Children's Strategies in Solving Contingency Table Problems: The Role of Intuition and Inhibition

    ERIC Educational Resources Information Center

    Obersteiner, Andreas; Bernhard, Matthias; Reiss, Kristina

    2015-01-01

    Understanding contingency table analysis is a facet of mathematical competence in the domain of data and probability. Previous studies have shown that even young children are able to solve specific contingency table problems, but apply a variety of strategies that are actually invalid. The purpose of this paper is to describe primary school…

  16. Analysis of Modal Growth on the Leeward Centerplane of the X-51 Vehicle

    DTIC Science & Technology

    2009-09-01

Front-matter excerpt (no abstract available): Research Center (CUBRC), 4455 Genesee Street, Buffalo, NY 14225. Sponsoring/monitoring agency: Air Force Research... Figure 9. Disturbance N-factor Growth and CUBRC Data Showing Transition for CUBRC Run 4. List of Tables: Table 1. Freestream Conditions for Ground Test Cases Selected for Modal Analysis Study.

  17. Holt-Winters Forecasting: A Study of Practical Applications for Healthcare Managers

    DTIC Science & Technology

    2006-05-25

Front-matter excerpt (no abstract available): List of Tables: Table 1. Holt-Winters smoothing parameters and Mean Absolute Percentage Errors: Pseudoephedrine prescriptions; Table 2... confidence intervals. List of Figures: Figure 1. Line Plot of Pseudoephedrine Prescriptions forecast using smoothing parameters... The first represents monthly prescriptions of pseudoephedrine, a drug commonly prescribed to relieve nasal congestion and other...
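The additive Holt-Winters method this record applies to monthly prescription data can be sketched as follows. This is a minimal textbook-style implementation, not the study's code; the smoothing parameters and the demo series are hypothetical:

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters triple exponential smoothing.

    series: observations (len >= 2*m); m: season length (e.g. 12 for monthly data);
    alpha, beta, gamma: level/trend/seasonal smoothing parameters in [0, 1];
    horizon: number of steps to forecast ahead.
    """
    # Initial level: mean of the first season.
    level = sum(series[:m]) / m
    # Initial trend: average per-step change between the first two seasons.
    trend = sum(series[m + i] - series[i] for i in range(m)) / (m * m)
    # Initial seasonal indices relative to the first-season level.
    season = [series[i] - level for i in range(m)]

    for t in range(m, len(series)):
        last_level = level
        level = alpha * (series[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (series[t] - level) + (1 - gamma) * season[t % m]

    n = len(series)
    return [level + h * trend + season[(n + h - 1) % m] for h in range(1, horizon + 1)]

# Demo: a flat series with a period-4 seasonal pattern [1, -1, 2, -2].
demo = [10 + s for s in [1, -1, 2, -2] * 3]
print(holt_winters_additive(demo, m=4, alpha=0.5, beta=0.3, gamma=0.2, horizon=4))
# → [11.0, 9.0, 12.0, 8.0]
```

The smoothing parameters the study tunes (Table 1) are the alpha, beta, and gamma above; the Mean Absolute Percentage Error it reports measures how far such forecasts deviate from the held-out observations.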

  18. Developmental times and life table statistics of Aulacorthum solani (Hemiptera: Aphididae) at six constant temperatures, with recommendations on the application of temperature-dependent development models

    USDA-ARS?s Scientific Manuscript database

Developmental rates and age-specific life tables were determined for Aulacorthum solani (Kaltenbach) (known as the foxglove aphid or glasshouse potato aphid) at six constant temperatures while feeding on pansy (Viola × wittrockiana Gams.). Previously, there were no complete life table studies of this species...

  19. The Ability of Elite Table Tennis Players with Intellectual Disabilities to Adapt Their Service/Return

    ERIC Educational Resources Information Center

    Van Biesen, Debbie; Verellen, Joeri; Meyer, Christophe; Mactavish, Jennifer; Van de Vliet, Peter; Vanlandewijck, Yves

    2010-01-01

    In this study the ability of elite table tennis players with intellectual disability (ID) to adapt their service/return to specific ball spin characteristics was investigated. This was done by examining the performance of 39 players with ID and a reference group of 8 players without ID on a standardized table tennis specific test battery. The…

  20. Guess who's not coming to dinner? Evaluating online restaurant reservations for disease surveillance.

    PubMed

    Nsoesie, Elaine O; Buckeridge, David L; Brownstein, John S

    2014-01-22

Alternative data sources are used increasingly to augment traditional public health surveillance systems. Examples include over-the-counter medication sales and school absenteeism. We sought to determine whether an increase in restaurant table availability was associated with an increase in disease incidence, specifically influenza-like illness (ILI). Restaurant table availability was monitored using OpenTable, an online restaurant table reservation site. A daily search was performed for restaurants with available tables for two, on the hour and at half past the hour, at 22 distinct times: between 11:00 am and 3:30 pm for lunch and between 6:00 and 11:30 pm for dinner. In the United States, we examined table availability for restaurants in Boston, Atlanta, Baltimore, and Miami. For Mexico, we studied table availability in Cancun, Mexico City, Puebla, Monterrey, and Guadalajara. Time series of restaurant use were compared with Google Flu Trends and ILI at the state and national levels for the United States and Mexico using the cross-correlation function. Differences in restaurant use were observed across sampling times and regions. We also noted similarities in time series trends between data on influenza activity and restaurant use. In some settings, significant correlations greater than 70% were noted between data on restaurant use and ILI trends. This study introduces and demonstrates the potential value of restaurant use data for event surveillance.
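The lagged comparison the abstract describes, correlating a restaurant-use time series against an ILI time series via the cross-correlation function, can be sketched in plain Python. The series values and function name below are illustrative, not the study's data or code:

```python
from statistics import mean, pstdev

def cross_correlation(x, y, max_lag):
    """Pearson cross-correlation of two equal-length series.

    A positive lag k correlates x[t] with y[t + k], i.e. x leading y by k steps.
    Returns a dict mapping lag -> correlation coefficient.
    """
    def pearson(a, b):
        ma, mb = mean(a), mean(b)
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)
        return cov / (pstdev(a) * pstdev(b))

    out = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            out[k] = pearson(x[: len(x) - k], y[k:])
        else:
            out[k] = pearson(x[-k:], y[: len(y) + k])
    return out

# Toy example: y is x delayed by one step, so the lag-1 correlation is perfect.
x = [1, 3, 2, 5, 4, 6, 5, 8]
y = [0, 1, 3, 2, 5, 4, 6, 5]
cc = cross_correlation(x, y, max_lag=2)
print(round(cc[1], 6))
# → 1.0
```

Scanning the lag at which the correlation peaks is what lets such a study ask whether changes in table availability lead, lag, or coincide with reported ILI activity.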
