Sample records for assessment models final

  1. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, as announced in a September 22, 2006, Federal Register Notice. This final report addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs in risk assessment applications.

  2. Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment (Final Report)

    EPA Science Inventory

    EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, as announced in a September 22, 2006, Federal Register Notice. This final report addresses the application and evaluati...

  3. Reference H Cycle 3 Stability, Control, and Flying Qualities Batch Assessments

    NASA Technical Reports Server (NTRS)

    Henderson, Dennis K.

    1999-01-01

    This work is an update of the assessment completed in February 1996, when a preliminary assessment report was issued for the Cycle 2B simulation model. The primary purpose of the final assessment was to re-evaluate each assessment against the flight control system (FCS) requirements document using the updated model. Only a limited number of final assessments were completed because the release of the Langley model fell close to the assessment deliverable date. The assessment used the nonlinear Cycle 3 simulation model because it combines nonlinear aeroelastic (quasi-static) aerodynamics with hinge-moment- and rate-limited control surface deflections. Both Configuration Aerodynamics (Task 32) and Flight Controls (Task 36) were funded in 1996 to conduct the final stability and control assessments of the unaugmented Reference H configuration in FY96. Because the two tasks had similar output requirements, the work was divided such that Flight Controls was responsible for the implementation and checkout of the simulation model, and Configuration Aerodynamics for writing Matlab script files, conducting the batch assessments, and writing the assessment report. Additionally, Flight Controls was to investigate control surface allocation schemes different from the baseline Reference H in an effort to fulfill flying qualities criteria.

  4. An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios (Final Report, 2008)

    EPA Science Inventory

    EPA announced the availability of the final report, An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios. This report investigates the potential dioxin exposure to artists/hobbyists who use ball clay to make pottery and related products. Derm...

  5. Watershed Modeling to Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to Potential Climate Change and Urban Development in 20 U.S. Watersheds (Final Report)

    EPA Science Inventory

    In September 2013, EPA announced the release of the final report, Watershed Modeling to Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to Potential Climate Change and Urban Development in 20 U.S. Watersheds.

    Watershed modeling was conducted in ...

  6. 75 FR 42819 - Notice of Availability of a Final Environmental Assessment (Final EA) and a Finding of No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... the Proposed ORD Airport Surveillance Radar, Model 9, West Chicago, IL AGENCY: Federal Aviation... Surveillance Radar, Model 9, West Chicago, Illinois. SUMMARY: The Federal Aviation Administration (FAA) is... (Final EA) for the Proposed ORD Airport Surveillance Radar, Model 9 (ASR-9), in West Chicago, Illinois...

  7. BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & ...

  8. Evaluation of a model of violence risk assessment among forensic psychiatric patients.

    PubMed

    Douglas, Kevin S; Ogloff, James R P; Hart, Stephen D

    2003-10-01

    This study tested the interrater reliability and criterion-related validity of structured violence risk judgments made by using one application of the structured professional judgment model of violence risk assessment, the HCR-20 violence risk assessment scheme, which assesses 20 key risk factors in three domains: historical, clinical, and risk management. The HCR-20 was completed for a sample of 100 forensic psychiatric patients who had been found not guilty by reason of a mental disorder and were subsequently released to the community. Violence in the community was determined from multiple file-based sources. Interrater reliability of structured final risk judgments of low, moderate, or high violence risk made on the basis of the structured professional judgment model was acceptable (weighted kappa=.61). Structured final risk judgments were significantly predictive of postrelease community violence, yielding moderate to large effect sizes. Event history analyses showed that final risk judgments made with the structured professional judgment model added incremental validity to the HCR-20 used in an actuarial (numerical) sense. The findings support the structured professional judgment model of risk assessment as well as the HCR-20 specifically and suggest that clinical judgment, if made within a structured context, can contribute in meaningful ways to the assessment of violence risk.
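    The interrater reliability figure reported above (weighted kappa = .61) can be computed directly from two raters' ordinal risk judgments. A minimal pure-Python sketch using linear disagreement weights; the example ratings are hypothetical, not the study's data:

```python
def linear_weighted_kappa(rater1, rater2, categories):
    """Linear weighted kappa for two raters over ordered categories."""
    n = len(rater1)
    idx = {c: i for i, c in enumerate(categories)}
    k = len(categories)
    # Observed joint proportions.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    # Marginal proportions for each rater.
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weights: w_ij = |i - j|.
    disagree_obs = sum(abs(i - j) * obs[i][j] for i in range(k) for j in range(k))
    disagree_exp = sum(abs(i - j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - disagree_obs / disagree_exp

# Hypothetical structured final risk judgments from two raters.
cats = ["low", "moderate", "high"]
r1 = ["low", "low", "moderate", "high", "high", "moderate", "low", "high"]
r2 = ["low", "moderate", "moderate", "high", "moderate", "moderate", "low", "high"]
kappa = linear_weighted_kappa(r1, r2, cats)
```

With linear weights, near-miss disagreements (low vs. moderate) are penalized less than extreme ones (low vs. high), which suits ordered risk categories.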

  9. Naval Postgraduate School Scheduling Support System (NPS4)

    DTIC Science & Technology

    1992-03-01

    [The indexed abstract consists only of table-of-contents fragments, referencing sections on the NPSS final exam scheduler and final exam model, the class schedulers, assessment of the problem model, the presentation system, the user interface, information distribution, and NPSS optimization and performance.]

  10. Utah Pilot Writing Assessment for Grades 3 and 8. Final Report.

    ERIC Educational Resources Information Center

    Duke, Charles R.; Strong, William J.

    This final report presents findings of a project designed to develop a model for state-wide assessment of student writing/language skills in conjunction with Utah's Core Curriculum in English/Language Arts (UCCLA). The project tested procedures for collecting baseline data on writing/language skills at grades 3 and 8. The report consists of the…

  11. Analysis of safety impacts of access management alternatives using the surrogate safety assessment model : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    The purpose of this study was to evaluate if the Surrogate Safety Assessment Model (SSAM) could be used to assess the safety of a highway segment or an intersection in terms of the number and type of conflicts and to compare the safety effects of mul...

  12. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    NASA Astrophysics Data System (ADS)

    Kim, Ho Sung

    2013-12-01

    A quantitative method was developed for estimating the expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, namely the examiner's expertise, the expertise achieved by the examinee, the difficulty of the assessment task, and the examinee's performance. The method applies to complex assessment, including final year project thesis assessment with peer assessment. For a given set of variables, the method can generate a guide map of expected uncertainties prior to implementing the assessment. It employs a scale for visualising expertise levels, derived from quantified clarities of the mental images associated with levels of the examiner's expertise and of the expertise achieved by the examinee. To identify the relevant areas of expertise, which depend on the complexity of the assessment format, a graphical continuum model was developed. The continuum model consists of the assessment task, assessment standards, and the criterion for the transition towards complex assessment owing to the relativity between implicitness and explicitness, and it is capable of identifying the areas of expertise required for scale development.

  13. Model Assessment of the Impact on Ozone of Subsonic and Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Ko, Malcolm; Weisenstein, Debra; Danilin, Michael; Scott, Courtney; Shia, Run-Lie

    2000-01-01

    This is the final report for work performed from June 1999 through May 2000. The work is a continuation of the previous contract and encompasses five areas: (1) continued refinement and application of the 2-D chemistry-transport model (CTM) to assess the ozone effects of aircraft operation in the stratosphere; (2) study of the mechanisms that determine the evolution of sulfur species in the aircraft plume and how such mechanisms affect the way aircraft sulfur emissions should be introduced into global models; (3) development of diagnostics in the AER 3-wave interactive model to assess the importance of dynamics feedback and zonal asymmetry in model predictions of the ozone response to aircraft operation; (4) development of a chemistry parameterization scheme in support of the global modeling initiative (GMI); and (5) provision of assessment results for the preparation of national and international reports, including "Aviation and the Global Atmosphere" prepared by the Intergovernmental Panel on Climate Change, "Assessment of the effects of high-speed aircraft in the stratosphere: 1998" by NASA, and the "Model and Measurements Intercomparison II" by NASA. Part of that work is reported here. We also participated in the SAGE III Ozone Loss and Validation Experiment (SOLVE) campaign and continue with our analyses of the data.

  14. Cox Proportional Hazards Models for Modeling the Time to Onset of Decompression Sickness in Hypobaric Environments

    NASA Technical Reports Server (NTRS)

    Thompson, Laura A.; Chhikara, Raj S.; Conkin, Johnny

    2003-01-01

    In this paper we fit Cox proportional hazards models to a subset of data from the Hypobaric Decompression Sickness Databank. The data bank contains records on the time to decompression sickness (DCS) and venous gas emboli (VGE) for over 130,000 person-exposures to high altitude in chamber tests. The subset we use contains 1,321 records, with 87% censoring, and has the most recent experimental tests on DCS made available from Johnson Space Center. We build on previous analyses of this data set by considering more expanded models and more detailed model assessments specific to the Cox model. Our model, which is stratified on the quartiles of the final ambient pressure at altitude, includes the final ambient pressure at altitude as a nonlinear continuous predictor, the computed tissue partial pressure of nitrogen at altitude, and whether exercise was done at altitude. We conduct various assessments of our model, many of which were recently developed in the statistical literature, and identify where the model needs improvement. We considered the addition of frailties to the stratified Cox model but found no significant gain over a model without frailties. Finally, we validate some of the models that we fit.
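    A stratified Cox model maximizes a partial likelihood computed separately within each stratum (here the strata would be quartiles of final ambient pressure). As a rough illustration of the quantity being maximized, a pure-Python sketch of the Breslow partial log-likelihood with a single covariate and made-up records, not the databank's data or the paper's fitting code:

```python
import math
from collections import defaultdict

def cox_partial_loglik(beta, records):
    """Breslow partial log-likelihood for a stratified Cox model.

    records: list of (stratum, time, event, x), where event=1 marks an
    observed onset and event=0 a censored exposure; x is one covariate.
    """
    by_stratum = defaultdict(list)
    for stratum, time, event, x in records:
        by_stratum[stratum].append((time, event, x))
    loglik = 0.0
    for rows in by_stratum.values():
        rows.sort()  # ascending time within the stratum
        for t_i, event_i, x_i in rows:
            if not event_i:
                continue  # censored records contribute only via risk sets
            # Risk set: everyone in this stratum still at risk at time t_i.
            risk = [x for (t, _, x) in rows if t >= t_i]
            loglik += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk))
    return loglik

# Hypothetical records in two pressure-quartile strata.
data = [("Q1", 1.0, 1, 0.2), ("Q1", 2.0, 1, 0.5), ("Q1", 3.0, 0, 0.1),
        ("Q2", 1.5, 1, 0.9), ("Q2", 2.5, 0, 0.4)]
ll0 = cox_partial_loglik(0.0, data)
```

At beta = 0 each event simply contributes minus the log of its risk-set size, which makes the function easy to sanity-check before handing it to an optimizer.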

  15. Final Report on the Multicultural/Diversity Assessment Project.

    ERIC Educational Resources Information Center

    Ambrosio, Anthony L.

    The Emporia State University Multicultural/Diversity Project developed a set of assessment instruments and a model evaluation plan to assess multicultural/diversity (MCD) outcomes in teacher education and general education programs. Assessment instruments and techniques were constructed to evaluate the impact of coursework on student attitudes,…

  16. Site descriptive modeling as a part of site characterization in Sweden - Concluding the surface based investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, Johan; Winberg, Anders; Skagius, Kristina

    The Swedish Nuclear Fuel and Waste Management Co., SKB, is currently finalizing its surface based site investigations for the final repository for spent nuclear fuel in the municipalities of Oesthammar (the Forsmark area) and Oskarshamn (the Simpevarp/Laxemar area). The investigation data are assessed into a Site Descriptive Model, constituting a synthesis of geology, rock mechanics, thermal properties, hydrogeology, hydro-geochemistry, transport properties and a surface system description. Site data constitute a wide range of different measurement results. These data both need to be checked for consistency and to be interpreted into a format more amenable to three-dimensional modeling. The three-dimensional modeling (i.e. estimating the distribution of parameter values in space) is made in a sequence where the geometrical framework is taken from the geological models and in turn used by the rock mechanics, thermal and hydrogeological modeling. These disciplines in turn are partly interrelated, and also provide feedback to the geological modeling, especially if the geological description appears unreasonable when assessed together with the other data. Procedures for assessing the uncertainties and the confidence in the modeling have been developed during the course of the site modeling. These assessments also provide key input to the completion of the site investigation program. (authors)

  17. Does the choice of nucleotide substitution models matter topologically?

    PubMed

    Hoff, Michael; Orf, Stefan; Riehm, Benedikt; Darriba, Diego; Stamatakis, Alexandros

    2016-03-24

    In the context of a master-level programming practical at the computer science department of the Karlsruhe Institute of Technology, we developed and make available open-source code for testing all 203 possible nucleotide substitution models in the Maximum Likelihood (ML) setting under the common Akaike, corrected Akaike, and Bayesian information criteria. We address the question of whether model selection matters topologically, that is, whether conducting ML inferences under the optimal model, instead of a standard General Time Reversible model, yields different tree topologies. We also assess to which degree the models selected and trees inferred under the three standard criteria (AIC, AICc, BIC) differ. Finally, we assess whether the definition of the sample size (#sites versus #sites × #taxa) yields different models and, as a consequence, different tree topologies. We find that all three factors (by order of impact: nucleotide model selection, information criterion used, sample size definition) can yield substantially different final tree topologies (topological difference exceeding 10%) for approximately 5% of the tree inferences conducted on the 39 empirical datasets used in our study. We find that using the best-fit nucleotide substitution model may change the final ML tree topology compared with an inference under a default GTR model. The effect is less pronounced when comparing distinct information criteria. Nonetheless, in some cases we did obtain substantial topological differences.
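    The three criteria compared above penalize the maximized log-likelihood differently, and AICc depends directly on how the sample size n is defined (#sites versus #sites × #taxa). A minimal sketch of the standard formulas; the log-likelihood, parameter count, and alignment size below are made up for illustration, not taken from the paper:

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Corrected AIC; the small-sample penalty depends on the sample size n."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * loglik

# Hypothetical values: a model with k = 8 free parameters fit to an
# alignment of n = 1000 sites, with maximized log-likelihood -5234.7.
scores = {"AIC": aic(-5234.7, 8),
          "AICc": aicc(-5234.7, 8, 1000),
          "BIC": bic(-5234.7, 8, 1000)}
```

Switching n from #sites to #sites × #taxa leaves AIC unchanged but shifts AICc and BIC, which is why the sample-size definition can flip the selected model.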

  18. Microbial Performance of Food Safety Control and Assurance Activities in a Fresh Produce Processing Sector Measured Using a Microbial Assessment Scheme and Statistical Modeling.

    PubMed

    Njage, Patrick Murigu Kamau; Sawe, Chemutai Tonui; Onyango, Cecilia Moraa; Habib, I; Njagi, Edmund Njeru; Aerts, Marc; Molenberghs, Geert

    2017-01-01

    Current approaches such as inspections, audits, and end product testing cannot detect the distribution and dynamics of microbial contamination. Despite the implementation of current food safety management systems, foodborne outbreaks linked to fresh produce continue to be reported. A microbial assessment scheme and statistical modeling were used to systematically assess the microbial performance of core control and assurance activities in five Kenyan fresh produce processing and export companies. Generalized linear mixed models and correlated random-effects joint models for multivariate clustered data followed by empirical Bayes estimates enabled the analysis of the probability of contamination across critical sampling locations (CSLs) and factories as a random effect. Salmonella spp. and Listeria monocytogenes were not detected in the final products. However, none of the processors attained the maximum safety level for environmental samples. Escherichia coli was detected in five of the six CSLs, including the final product. Among the processing-environment samples, the hand or glove swabs of personnel revealed a higher level of predicted contamination with E. coli, and 80% of the factories were E. coli positive at this CSL. End products showed higher predicted probabilities of having the lowest level of food safety compared with raw materials. The final products were E. coli positive despite the raw materials being E. coli negative for 60% of the processors. There was a higher probability of contamination with coliforms in water at the inlet than in the final rinse water. Four (80%) of the five assessed processors had poor to unacceptable counts of Enterobacteriaceae on processing surfaces. Personnel-, equipment-, and product-related hygiene measures to improve the performance of preventive and intervention measures are recommended.

  19. Design of Training Systems. Computerization of the Educational Technology Assessment Model. Volume 1.

    ERIC Educational Resources Information Center

    Duffy, Larry B.; And Others

    The Educational Technology Assessment Model (ETAM) is a set of comprehensive procedures and variables for the analysis, synthesis, and decision making, in regard to the benefits, costs, and risks associated with introducing technical innovations in education and training. This final report summarizes the analysis, design, and development…

  20. 1973 Assessment Workshops: Final Report.

    ERIC Educational Resources Information Center

    Womer, Frank B.; Lehmann, Irvin J.

    Three 3-day assessment workshops were held in Boulder, Colorado from June 19-29, for personnel in the assessment field from state departments of education. Seventy-six participants from 35 states, Puerto Rico, the Virgin Islands and the District of Columbia attended. Two of the three workshops concentrated on National Assessment as one model for…

  1. Population pharmacokinetic modelling of tramadol using inverse Gaussian function for the assessment of drug absorption from prolonged and immediate release formulations.

    PubMed

    Brvar, Nina; Mateović-Rojnik, Tatjana; Grabnar, Iztok

    2014-10-01

    This study aimed to develop a population pharmacokinetic model for tramadol that combines different input rates with disposition characteristics. Data used for the analysis were pooled from two phase I bioavailability studies with immediate (IR) and prolonged release (PR) formulations in healthy volunteers. Tramadol plasma concentration-time data were described by an inverse Gaussian function to model the complete input process linked to a two-compartment disposition model with first-order elimination. Although polymorphic CYP2D6 appears to be a major enzyme involved in the metabolism of tramadol, application of a mixture model to test the assumption of two and three subpopulations did not reveal any improvement of the model. The final model estimated parameters with reasonable precision and was able to estimate the interindividual variability of all parameters except for the relative bioavailability of PR vs. IR formulation. Validity of the model was further tested using the nonparametric bootstrap approach. Finally, the model was applied to assess absorption kinetics of tramadol and predict steady-state pharmacokinetics following administration of both types of formulations. For both formulations, the final model yielded a stable estimate of the absorption time profiles. Steady-state simulation supports switching of patients from IR to PR formulation. Copyright © 2014 Elsevier B.V. All rights reserved.
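    An inverse Gaussian input function describes the whole absorption process by a mean absorption time (MAT) and a relative dispersion (CV2). A sketch of the density in a common PK parameterization; the parameter values are illustrative, not the study's estimates:

```python
import math

def inverse_gaussian_input(t, mat, cv2):
    """Fraction-of-dose input rate at time t for an inverse Gaussian input
    function with mean absorption time `mat` and relative dispersion `cv2`.

    This is the inverse Gaussian density with mean mat and shape mat/cv2.
    """
    if t <= 0:
        return 0.0
    return (math.sqrt(mat / (2 * math.pi * cv2 * t ** 3))
            * math.exp(-(t - mat) ** 2 / (2 * cv2 * mat * t)))

# Illustrative values: a slow, dispersed input as for a PR formulation.
mat, cv2 = 5.0, 0.5
# The density integrates to 1 over (0, inf); check numerically on a grid.
dt = 0.001
auc = sum(inverse_gaussian_input(k * dt, mat, cv2) * dt for k in range(1, 200000))
```

Linking this input function to a two-compartment disposition model then gives the full concentration-time profile; only the input parameters differ between the IR and PR formulations.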

  2. Best practices for assessing competence and performance of the behavioral health workforce.

    PubMed

    Bashook, Philip G

    2005-01-01

    The need for mechanisms to assess the competence and performance of the behavioral health workforce has received increasing attention. This article reviews strategies used in general medicine and other disciplines for assessing trainees and practitioners. The possibilities and limitations of various approaches are reviewed, and the implications for behavioral health are addressed. A conceptual model of competence is presented, and practical applications of this model are reviewed. Finally, guidelines are proposed for building competency assessment protocols for behavioral health.

  3. Assessing the Integration of Audience Response System Technology in Teaching of Anatomical Sciences

    PubMed Central

    Alexander, Cara J.; Crescini, Weronika M.; Juskewitch, Justin E.; Lachman, Nirusha; Pawlina, Wojciech

    2009-01-01

    The goals of our study were to determine the predictive value and usability of an audience response system (ARS) as a knowledge assessment tool in an undergraduate medical curriculum. Over a three year period (2006–2008), data were collected from first year didactic blocks in Genetics/Histology and Anatomy/Radiology (n=42–50 per class). During each block, students answered clinically oriented multiple choice questions using the ARS. Students’ performances were recorded and cumulative ARS scores were compared with final examination performances. Correlation coefficients between these variables were calculated to assess the existence and direction of an association between ARS and final examination score. If associations existed, univariate models were then constructed using ARS as a predictor of final examination score. Student and faculty perception of ARS difficulty, usefulness, effect on performance, and preferred use were evaluated using a questionnaire. There was a statistically significant positive correlation between ARS and final examination scores in all didactic blocks and predictive univariate models were constructed for each relationship (all P < 0.0001). Students and faculty agreed that ARS was easy to use and a reliable tool for providing real-time feedback that improved their performance and participation. In conclusion, we found ARS to be an effective assessment tool benefiting the faculty and the students in a curriculum focused on interaction and self-directed learning. PMID:19670428

  4. Assessing the integration of audience response system technology in teaching of anatomical sciences.

    PubMed

    Alexander, Cara J; Crescini, Weronika M; Juskewitch, Justin E; Lachman, Nirusha; Pawlina, Wojciech

    2009-01-01

    The goals of our study were to determine the predictive value and usability of an audience response system (ARS) as a knowledge assessment tool in an undergraduate medical curriculum. Over a three year period (2006-2008), data were collected from first year didactic blocks in Genetics/Histology and Anatomy/Radiology (n = 42-50 per class). During each block, students answered clinically oriented multiple choice questions using the ARS. Students' performances were recorded and cumulative ARS scores were compared with final examination performances. Correlation coefficients between these variables were calculated to assess the existence and direction of an association between ARS and final examination score. If associations existed, univariate models were then constructed using ARS as a predictor of final examination score. Student and faculty perception of ARS difficulty, usefulness, effect on performance, and preferred use were evaluated using a questionnaire. There was a statistically significant positive correlation between ARS and final examination scores in all didactic blocks and predictive univariate models were constructed for each relationship (all P < 0.0001). Students and faculty agreed that ARS was easy to use and a reliable tool for providing real-time feedback that improved their performance and participation. In conclusion, we found ARS to be an effective assessment tool benefiting the faculty and the students in a curriculum focused on interaction and self-directed learning. 2009 American Association of Anatomists
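    The analysis described above reduces to a Pearson correlation followed by a univariate least-squares model predicting final examination score from cumulative ARS score. A self-contained pure-Python sketch using hypothetical scores, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def fit_univariate(xs, ys):
    """Least-squares line y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical cumulative ARS scores and final examination percentages.
ars = [55, 62, 70, 74, 81, 88, 90]
exam = [61, 60, 72, 70, 85, 84, 93]
r = pearson_r(ars, exam)
intercept, slope = fit_univariate(ars, exam)
```

A positive r justifies fitting the line; the fitted slope then predicts how many final-exam points an additional ARS point is associated with.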

  5. Market Assessment For Traveler Services, A Choice Modeling Study Phase Iii, Fast-Trac Deliverable, #16B: Final Choice Modeling Report

    DOT National Transportation Integrated Search

    1999-02-12

    FAST-TRAC: This report describes the choice model study of the FAST-TRAC (Faster and Safer Travel through Traffic Routing and Advanced Controls) operational test in southeast Michigan. Choice modeling is a stated-preference approach in which resp...

  6. Uncertainty and Variability in Physiologically-Based ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment.

  7. Consideration of VT5 etch-based OPC modeling

    NASA Astrophysics Data System (ADS)

    Lim, ChinTeong; Temchenko, Vlad; Kaiser, Dieter; Meusel, Ingo; Schmidt, Sebastian; Schneider, Jens; Niehoff, Martin

    2008-03-01

    Including etch-based empirical data during OPC model calibration is a desired yet controversial decision for OPC modeling, especially for a process with large litho-to-etch biasing. Although many OPC software tools now provide this functionality, few etch-based models have been implemented in manufacturing because of various risk considerations, such as compromised prediction of resist and optical effects, etch model accuracy, and even runtime concerns. The conventional method of applying rule-based corrections alongside a resist model is popular but requires lengthy code generation to produce a leaner OPC input. This work discusses the risk factors and their considerations, together with an introduction to the techniques used within Mentor Calibre VT5 etch-based modeling at the sub-90nm technology node. Various strategies are discussed with the aim of better handling large etch bias offsets without adding complexity to the final OPC package. Finally, results are presented to assess the advantages and limitations of the final method chosen.

  8. Using R and WinBUGS to fit a Generalized Partial Credit Model for developing and evaluating patient-reported outcomes assessments

    PubMed Central

    Li, Yuelin; Baser, Ray

    2013-01-01

    The US Food and Drug Administration recently announced the final guidelines on the development and validation of Patient-Reported Outcomes (PROs) assessments in drug labeling and clinical trials. This guidance paper may boost the demand for new PRO survey questionnaires. Henceforth biostatisticians may encounter psychometric methods more frequently, particularly Item Response Theory (IRT) models to guide the shortening of a PRO assessment instrument. This article aims to provide an introduction on the theory and practical analytic skills in fitting a Generalized Partial Credit Model in IRT (GPCM). GPCM theory is explained first, with special attention to a clearer exposition of the formal mathematics than what is typically available in the psychometric literature. Then a worked example is presented, using self-reported responses taken from the International Personality Item Pool. The worked example contains step-by-step guides on using the statistical languages R and WinBUGS in fitting the GPCM. Finally, the Fisher information function of the GPCM model is derived and used to evaluate, as an illustrative example, the usefulness of assessment items by their information contents. This article aims to encourage biostatisticians to apply IRT models in the re-analysis of existing data and in future research. PMID:22362655
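    The GPCM gives the probability of each response category of an item from a discrimination parameter a, step difficulties b_j, and a latent trait value theta. A minimal pure-Python sketch of the category response function, with illustrative parameter values rather than estimates from the article:

```python
import math

def gpcm_probs(theta, a, b):
    """GPCM category probabilities for one item.

    a: item discrimination; b: step difficulties for categories 1..m,
    so the item has m + 1 response categories (0..m).
    """
    # Numerator for category k is exp(sum_{v<=k} a*(theta - b_v));
    # the empty sum gives 0 for category 0.
    logits = [0.0]
    for bv in b:
        logits.append(logits[-1] + a * (theta - bv))
    denom = sum(math.exp(z) for z in logits)
    return [math.exp(z) / denom for z in logits]

# Illustrative 4-category item (3 step difficulties) at theta = 0.5.
probs = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.2, 1.1])
```

Evaluating this function over a theta grid gives the category response curves, and summing the corresponding Fisher information across items is how the information content of an assessment is judged.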

  9. Using R and WinBUGS to fit a generalized partial credit model for developing and evaluating patient-reported outcomes assessments.

    PubMed

    Li, Yuelin; Baser, Ray

    2012-08-15

    The US Food and Drug Administration recently announced the final guidelines on the development and validation of patient-reported outcomes (PROs) assessments in drug labeling and clinical trials. This guidance paper may boost the demand for new PRO survey questionnaires. Henceforth, biostatisticians may encounter psychometric methods more frequently, particularly item response theory (IRT) models to guide the shortening of a PRO assessment instrument. This article aims to provide an introduction on the theory and practical analytic skills in fitting a generalized partial credit model (GPCM) in IRT. GPCM theory is explained first, with special attention to a clearer exposition of the formal mathematics than what is typically available in the psychometric literature. Then, a worked example is presented, using self-reported responses taken from the International Personality Item Pool. The worked example contains step-by-step guides on using the statistical languages R and WinBUGS in fitting the GPCM. Finally, the Fisher information function of the GPCM model is derived and used to evaluate, as an illustrative example, the usefulness of assessment items by their information contents. This article aims to encourage biostatisticians to apply IRT models in the re-analysis of existing data and in future research. Copyright © 2012 John Wiley & Sons, Ltd.

  10. Freight transportation and the potential for invasions of exotic insects in urban and periurban forests of the United States.

    PubMed

    Colunga-Garcia, Manuel; Haack, Robert A; Adelaja, Adesoji O

    2009-02-01

    Freight transportation is an important pathway for the introduction and dissemination of exotic forest insects (EFI). Identifying the final destination of imports is critical in determining the likelihood of EFI establishment. We analyzed the use of regional freight transport information to characterize the risk of urban and periurban areas to EFI introductions. Specific objectives were to 1) approximate the final distribution of selected imports among urban areas of the United States, 2) characterize the final distribution of imports in terms of their spatial aggregation and dominant world region of origin, and 3) assess the effect of the final distribution of imports on the level of risk to urban and periurban forests from EFI. Freight pattern analyses were conducted for three categories of imports whose products or packaging materials are associated with EFI: wood products, nonmetallic mineral products, and machinery. The final distribution of wood products was the most evenly distributed of the three selected imports, whereas machinery was most spatially concentrated. We found that the type of import and the world region of origin greatly influence the final distribution of imported products. Risk assessment models were built based on the amount of forestland and imports for each urban area. The model indicated that 84-88% of the imported tonnage went to only 4-6% of the urban areas in the contiguous United States. We concluded that freight movement information is critical for proper risk assessment of EFI. Implications of our findings and future research needs are discussed.

  11. Examining the Predictive Validity of a Dynamic Assessment of Decoding to Forecast Response to Tier 2 Intervention

    PubMed Central

    Cho, Eunsoo; Compton, Donald L.; Fuchs, Doug; Fuchs, Lynn S.; Bouton, Bobette

    2013-01-01

    The purpose of this study was to examine the role of a dynamic assessment (DA) of decoding in predicting responsiveness to Tier 2 small group tutoring in a response-to-intervention model. First-grade students (n=134) who did not show adequate progress in Tier 1 based on 6 weeks of progress monitoring received Tier 2 small-group tutoring in reading for 14 weeks. Student responsiveness to Tier 2 was assessed weekly with word identification fluency (WIF). A series of conditional individual growth curve analyses were completed that modeled the correlates of WIF growth (final level of performance and growth). Its purpose was to examine the predictive validity of DA in the presence of 3 sets of variables: static decoding measures, Tier 1 responsiveness indicators, and pre-reading variables (phonemic awareness, rapid letter naming, oral vocabulary, and IQ). DA was a significant predictor of final level and growth, uniquely explaining 3% to 13% of the variance in Tier 2 responsiveness depending on the competing predictors in the model and WIF outcome (final level of performance or growth). Although the additional variances explained uniquely by DA were relatively small, results indicate the potential of DA in identifying Tier 2 nonresponders. PMID:23213050

  12. Examining the predictive validity of a dynamic assessment of decoding to forecast response to tier 2 intervention.

    PubMed

    Cho, Eunsoo; Compton, Donald L; Fuchs, Douglas; Fuchs, Lynn S; Bouton, Bobette

    2014-01-01

    The purpose of this study was to examine the role of a dynamic assessment (DA) of decoding in predicting responsiveness to Tier 2 small-group tutoring in a response-to-intervention model. First-grade students (n = 134) who did not show adequate progress in Tier 1 based on 6 weeks of progress monitoring received Tier 2 small-group tutoring in reading for 14 weeks. Student responsiveness to Tier 2 was assessed weekly with word identification fluency (WIF). A series of conditional individual growth curve analyses were completed that modeled the correlates of WIF growth (final level of performance and growth). Its purpose was to examine the predictive validity of DA in the presence of three sets of variables: static decoding measures, Tier 1 responsiveness indicators, and prereading variables (phonemic awareness, rapid letter naming, oral vocabulary, and IQ). DA was a significant predictor of final level and growth, uniquely explaining 3% to 13% of the variance in Tier 2 responsiveness depending on the competing predictors in the model and WIF outcome (final level of performance or growth). Although the additional variances explained uniquely by DA were relatively small, results indicate the potential of DA in identifying Tier 2 nonresponders. © Hammill Institute on Disabilities 2012.
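The "uniquely explained variance" reported in this record corresponds to the R-squared increment when the DA score is added to a model already containing the competing predictors. A hedged sketch of that hierarchical-regression computation, using simulated data rather than the study's:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def unique_variance(baseline, da, y):
    """Variance in the outcome explained uniquely by DA: the R^2 gain when
    DA is added to the competing (baseline) predictors."""
    return r_squared(np.column_stack([baseline, da]), y) - r_squared(baseline, y)
```

With simulated data in which DA carries independent signal, the increment lands in roughly the single-digit-percent range the study reports.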

  13. KEY ISSUES FOR THE ASSESSMENT OF THE ALLERGENIC POTENTIAL OF GENETICALLY MODIFIED FOODS: BREAKOUT GROUP REPORTS

    EPA Science Inventory

    Abstract
    On the final afternoon of the Workshop, Assessment of the Allergenic Potential of Genetically Modified Foods, speakers and participants met in breakout groups to discuss specific questions in the areas of 1) Use of Human Clinical Data; 2) Animal Models to Assess Food ...

  14. An introduction to the partial credit model for developing nursing assessments.

    PubMed

    Fox, C

    1999-11-01

    The partial credit model, which is a special case of the Rasch measurement model, was presented as a useful way to develop and refine complex nursing assessments. The advantages of the Rasch model over the classical psychometric model were discussed, including the lack of bias in the measurement process, the ability to highlight those items in need of refinement, the provision of information on congruence between the data and the model, and feedback on the usefulness of the response categories. The partial credit model was introduced as a way to develop complex nursing assessments such as performance-based assessments, because of the model's ability to accommodate a variety of scoring procedures. Finally, an application of the partial credit model was illustrated using the Practical Knowledge Inventory for Nurses, a paper-and-pencil instrument that measures on-the-job decision-making for nurses.
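The partial credit model described here can be sketched as a category-probability function (a Rasch-family model, so no discrimination parameter). An illustrative Python sketch, with hypothetical step difficulties rather than values from the Practical Knowledge Inventory for Nurses:

```python
import math

def pcm_probs(theta, deltas):
    """Partial credit model: probabilities of each score category for one item.
    theta: person ability; deltas: step difficulties delta_1..delta_K
    (the example values used below are hypothetical)."""
    z = [0.0]                              # cumulative sum for category 0
    for d in deltas:
        z.append(z[-1] + (theta - d))      # add theta - delta_k per step
    m = max(z)                             # stabilize the exponentials
    ez = [math.exp(v - m) for v in z]
    s = sum(ez)
    return [v / s for v in ez]
```

As expected for a monotone item, a person far above all step difficulties is most likely to earn the top score category.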

  15. Decision Support System For Management Of Low-Level Radioactive Waste Disposal At The Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shott, G.; Yucel, V.; Desotell, L.

    2006-07-01

    The long-term safety of U.S. Department of Energy (DOE) low-level radioactive waste disposal facilities is assessed by conducting a performance assessment -- a systematic analysis that compares estimated risks to the public and the environment with performance objectives contained in DOE Manual 435.1-1, Radioactive Waste Management Manual. Before site operations, facility design features such as final inventory, waste form characteristics, and closure cover design may be uncertain. Site operators need a modeling tool that can be used throughout the operational life of the disposal site to guide decisions regarding the acceptance of problematic waste streams, new disposal cell design, environmental monitoring program design, and final site closure. In response to these needs, the National Nuclear Security Administration Nevada Site Office (NNSA/NSO) has developed a decision support system for the Area 5 Radioactive Waste Management Site in Frenchman Flat on the Nevada Test Site. The core of the system is a probabilistic inventory and performance assessment model implemented in the GoldSim® simulation platform. The modeling platform supports multiple graphic capabilities that allow clear documentation of the model data sources, conceptual model, mathematical implementation, and results. The combined models can estimate disposal site inventory, contaminant concentrations in environmental media, and radiological doses to members of the public engaged in various activities at multiple locations. The model allows rapid assessment and documentation of the consequences of waste management decisions using the most current site characterization information, radionuclide inventory, and conceptual model. The model is routinely used to provide annual updates of site performance, evaluate the consequences of disposal of new waste streams, develop waste concentration limits, optimize the design of new disposal cells, and assess the adequacy of environmental monitoring programs. (authors)

  16. System Maturity and Architecture Assessment Methods, Processes, and Tools

    DTIC Science & Technology

    2012-03-02

    Ramirez-Marquez, D. Nowicki, A. Deshmukh, and M. Sarfaraz. Development of Systems Engineering Maturity Models and Management Tools. Systems Engineering Research Center Final Technical...

  17. A Comparison of Career-Related Assessment Tools/Models. Final [Report].

    ERIC Educational Resources Information Center

    WestEd, San Francisco, CA.

    This document contains charts that evaluate career related assessment items. Chart categories include: Purpose/Current Uses/Format; Intended Population; Oregon Career Related Learning Standards Addressed; Relationship to the Standards; Relationship to Endorsement Area Frameworks; Evidence of Validity; Evidence of Reliability; Evidence of Fairness…

  18. RESIDUAL RISK ASSESSMENTS - FINAL RESIDUAL RISK ASSESSMENT FOR SECONDARY LEAD SMELTERS

    EPA Science Inventory

    This source category, previously subjected to a technology-based standard, will be examined to determine if health or ecological risks are significant enough to warrant further regulation for Secondary Lead Smelters. These assessments utilize existing models and databases to examin...

  19. The CanMEDS role of Collaborator: How is it taught and assessed according to faculty and residents?

    PubMed Central

    Berger, Elizabeth; Chan, Ming-Ka; Kuper, Ayelet; Albert, Mathieu; Jenkins, Deirdre; Harrison, Megan; Harris, Ilene

    2012-01-01

    OBJECTIVE: To explore the perspectives of paediatric residents and faculty regarding how the Collaborator role is taught and assessed. METHODS: Using a constructivist grounded theory approach, focus groups at four Canadian universities were conducted. Data were analyzed iteratively for emergent themes. RESULTS: Residents reported learning about collaboration through faculty role modelling but did not perceive that it was part of the formal curriculum. Faculty reported that they were not trained in how to effectively model this role. Both groups reported a need for training in conflict management, particularly as it applies to intraprofessional (physician-to-physician) relationships. Finally, the participants asserted that current methods to assess residents on their performance as collaborators are suboptimal. CONCLUSIONS: The Collaborator role should be a formal part of the residency curriculum. Residents need to be better educated with regard to managing conflict and handling intraprofessional relationships. Finally, innovative methods of assessing residents on this non-medical expert role need to be created. PMID:24294063

  20. A Multidimensional Model for Child Maltreatment Prevention Readiness in Low- and Middle-Income Countries

    ERIC Educational Resources Information Center

    Mikton, Christopher; Mehra, Radhika; Butchart, Alexander; Addiss, David; Almuneef, Maha; Cardia, Nancy; Cheah, Irene; Chen, JingQi; Makoae, Mokhantso; Raleva, Marija

    2011-01-01

    The study's aim was to develop a multidimensional model for the assessment of child maltreatment prevention readiness in low- and middle-income countries. The model was developed based on a conceptual review of relevant existing models and approaches, an international expert consultation, and focus groups in six countries. The final model…

  1. Mentor judgements and decision-making in the assessment of student nurse competence in practice: A mixed-methods study.

    PubMed

    Burden, Sarah; Topping, Anne Elizabeth; O'Halloran, Catherine

    2018-05-01

    To investigate how mentors form judgements and reach summative assessment decisions regarding student competence in practice. Competence assessment is a significant component of pre-registration nursing programmes in the United Kingdom. Concerns exist that assessments are subjective, lack consistency, and that mentors fail to judge student performance as unsatisfactory. A two-stage sequential embedded mixed-methods design. Data were collected 2012-2013. This study involved a whole student cohort completing a UK undergraduate adult nursing programme (N = 41). Stage 1: quantitative data on mentor conduct of assessment interviews and the final decision recorded (N = 330 from 270 mentors) were extracted from student Practice Assessment Documents (PADs). Stage 2: mentor feedback in student PADs was used in Stimulated Recall interviews with a purposive sample of final placement mentors (N = 17). These were thematically analysed. Findings were integrated to develop a theoretically driven model of mentor decision-making. Course assessment strategies and documentation had limited effect in framing mentor judgements and decisions. Rather, mentors amassed impressions, moderated by expectations of an "idealized student" for the practice area and programme stage, that influenced their management and the outcome of the assessment process. These impressions were accumulated and combined into judgements that informed the final decision. This process can best be understood and conceptualized through Brunswik's lens model of social judgement. Mentor decisions were reasoned and there was a shared understanding of judgement criteria and their importance. The impression-based nature of mentor decision-making calls into question the reliability and validity of competency-based assessments used in nursing pre-registration programmes. © 2017 John Wiley & Sons Ltd.

  2. Hurricane Sandy Economic Impacts Assessment: A Computable General Equilibrium Approach and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith

    Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model's degree of accuracy and the assessed total damage caused by Hurricane Sandy.

  3. Frontal Representation as a Metric of Model Performance

    NASA Astrophysics Data System (ADS)

    Douglass, E.; Mask, A. C.

    2017-12-01

    The representation of fronts detected by altimetry is used to evaluate the performance of the HYCOM global operational product. Fronts are detected and assessed in daily alongtrack altimetry. Then, modeled sea surface height is interpolated to the locations of the alongtrack observations, and the same frontal detection algorithm is applied to the interpolated model output. The percentage of fronts found in the altimetry and replicated in the model gives a score (0-100) that assesses the model's ability to replicate fronts in the proper location with the proper orientation. Further information can be obtained by determining the number of "extra" fronts found in the model but not in the altimetry, and by assessing the horizontal and vertical dimensions of the front in the model as compared to observations. Finally, the sensitivity of this metric to choices regarding the smoothing of noisy alongtrack altimetry observations, and to the minimum size of fronts being analyzed, is assessed.
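The frontal-replication score described above (percentage of observed fronts reproduced by the model, plus a count of "extra" model-only fronts) can be sketched as follows. The index-based front representation and the matching tolerance are assumptions for illustration, not details from the abstract:

```python
def front_score(obs_fronts, model_fronts, tolerance=1):
    """Score (0-100): percentage of observed front locations reproduced by the
    model within `tolerance` alongtrack samples, plus the count of "extra"
    fronts found only in the model. Inputs are alongtrack sample indices."""
    matched = sum(any(abs(o - m) <= tolerance for m in model_fronts)
                  for o in obs_fronts)
    extra = sum(all(abs(m - o) > tolerance for o in obs_fronts)
                for m in model_fronts)
    score = 100.0 * matched / len(obs_fronts) if obs_fronts else 100.0
    return score, extra
```

A real implementation would also compare front orientation and dimensions, per the abstract; this sketch covers only the location-matching core of the metric.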

  4. An Exploratory Analysis of Personality, Attitudes, and Study Skills on the Learning Curve within a Team-based Learning Environment

    PubMed Central

    Henry, Teague; Campbell, Ashley

    2015-01-01

    Objective. To examine factors that determine the interindividual variability of learning within a team-based learning environment. Methods. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students’ Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. Results. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. Conclusion. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course. PMID:25861101

  5. An exploratory analysis of personality, attitudes, and study skills on the learning curve within a team-based learning environment.

    PubMed

    Persky, Adam M; Henry, Teague; Campbell, Ashley

    2015-03-25

    To examine factors that determine the interindividual variability of learning within a team-based learning environment. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students' Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course.
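The quadratic latent curve idea in this record (fitting a second-order growth trajectory to the four interim assessments and using it to predict final-examination performance) can be illustrated in miniature with an ordinary per-student polynomial fit; a full LCM with random effects and covariates is beyond this sketch:

```python
import numpy as np

def quadratic_curve_fit(scores):
    """Fit a quadratic learning curve to one student's interim assessment
    scores (occasions 1..n) and extrapolate to the next occasion, here treated
    as the final examination. Illustrative only."""
    t = np.arange(1, len(scores) + 1)
    coeffs = np.polyfit(t, scores, deg=2)           # a*t^2 + b*t + c
    predicted_final = np.polyval(coeffs, len(scores) + 1)
    return coeffs, predicted_final
```

In the study, covariates such as metacognitive self-regulation would then be tested for their ability to explain between-student differences in the fitted curve parameters.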

  6. Nuclear Nonproliferation Ontology Assessment Team Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strasburg, Jana D.; Hohimer, Ryan E.

    Final Report for the NA22 Simulations, Algorithm and Modeling (SAM) Ontology Assessment Team's efforts from FY09-FY11. The Ontology Assessment Team began in May 2009 and concluded in September 2011. During this two-year time frame, the Ontology Assessment Team had two objectives: (1) assessing the utility of knowledge representation and semantic technologies for addressing nuclear nonproliferation challenges; and (2) developing ontological support tools that would provide a framework for integrating across the Simulation, Algorithm and Modeling (SAM) program. The SAM Program was going through a large assessment and strategic planning effort during this time and, as a result, the relative importance of these two objectives changed, altering the focus of the Ontology Assessment Team. In the end, the team conducted an assessment of the state of the art, created an annotated bibliography, and developed a series of ontological support tools, demonstrations and presentations. A total of more than 35 individuals from 12 different research institutions participated in the Ontology Assessment Team. These included subject matter experts in several nuclear nonproliferation-related domains as well as experts in semantic technologies. Despite the diverse backgrounds and perspectives, the Ontology Assessment Team functioned very well together, and aspects could serve as a model for future inter-laboratory collaborations and working groups. While the team encountered several challenges and learned many lessons along the way, the Ontology Assessment effort was ultimately a success that led to several multi-lab research projects and opened up a new area of scientific exploration within the Office of Nuclear Nonproliferation and Verification.

  7. Mathematical Models for Camouflage Pattern Assessment

    DTIC Science & Technology

    2013-04-01

    Centro de Modelamiento Matemático, Facultad de Ciencias Físicas y Matemáticas, University of Chile (http://www.cmm.uchile.cl). DISTRIBUTION A: Distribution approved for public release. Final Report: Camouflage Assessment, January 2013. Abstract: The main...

  8. AIR QUALITY ASSESSMENT IN USA - TECHNICAL TOOLS AND LINKAGE TO HUMAN HEALTH

    EPA Science Inventory

    This is an invited presentation to the Air4EU Final Conference to be held in Prague, Czech Republic, on 10 November 2006. Air4EU is a jointly-sponsored, three-year European effort to provide recommendations on air quality assessment by monitoring and modeling for regulated pollutan...

  9. Use of Physiologically Based Pharmacokinetic (PBPK) Models ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Use of Physiologically Based Pharmacokinetic (PBPK) Models to Quantify the Impact of Human Age and Interindividual Differences in Physiology and Biochemistry Pertinent to Risk: Final Report for Cooperative Agreement. This report describes and demonstrates techniques necessary to extrapolate and incorporate in vitro derived metabolic rate constants in PBPK models. It also includes two case study examples designed to demonstrate the applicability of such data for health risk assessment, and addresses the quantification, extrapolation and interpretation of advanced biochemical information on human interindividual variability of chemical metabolism for risk assessment application. It comprises five chapters; topics and results covered in the first four chapters have been published in the peer-reviewed scientific literature. Topics covered include: Data Quality Objectives; Experimental Framework; Required Data; and two example case studies that develop and incorporate in vitro metabolic rate constants in PBPK models designed to quantify human interindividual variability to better direct the choice of uncertainty factors for health risk assessment. This report is intended to serve as a reference document for risk assessors to use when quantifying, extrapolating, and interpreting advanced biochemical information about human interindividual variability of chemical metabolism.

  10. The use of psychosocial assessment following the Haiti earthquake in the development of the three-year emotional psycho-medical mental health and psychosocial support (EP-MMHPS) plan.

    PubMed

    Jordan, Karin

    2010-01-01

    This article provides information about the 2010 Haiti earthquake. An assessment model used by a crisis counselor responding to the earthquake is presented, focusing on the importance of gathering pre-deployment assessment and in-country assessment information. Examples of the information gathered through the in-country assessment model from children, adolescents, and adults are presented. A brief overview of Haiti's three-year Emergency Psycho-Medical Mental Health and Psychosocial Support (EP-MMHPS) plan is provided. Finally, the article describes how the psychosocial manual, developed after assessing 200 Haitian survivors through in-country assessment, and the information gathered through pre-deployment assessment became part of the EP-MMHPS plan.

  11. Assessment and Treatment of Deviant Behavior in Children - Section Six: Single Subject Experiments Generated by Application of the Treatment Model in the Experimental Class Setting. Final Report.

    ERIC Educational Resources Information Center

    Walker, Hill M.; Buckley, Nancy K.

    The studies in section six (of a six-part report on the assessment and treatment of deviant behavior in children) investigated questions generated by the application of the treatment model in the experimental class setting (EC 032 210). The first experiment, on attending behavior, was designed to measure the conditionability of attending behavior…

  12. Metrics, models and data for assessment of resilience of urban infrastructure systems : final report

    DOT National Transportation Integrated Search

    2016-12-01

    This document is a summary of findings based on this research as presented in several conferences during the course of the project. The research focused on identifying the basic metrics and models that can be used to develop representations of pe...

  13. Good modeling practice guidelines for applying multimedia models in chemical assessments.

    PubMed

    Buser, Andreas M; MacLeod, Matthew; Scheringer, Martin; Mackay, Don; Bonnell, Mark; Russell, Mark H; DePinto, Joseph V; Hungerbühler, Konrad

    2012-10-01

    Multimedia mass balance models of chemical fate in the environment have been used for over 3 decades in a regulatory context to assist decision making. As these models become more comprehensive, reliable, and accepted, there is a need to recognize and adopt principles of Good Modeling Practice (GMP) to ensure that multimedia models are applied with transparency and adherence to accepted scientific principles. We propose and discuss 6 principles of GMP for applying existing multimedia models in a decision-making context, namely 1) specification of the goals of the model assessment, 2) specification of the model used, 3) specification of the input data, 4) specification of the output data, 5) conduct of a sensitivity and possibly also uncertainty analysis, and finally 6) specification of the limitations and limits of applicability of the analysis. These principles are justified and discussed with a view to enhancing the transparency and quality of model-based assessments. Copyright © 2012 SETAC.

  14. Feature-based Approach in Product Design with Energy Efficiency Consideration

    NASA Astrophysics Data System (ADS)

    Li, D. D.; Zhang, Y. J.

    2017-10-01

    In this paper, a method to measure the energy efficiency and ecological footprint metrics of features is proposed for product design. First, the energy consumption models of various manufacturing features, such as cutting features and welding features, are studied. Then, the total energy consumption of a product is modeled and estimated according to its features. Next, feature chains, which combine several sequential features according to the production operation order, are defined and analyzed to calculate a globally optimal solution. The corresponding assessment model is also proposed to estimate their energy efficiency and ecological footprint. Finally, an example is given to validate the proposed approach in the improvement of sustainability.
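The total-energy model described here, estimating a product's energy consumption from its features, reduces to a sum of per-feature contributions. A minimal sketch with hypothetical per-unit energy values, not figures from the paper:

```python
def product_energy(features):
    """Total manufacturing energy of a product as the sum over its features.
    `features`: (feature_type, quantity) pairs. The per-unit energy values
    below are hypothetical placeholders, not measured data."""
    energy_per_unit = {"cutting": 2.5, "welding": 4.0, "drilling": 1.2}  # kWh
    return sum(energy_per_unit[ftype] * qty for ftype, qty in features)
```

The paper's feature-chain analysis would then search over alternative orderings and combinations of such features to minimize this total.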

  15. A virtual maintenance-based approach for satellite assembling and troubleshooting assessment

    NASA Astrophysics Data System (ADS)

    Geng, Jie; Li, Ying; Wang, Ranran; Wang, Zili; Lv, Chuan; Zhou, Dong

    2017-09-01

    In this study, a Virtual Maintenance (VM)-based approach for satellite troubleshooting assessment is proposed. By focusing on various elements in satellite assembly troubleshooting, such as accessibility, ergonomics, wiring, and extent of damage, a systematic, quantitative, and objective assessment model is established to decrease subjectivity in satellite assembling and troubleshooting assessment. Afterwards, based on the established assessment model and a satellite virtual prototype, an application process of this model suitable for a virtual environment is presented. Finally, according to the application process, all the elements in satellite troubleshooting are analyzed and assessed. The corresponding improvements, which realize the transformation from a conventional approach to virtual simulation and assessment, are suggested, and flaws in assembling and troubleshooting are revealed. Assembling or troubleshooting schemes can be improved in the early stage of satellite design with the help of a virtual prototype. Repeated rehearsal before practical operation is beneficial to companies, as risk and cost are effectively reduced.

  16. Georgia Vocational Student Assessment Project. Final Report.

    ERIC Educational Resources Information Center

    Vocational Technical Education Consortium of States, Atlanta, GA.

    A project was conducted to develop vocational education tests for use in Georgia secondary schools, specifically for welding, machine shop, and sheet metal courses. The project team developed an outline of an assessment model that included the following components: (1) select a program for use in developing test items; (2) verify duties, tasks,…

  17. Multilevel Evaluation Systems Project. Final Report.

    ERIC Educational Resources Information Center

    Herman, Joan L.

    Several studies were conducted in 1987 by the Multilevel Evaluation Systems Project, which focuses on developing a model for a multi-purpose, multi-user evaluation system to facilitate educational decision making and evaluation. The project model emphasizes on-going integrated assessment of individuals, classes, and programs using a variety of…

  18. pcr: an R package for quality assessment, analysis and testing of qPCR data

    PubMed Central

    Ahmed, Mahmoud

    2018-01-01

    Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis, and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculations of amplification efficiency and standard curves from serial dilution qPCR experiments are used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis, and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
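The Double Delta CT model the package implements follows the standard Livak computation: normalize the target gene's CT to a reference gene in both conditions, then exponentiate the difference. A standalone Python sketch of that formula (not the pcr package's actual R API):

```python
def delta_delta_ct(ct_target_treated, ct_ref_treated,
                   ct_target_control, ct_ref_control):
    """Relative expression by the double delta CT method:
    2 ** -((Ct_target - Ct_ref)_treated - (Ct_target - Ct_ref)_control)."""
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return 2.0 ** (-ddct)
```

For example, a target CT of 25 against a reference CT of 20 in the treated sample, versus 27 against 20 in the control, corresponds to a 4-fold relative expression (assuming the ideal amplification efficiency of 2 that the formula presumes).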

  19. Carbon emissions risk map from deforestation in the tropical Amazon

    NASA Astrophysics Data System (ADS)

    Ometto, J.; Soler, L. S.; Assis, T. D.; Oliveira, P. V.; Aguiar, A. P.

    2011-12-01

    This work aims to estimate the carbon emissions from tropical deforestation in the Brazilian Amazon associated with the risk assessment of future land use change. The emissions are estimated by incorporating temporal deforestation dynamics, accounting for the biophysical and socioeconomic heterogeneity in the region, as well as secondary forest growth dynamics in abandoned areas. The land cover change model that supported the risk assessment of deforestation was run based on linear regressions. This method takes into account the spatial heterogeneity of deforestation, as the spatial variables adopted to fit the final regression model comprise environmental aspects, economic attractiveness, accessibility, and land tenure structure. After fitting suitable regression models for each land cover category, the potential of each cell to be deforested (at 25x25 km and 5x5 km resolution) in the near future was used to calculate the risk assessment of land cover change. The carbon emissions model combines high-resolution new forest clear-cut mapping and four alternative sources of spatial information on biomass distribution for different vegetation types. The risk assessment map of CO2 emissions was obtained by crossing the simulation results of the historical land cover changes with a map of aboveground biomass contained in the remaining forest. This final map represents the risk of CO2 emissions at 25x25 km and 5x5 km resolution until 2020, under a scenario of carbon emission reduction targets.

  20. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM)--Description. Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    The Reliability, Maintainability, and Cost Model (RMCM) described in this report is an interactive mathematical model with a built-in sensitivity analysis capability. It is a major component of the Life Cycle Cost Impact Model (LCCIM), which was developed as part of the DAIS advanced development program to be used to assess the potential impacts…

  1. How can payback from health services research be assessed?

    PubMed

    Buxton, M; Hanney, S

    1996-01-01

    Throughout the world there is a growing recognition that health care should be research-led. This strengthens the requirement for expenditure on health services research to be justified by demonstrating the benefits it produces. However, payback from health research and development is a complex concept and a little-used term. Five main categories of payback can be identified: knowledge; research benefits; political and administrative benefits; health sector benefits; and broader economic benefits. Various models of research utilization, together with previous assessments of payback from research, helped in the development of a new conceptual model of how and where payback may occur. The model combines an input-output perspective with an examination of the permeable interfaces between research and its environment. The model characterizes research projects in terms of Inputs, Processes, and Primary Outputs. The last consist of knowledge and research benefits. There are two interfaces between the project and its environment. The first (Project Specification, Selection and Commissioning) is the link with Research Needs Assessment. The second (Dissemination) should lead to Secondary Outputs (policy or administrative decisions) and usually to Applications (behavioural changes), from which Impacts or Final Outcomes result. It is at this final stage that health and wider economic benefits can be measured. A series of case studies was used to assess the feasibility of applying both the model and the payback categorization. The paper draws various conclusions from the case studies and identifies a range of issues for further work.

  2. A Bayesian hierarchical latent trait model for estimating rater bias and reliability in large-scale performance assessment

    PubMed Central

    2018-01-01

    We propose a novel approach to modelling rater effects in scoring-based assessment. The approach is based on a Bayesian hierarchical model and simulations from the posterior distribution. We apply it to large-scale essay assessment data covering a period of 5 years. Empirical results suggest that the model provides a good fit both for the total scores and for individual rubrics. We estimate the median impact of rater effects on the final grade to be ±2 points on a 50-point scale, while 10% of essays would receive a score at least ±5 points different from their actual quality. Most of the impact is due to rater unreliability, not rater bias. PMID:29614129

  3. A Markovian model for assessment of personnel hiring plans

    NASA Technical Reports Server (NTRS)

    Katz, L. G.

    1974-01-01

    As a result of the current economic environment, many organizations are having to operate with fewer resources. In the manpower area, these constraints have forced organizations to operate within well-defined hiring plans. Exceeding personnel ceilings is in most cases an intolerable situation. A mathematical model, based on the theory of Markov processes, is presented which can be used to assess the chances of success of personnel hiring plans. The model considers a plan to be successful if the final population size, at the end of the planning period, lies within a range specified by management. Although this model was developed to assess personnel hiring plans at the Goddard Space Flight Center, it is directly applicable wherever personnel hiring plans are used.
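
    The success criterion described (final population within a management-specified range at the end of the planning period) can be estimated with a small Monte Carlo sketch of a Markov staffing process. The hire and attrition rates below are illustrative assumptions, not values from the original Goddard model:

    ```python
    import random

    def plan_success_probability(start, monthly_hires, attrition_p,
                                 months, lo, hi, trials=2000, seed=42):
        """Estimate P(final headcount lies in [lo, hi]) for a simple Markov
        staffing chain: each month every employee independently leaves with
        probability attrition_p, then `monthly_hires` new employees join."""
        rng = random.Random(seed)
        successes = 0
        for _ in range(trials):
            n = start
            for _ in range(months):
                n = sum(1 for _ in range(n) if rng.random() > attrition_p)
                n += monthly_hires
            if lo <= n <= hi:
                successes += 1
        return successes / trials

    # Hypothetical plan: 200 staff, 3 hires/month, 1.5% monthly attrition,
    # management target of 195-215 staff after one year.
    p = plan_success_probability(start=200, monthly_hires=3, attrition_p=0.015,
                                 months=12, lo=195, hi=215)
    print(p)
    ```

    A plan whose estimated probability falls below a management threshold would be flagged as unlikely to stay within the personnel ceiling.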

  4. HRST architecture modeling and assessments

    NASA Astrophysics Data System (ADS)

    Comstock, Douglas A.

    1997-01-01

    This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented.

  5. A comparison of different functions for predicted protein model quality assessment.

    PubMed

    Li, Juan; Fang, Huisheng

    2016-07-01

    In protein structure prediction, a considerable number of models are usually produced by either the Template-Based Method (TBM) or the ab initio prediction. The purpose of this study is to find the critical parameter in assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters, which are C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess the quality of model. Hence, a total of eight parameters (i.e., QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter to estimate the quality of the model.

  6. Influence of Discussion Rating in Cooperative Learning Type Numbered Head Together on Learning Results Students VII MTSN Model Padang

    NASA Astrophysics Data System (ADS)

    Sasmita, E.; Edriati, S.; Yunita, A.

    2018-04-01

    Many first-semester mathematics scores in class VII of MTsN Model Padang were low (below the KKM, the minimum mastery criterion). This is attributed to students feeling uninvolved in the learning process because the teacher did not assess the discussions. The proposed solution is discussion assessment in the Cooperative Learning Model type Numbered Head Together (NHT). This study aims to determine whether discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang. The instruments used in this study were discussion assessments and final tests. The data analysis technique used was simple linear regression. Since Fcount was greater than Ftable, the hypothesis of this study was accepted, and it is concluded that discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang.
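
    The significance test described above, comparing Fcount from a simple linear regression against Ftable, can be sketched as follows. The discussion and final-test scores are hypothetical illustrations, not the study's data:

    ```python
    # Simple linear regression F test: does the discussion score predict the final test?
    x = [60, 65, 70, 72, 75, 78, 80, 85, 88, 90]   # hypothetical discussion scores
    y = [58, 66, 69, 74, 73, 80, 82, 84, 90, 91]   # hypothetical final test scores
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # regression slope
    a = my - b * mx                    # intercept
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))  # residual SS
    ssr = b * sxy                      # regression SS
    f_count = ssr / (sse / (n - 2))    # F statistic with (1, n-2) df
    f_table = 5.32                     # critical F for (1, 8) df at alpha = 0.05
    print(f_count > f_table)           # True: reject H0, the regression is significant
    ```

    With strongly correlated data like this, Fcount exceeds Ftable by a wide margin, which is the comparison the abstract's hypothesis test rests on.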

  7. Advancing Consumer Product Composition and Chemical ...

    EPA Pesticide Factsheets

    This presentation describes EPA efforts to collect, model, and measure publicly available consumer product data for use in exposure assessment. The development of the ORD Chemicals and Products database will be described, as will machine-learning-based models for predicting chemical function. Finally, the talk describes new mass spectrometry-based methods for measuring chemicals in formulations and articles. This presentation is an invited talk at the ICCA-LRI workshop "Fit-For-Purpose Exposure Assessments For Risk-Based Decision Making". The talk will share EPA efforts to characterize the components of consumer products for use in exposure assessment with the international exposure science community.

  8. Integrated healthy workplace model: An experience from North Indian industry

    PubMed Central

    Thakur, Jarnail Singh; Bains, Puneet; Kar, Sitanshu Sekhar; Wadhwa, Sanjay; Moirangthem, Prabha; Kumar, Rajesh; Wadwalker, Sanjay; Sharma, Yashpal

    2012-01-01

    Background: In view of rapid industrialization and a growing Indian economy, there has been a substantial increase in the workforce in India. Currently there is no organized workplace model for promoting the health of industrial workers in India. Objective: To develop and implement a healthy workplace model in three industrial settings of North India. Materials and Methods: Operations research was conducted for 12 months in three purposively selected industries of Chandigarh. In phase I, a multi-stakeholder workshop was conducted to finalize the components and tools for the healthy workplace model, and NCD risk factors were assessed in 947 employees of the three industries. In phase II, the healthy workplace model was implemented on a pilot basis for a period of 12 months in these three industries to finalize the model. Findings: A healthy workplace committee involving representatives of management, the labor union and the research organization was formed in each of the three industries. Various tools were developed, including comprehensive and rapid healthy workplace assessment forms, an NCD work-lite format for risk factor surveillance, and a monitoring and evaluation format. The prevalence of tobacco use and of ever alcohol use was found to be 17.8% and 47%, respectively. More than a quarter (28%) of employees complained of back pain in the past 12 months. A healthy workplace model focused on three key components (physical environment, psychosocial work environment, and promotion of healthy habits) was developed, implemented on a pilot basis, and finalized based on experience in the participating industries. A stepwise approach for the model, with core, expanded, and optional components, was also suggested. An accreditation system is also required to promote the healthy workplace program. Conclusion: The integrated healthy workplace model is feasible, could be implemented in industrial settings in northern India, and needs to be pilot tested in other parts of the country. PMID:23776318

  9. The development of instruments to measure the work disability assessment behaviour of insurance physicians

    PubMed Central

    2011-01-01

    Background Variation in assessments is a universal given, and work disability assessments by insurance physicians are no exception. Little is known about the considerations and views of insurance physicians that may partly explain such variation. On the basis of the Attitude - Social norm - self Efficacy (ASE) model, we have developed measurement instruments for assessment behaviour and its determinants. Methods Based on theory and interviews with insurance physicians the questionnaire included blocks of items concerning background variables, intentions, attitudes, social norms, self-efficacy, knowledge, barriers and behaviour of the insurance physicians in relation to work disability assessment issues. The responses of 231 insurance physicians were suitable for further analysis. Factor analysis and reliability analysis were used to form scale variables and homogeneity analysis was used to form dimension variables. Thus, we included 169 of the 177 original items. Results Factor analysis and reliability analysis yielded 29 scales with sufficient reliability. Homogeneity analysis yielded 19 dimensions. Scales and dimensions fitted with the concepts of the ASE model. We slightly modified the ASE model by dividing behaviour into two blocks: behaviour that reflects the assessment process and behaviour that reflects assessment behaviour. The picture that emerged from the descriptive results was of a group of physicians who were motivated in their job and positive about the Dutch social security system in general. However, only half of them had a positive opinion about the Dutch Work and Income (Capacity for Work) Act (WIA). They also reported serious barriers, the most common of which was work pressure. Finally, 73% of the insurance physicians described the majority of their cases as 'difficult'. Conclusions The scales and dimensions developed appear to be valid and offer a promising basis for future research. 
The results suggest that the underlying ASE model, in modified form, is suitable for describing the assessment behaviour of insurance physicians and the determinants of this behaviour. The next step in this line of research should be to validate the model using structural equation modelling. Finally, the predictive value should be tested in relation to outcome measurements of work disability assessments. PMID:21199570
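
    The reliability analysis used above to form scales from questionnaire items is conventionally based on an internal-consistency coefficient such as Cronbach's alpha. A minimal sketch with hypothetical Likert-item responses (not the study's data):

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for a scale; `items` is a list of item-score
        columns, one list of respondent scores per item."""
        k = len(items)
        n = len(items[0])

        def var(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        sum_item_vars = sum(var(col) for col in items)
        totals = [sum(col[i] for col in items) for i in range(n)]
        return k / (k - 1) * (1 - sum_item_vars / var(totals))

    # Hypothetical responses of 5 physicians to the 3 items of one scale.
    items = [[4, 5, 3, 4, 2],
             [4, 4, 3, 5, 2],
             [5, 5, 2, 4, 3]]
    print(round(cronbach_alpha(items), 2))
    ```

    A scale is typically retained as "sufficiently reliable" when alpha clears a preset cutoff (often 0.7); the three items above hang together well.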

  10. Low-cost silicon solar array project environmental hail model for assessing risk to solar collectors

    NASA Technical Reports Server (NTRS)

    Gonzalez, C.

    1977-01-01

    The probability of solar arrays being struck by hailstones of various sizes as a function of geographic location and service life was assessed. The study complements parallel studies of solar array sensitivity to hail damage, the final objective being an estimate of the most cost-effective level of solar array hail protection.

  11. Functional Assessment of the Role of BORIS in Ovarian Cancer Using a Novel in Vivo Model System

    DTIC Science & Technology

    2014-10-01

    Annual report. Finally, the Institutional Biosafety Protocol for the use of adenoviruses was also resubmitted and approved by Roswell Park Cancer Institute.

  12. National Assessment of School Resource Officer Programs. Final Project Report. Document Number 209273

    ERIC Educational Resources Information Center

    Finn, Peter; McDevitt, Jack

    2005-01-01

    There has been growing interest in placing sworn police officers in schools as school resource officers (SROs) to improve school safety. The purpose of the National Assessment was to identify what program "models" have been implemented, how programs have been implemented, and what the programs' possible effects may be. To obtain this information, Abt Associates conducted…

  13. Fiscal Viability, Conjunctive and Compensatory Models, and Career-Ladder Decisions: An Empirical Investigation.

    ERIC Educational Resources Information Center

    Mehrens, William A.; And Others

    A study was undertaken to explore cost-effective ways of making career ladder teacher evaluation system decisions based on fewer measures, to assess the relationship of observational variables to other data and final decisions, and to compare compensatory and conjunctive decision models. Data included multiple scores from eight data sources in…

  14. The Effects of Measurement Error on Statistical Models for Analyzing Change. Final Report.

    ERIC Educational Resources Information Center

    Dunivant, Noel

    The results of six major projects are discussed including a comprehensive mathematical and statistical analysis of the problems caused by errors of measurement in linear models for assessing change. In a general matrix representation of the problem, several new analytic results are proved concerning the parameters which affect bias in…

  15. Using Machine Learning Models to Predict Functional Use (Interagency Alternatives Assessment Workshop)

    EPA Science Inventory

    This presentation will outline how data was collected on how chemicals are used in products, how models were built using these data to predict how chemicals can be used in products, and, finally, how combining this information with Tox21 in vitro assays can be used to rapidly scre...

  16. Risk model for estimating the 1-year risk of deferred lesion intervention following deferred revascularization after fractional flow reserve assessment.

    PubMed

    Depta, Jeremiah P; Patel, Jayendrakumar S; Novak, Eric; Gage, Brian F; Masrani, Shriti K; Raymer, David; Facey, Gabrielle; Patel, Yogesh; Zajarias, Alan; Lasala, John M; Amin, Amit P; Kurz, Howard I; Singh, Jasvindar; Bach, Richard G

    2015-02-21

    Although lesions for which revascularization is deferred following fractional flow reserve (FFR) assessment have a low risk of adverse cardiac events, variability in the risk of deferred lesion intervention (DLI) has not previously been evaluated. The aim of this study was to develop a prediction model to estimate the 1-year risk of DLI for coronary lesions where revascularization was not performed following FFR assessment. A prediction model for DLI was developed from a cohort of 721 patients with 882 coronary lesions where revascularization was deferred based on FFR between 10/2002 and 7/2010. Deferred lesion intervention was defined as any revascularization of a lesion previously deferred following FFR. The final DLI model was developed using stepwise Cox regression and validated using bootstrapping techniques. An algorithm was constructed to predict the 1-year risk of DLI. During a mean (±SD) follow-up period of 4.0 ± 2.3 years, 18% of lesions deferred after FFR underwent DLI; the 1-year incidence of DLI was 5.3%, while the predicted risk of DLI varied from 1 to 40%. The final Cox model included the FFR value, age, current or former smoking, history of coronary artery disease (CAD) or prior percutaneous coronary intervention, multi-vessel CAD, and serum creatinine. The c statistic for the DLI prediction model was 0.66 (95% confidence interval, CI: 0.61-0.70). Patients for whom revascularization is deferred based on FFR vary in their risk of DLI. A clinical prediction model consisting of five clinical variables and the FFR value can help predict the risk of DLI in the first year following FFR assessment.
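
    A Cox-regression risk algorithm of the kind described maps a patient's covariates to a 1-year risk through the baseline survival function, risk = 1 - S0(1yr)^exp(lp). The coefficients and baseline survival below are hypothetical placeholders, not the published model:

    ```python
    import math

    # Hypothetical log hazard ratios for the FFR value and the five clinical
    # variables named in the abstract; NOT the published coefficients.
    coef = {
        "ffr": -4.0,          # lower FFR -> higher risk
        "age_per10": 0.10,    # age in decades
        "smoker": 0.30,       # current or former smoking
        "cad_or_pci": 0.40,   # history of CAD or prior PCI
        "multivessel": 0.35,  # multi-vessel CAD
        "creatinine": 0.20,   # serum creatinine (mg/dL)
    }
    S0_1YR = 0.97             # hypothetical baseline 1-year survival

    def dli_risk_1yr(x):
        """1-year deferred-lesion-intervention risk: 1 - S0(1yr)^exp(lp)."""
        lp = sum(coef[k] * x[k] for k in coef)
        return 1.0 - S0_1YR ** math.exp(lp)

    patient = {"ffr": 0.85, "age_per10": 6.5, "smoker": 1,
               "cad_or_pci": 1, "multivessel": 0, "creatinine": 1.0}
    print(round(dli_risk_1yr(patient), 3))
    ```

    The negative FFR coefficient encodes the clinical finding that risk rises as the FFR value falls; the same patient with FFR 0.60 instead of 0.85 gets a higher predicted risk.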

  17. Impact of aircraft emissions on air quality in the vicinity of airports. Volume I. Recent airport measurement programs, data analyses, and sub-model development. Final report Jan78-Jul 80

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamartino, R.J.; Smith, D.G.; Bremer, S.A.

    1980-07-01

    This report documents the results of the Federal Aviation Administration (FAA)/Environmental Protection Agency (EPA) air quality study conducted to assess the impact of aircraft emissions of carbon monoxide (CO), hydrocarbons (HC), and oxides of nitrogen (NOx) in the vicinity of airports. This assessment includes the results of recent modeling and monitoring efforts at Washington National (DCA), Los Angeles International (LAX), Dulles International (IAD), and Lakeland, Florida airports, and an updated modeling of aircraft-generated pollution at LAX, John F. Kennedy (JFK) and Chicago O'Hare (ORD) airports. The Airport Vicinity Air Pollution (AVAP) model, which was designed for use at civil airports, was used in this assessment. In addition, the results of the application of the military version of the AVAP model, the Air Quality Assessment Model (AQAM), are summarized.

  18. Positional Quality Assessment of Orthophotos Obtained from Sensors Onboard Multi-Rotor UAV Platforms

    PubMed Central

    Mesas-Carrascosa, Francisco Javier; Rumbao, Inmaculada Clavero; Berrocal, Juan Alberto Barrera; Porras, Alfonso García-Ferrer

    2014-01-01

    In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart. PMID:25587877

  19. Positional quality assessment of orthophotos obtained from sensors onboard multi-rotor UAV platforms.

    PubMed

    Mesas-Carrascosa, Francisco Javier; Rumbao, Inmaculada Clavero; Berrocal, Juan Alberto Barrera; Porras, Alfonso García-Ferrer

    2014-11-26

    In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart.

  20. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses made with the Hanford site-wide groundwater model. The basic approach of the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest.
5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations, in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework is the identification, enumeration, and documentation of all the assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
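
    The Monte Carlo propagation step (c) and the CCDF summary can be sketched as follows. The toy transport model and the parameter distributions are illustrative assumptions only, not the Hanford site-wide model:

    ```python
    import math
    import random

    def ccdf(samples, thresholds):
        """Empirical complementary CDF: P(prediction > threshold)."""
        n = len(samples)
        return [sum(1 for s in samples if s > t) / n for t in thresholds]

    rng = random.Random(0)
    preds = []
    for _ in range(5000):
        # Sample the uncertain parameters from their assumed pdfs ...
        conductivity = math.exp(rng.gauss(0.0, 0.5))   # lognormal conductivity
        source = rng.uniform(0.8, 1.2)                 # uncertain source term
        # ... and run the (toy) model to get one prediction of interest.
        preds.append(source / (conductivity * 10.0))

    thresholds = [0.05, 0.10, 0.20]
    exceedance = ccdf(preds, thresholds)
    print(exceedance)   # exceedance probabilities, decreasing with threshold
    ```

    Repeating this loop per plausible ACM and scenario, then enumerating the combinations, yields the family of CCDFs the abstract describes.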

  1. Neural network model for thermal inactivation of Salmonella Typhimurium to elimination in ground chicken: Acquisition of data by whole sample enrichment, miniature most-probable-number method

    USDA-ARS?s Scientific Manuscript database

    Predictive models are valuable tools for assessing food safety. Existing thermal inactivation models for Salmonella in ground chicken do not provide predictions above 71 degrees C, which is below the recommended final cooked temperature of 73.9 degrees C. They also do not predict when all Salmone...

  2. Landslide susceptibility assessment in the Uttarakhand area (India) using GIS: a comparison study of prediction capability of naïve bayes, multilayer perceptron neural networks, and functional trees methods

    NASA Astrophysics Data System (ADS)

    Pham, Binh Thai; Tien Bui, Dieu; Pourghasemi, Hamid Reza; Indra, Prakash; Dholakia, M. B.

    2017-04-01

    The objective of this study is to compare the prediction performance of three techniques, Functional Trees (FT), Multilayer Perceptron Neural Networks (MLP Neural Nets), and Naïve Bayes (NB), for landslide susceptibility assessment in the Uttarakhand area (India). Firstly, a landslide inventory map with 430 landslide locations in the study area was constructed from various sources. Landslide locations were then randomly split into two parts: (i) 70% of landslide locations used for training the models and (ii) 30% used for the validation process. Secondly, a total of eleven landslide conditioning factors, including slope angle, slope aspect, elevation, curvature, lithology, soil, land cover, distance to roads, distance to lineaments, distance to rivers, and rainfall, were used in the analysis to elucidate the spatial relationship between these factors and landslide occurrences. Feature selection using the Linear Support Vector Machine (LSVM) algorithm was employed to assess the prediction capability of these conditioning factors for the landslide models. Subsequently, the NB, MLP Neural Nets, and FT models were constructed using the training dataset. Finally, success rate and prediction rate curves were employed to validate and compare the predictive capability of the three models. Overall, all three models performed very well for landslide susceptibility assessment. Of these, the MLP Neural Nets and FT models had almost the same predictive capability, with the MLP Neural Nets (AUC = 0.850) slightly better than the FT model (AUC = 0.849). The NB model (AUC = 0.838) had the lowest predictive capability. Landslide susceptibility maps were finally developed using the three models. These maps should be helpful to planners and engineers in development activities and land-use planning.
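
    The AUC values used above to rank the models can be computed directly from model scores with the rank-based (Mann-Whitney) formulation. The scores and labels below are hypothetical:

    ```python
    def auc(scores, labels):
        """Area under the ROC curve via pairwise comparisons (Mann-Whitney U).
        labels: 1 = observed landslide location, 0 = non-landslide location."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    scores = [0.9, 0.6, 0.7, 0.4, 0.1]   # hypothetical susceptibility scores
    labels = [1,   1,   0,   0,   0]     # observed landslide presence
    print(auc(scores, labels))           # fraction of correctly ordered pairs
    ```

    An AUC of 1.0 means every landslide cell outscores every non-landslide cell; differences as small as those reported (0.850 vs 0.849) reflect nearly identical pairwise ranking ability.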

  3. Rescuing the Clinical Breast Examination: Advances in Classifying Technique and Assessing Physician Competency.

    PubMed

    Laufer, Shlomi; D'Angelo, Anne-Lise D; Kwan, Calvin; Ray, Rebbeca D; Yudkowsky, Rachel; Boulet, John R; McGaghie, William C; Pugh, Carla M

    2017-12-01

    Develop new performance evaluation standards for the clinical breast examination (CBE). There are several technical aspects of a proper CBE. Our recent work discovered a significant, linear relationship between palpation force and CBE accuracy. This article investigates the relationship between other technical aspects of the CBE and accuracy. This performance assessment study involved data collection from physicians (n = 553) attending 3 different clinical meetings between 2013 and 2014: American Society of Breast Surgeons, American Academy of Family Physicians, and American College of Obstetricians and Gynecologists. Four previously validated, sensor-enabled breast models were used for clinical skills assessment. Models A and B had solitary, superficial, 2 cm and 1 cm soft masses, respectively. Models C and D had solitary, deep, 2 cm hard and moderately firm masses, respectively. Finger movements (search technique) from 1137 CBE video recordings were independently classified by 2 observers. Final classifications were compared with CBE accuracy. Accuracy rates were model A = 99.6%, model B = 89.7%, model C = 75%, and model D = 60%. Final classification categories for search technique included rubbing movement, vertical movement, piano fingers, and other. Interrater reliability was k = 0.79. Rubbing movement was nearly 4 times more likely to yield an accurate assessment (odds ratio 3.81, P < 0.001) compared with vertical movement and piano fingers. Piano fingers had the highest failure rate (36.5%). Regression analysis of search pattern, search technique, palpation force, examination time, and 6 demographic variables revealed that search technique independently and significantly affected CBE accuracy (P < 0.001). Our results support measurement and classification of CBE techniques and provide the foundation for a new paradigm in teaching and assessing hands-on clinical skills. 
The newly described piano fingers palpation technique was noted to have unusually high failure rates. Medical educators should be aware of the potential differences in effectiveness for various CBE techniques.

  4. Modeling Tool to Quantify Metal Sources in Stormwater Discharges at Naval Facilities (NESDI Project 455)

    DTIC Science & Technology

    2014-06-01

    TECHNICAL REPORT 2077, June 2014. Modeling Tool to Quantify Metal Sources in Stormwater Discharges at Naval Facilities (NESDI Project 455): Final Report and Guidance. C. Katz, K. Sorensen, E. Arias (SSC Pacific); R. Pitt, L. Talebi. A demonstration/validation project to assess the use of the urban stormwater model Windows Source Loading and Management Model (WinSLAMM) to characterize

  5. 76 FR 54525 - Notice of Availability of a Final Environmental Assessment (Final EA) and a Finding of No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    Notice of Availability of a Final Environmental Assessment (Final EA) and a Finding of No Significant Impact (FONSI)/Record of Decision (ROD), based on the Final EA, for a Proposed Airport...

  6. What Influences Young Canadians to Pursue Post-Secondary Studies? Final Report

    ERIC Educational Resources Information Center

    Dubois, Julie

    2002-01-01

    This paper uses the theory of human capital to model post-secondary education enrolment decisions. The model is based on the assumption that high school graduates assess the costs and benefits associated with various levels of post-secondary education (college or university) and select the option that maximizes the expected net present value.…

  7. Protein structure modeling for CASP10 by multiple layers of global optimization.

    PubMed

    Joo, Keehyoung; Lee, Juyong; Sim, Sangjin; Lee, Sun Young; Lee, Kiho; Heo, Seungryong; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2014-02-01

    In the template-based modeling (TBM) category of the CASP10 experiment, we introduced a new protocol, called the protein modeling system (PMS), to generate accurate protein structures in terms of side-chains as well as backbone trace. In the new protocol, a global optimization algorithm, called conformational space annealing (CSA), is applied to the three layers of the TBM procedure: multiple sequence-structure alignment, 3D chain building, and side-chain re-modeling. For 3D chain building, we developed a new energy function that includes new distance restraint terms of Lorentzian type (derived from multiple templates) and new terms that combine physical energy terms such as the dynamic fragment assembly (DFA) energy, the DFIRE statistical potential energy, a hydrogen bonding term, etc. These physical energy terms are expected to guide the structure modeling, especially for loop regions where no template structures are available. In addition, we developed a new quality assessment method based on the random forest machine learning algorithm to screen templates, multiple alignments, and final models. For the TBM targets of CASP10, we find that, due to the combination of the three stages of CSA global optimization and quality assessment, the modeling accuracy of PMS improves at each additional stage of the protocol. It is especially noteworthy that the side-chains of the final PMS models are far more accurate than those of the models at the intermediate steps.

  8. Underwater noise modelling for environmental impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.
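
    As a rough illustration of the kind of calculation such noise models perform: received level is source level minus propagation loss. A real EIA model accounts for bathymetry, sediment properties, and frequency-dependent absorption; the logarithmic spreading law below is a textbook simplification, and all numeric values are illustrative:

```python
import math

def received_level(source_level_db, range_m, spreading_coeff=20.0,
                   absorption_db_per_km=0.0):
    """Received level (dB) = source level - propagation loss.
    spreading_coeff: 20 for spherical spreading, 10 for cylindrical."""
    loss = (spreading_coeff * math.log10(range_m)
            + absorption_db_per_km * range_m / 1000.0)
    return source_level_db - loss

def impact_radius(source_level_db, threshold_db, step_m=10.0, max_m=1e5):
    """Largest range (searched in step_m increments) at which the received
    level still meets or exceeds an impact threshold."""
    r = step_m
    while r < max_m and received_level(source_level_db, r) >= threshold_db:
        r += step_m
    return r - step_m

# A 200 dB source under spherical spreading drops to 140 dB at 1 km.
print(received_level(200.0, 1000.0))       # 140.0
print(impact_radius(200.0, 140.0))         # 1000.0
```

    The uncertainty discussed in the abstract enters through every term here: the source level, the spreading coefficient, and the absorption all carry error that propagates into the predicted impact radius.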

  9. Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.

    PubMed

    Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G

    2014-11-01

    Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision-making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of the RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network for Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted based on feedback on its content and feasibility. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and the strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited, and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised, transparent RE information on pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.

  10. HRST architecture modeling and assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comstock, D.A.

    1997-01-01

    This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented. © 1997 American Institute of Physics.

  11. Evaluation of the durability of 3D printed keys produced by computational processing of image data

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy; Kerlin, Scott

    2016-05-01

    Possession of a working 3D printed key can, for most practical purposes, convince observers that an illicit attempt to gain premises access is authorized. This paper assesses three things. First, it presents work performed to determine how easily the data for making models of keys can be obtained through manual measurement. It then presents work done to create a model of a key and to determine how easy key modeling could become (particularly after a first key for a given key 'blank' has been made). Finally, it assesses the durability of keys produced using 3D printing.

  12. Assessment of Ocean Wave Model used to Analyze the Constellation Program (CxP) Orion Project Crew Module Water Landing Conditions

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Bouchard, Richard; Teng, Chung-Chu; Dyson, Rodger; Jenson, Robert; OReilly, William; Rogers, Erick; Wang, David; Volovoi, Vitali

    2009-01-01

    Mr. Christopher Johnson, NASA's Systems Manager for the Orion Project Crew Module (CM) Landing and Recovery at the Johnson Space Center (JSC), and Mr. James Corliss, Project Engineer for the Orion CM Landing System Advanced Development Project at the Langley Research Center (LaRC) requested an independent assessment of the wave model that was developed to analyze the CM water landing conditions. A NASA Engineering and Safety Center (NESC) initial evaluation was approved November 20, 2008. Mr. Bryan Smith, NESC Chief Engineer at the NASA Glenn Research Center (GRC), was selected to lead this assessment. The Assessment Plan was presented and approved by the NESC Review Board (NRB) on December 18, 2008. The Assessment Report was presented to the NRB on March 12, 2009. This document is the final Assessment Report.

  13. Regional scale landslide risk assessment with a dynamic physical model - development, application and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh

    2013-04-01

    Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At present, few deterministic models exist that can perform all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of such models to compute the displacement for a large number of individual initiation areas (which is computationally expensive). This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply the model to other sites, since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, intense rainfall, and a combination of both (spatial and temporal). The run-out module of the model treats the flow as a 2-D continuum medium, solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment, is computationally efficient, and is transparent (understandable and comprehensible) for the end user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity. The final outcome is an estimate of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No. 265138, New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
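
    The Monte Carlo step described here can be sketched in miniature. The infinite-slope factor of safety used below is a standard textbook failure criterion standing in for the paper's actual initiation module, and the soil-property distributions are invented for illustration:

```python
import math
import random

def factor_of_safety(cohesion_kpa, phi_deg, slope_deg,
                     unit_weight=19.0, depth_m=2.0):
    """Infinite-slope (dry) factor of safety: resisting over driving stress."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    driving = unit_weight * depth_m * math.sin(beta) * math.cos(beta)
    resisting = (cohesion_kpa
                 + unit_weight * depth_m * math.cos(beta) ** 2 * math.tan(phi))
    return resisting / driving

def failure_probability(slope_deg, n=5000, seed=1):
    """Sample uncertain soil properties; return the fraction of stochastic
    realisations with FS < 1 (i.e., the failure criterion is met)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(5.0, 2.0))   # cohesion, kPa (invented distribution)
        phi = rng.gauss(30.0, 5.0)          # friction angle, deg (invented)
        if factor_of_safety(c, phi, slope_deg) < 1.0:
            failures += 1
    return failures / n

# Steeper slopes should come out more likely to fail.
print(failure_probability(20.0), failure_probability(40.0))
```

    The paper's scheme does the same thing at scale: each realisation samples the input distributions, evaluates the failure criterion per slope unit, and the ensemble of outcomes quantifies how input uncertainty propagates into the risk estimate.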

  14. Evaluation of HFIR LEU Fuel Using the COMSOL Multiphysics Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Primm, Trent; Ruggles, Arthur; Freels, James D

    2009-03-01

    A finite element computational approach to simulation of the High Flux Isotope Reactor (HFIR) core thermal-fluid behavior is developed. These models were developed to facilitate design of a low enriched core for the HFIR, which will have different axial and radial flux profiles from the current HEU core and thus will require fuel and poison load optimization. This report outlines a stepwise implementation of this modeling approach using the commercial finite element code COMSOL, with initial assessment of fuel, poison and clad conduction modeling capability, followed by assessment of mating of the fuel conduction models to a one-dimensional fluid model typical of legacy simulation techniques for the HFIR core. The model is then extended to fully couple 2-dimensional conduction in the fuel to a 2-dimensional thermo-fluid model of the coolant for a HFIR core cooling sub-channel with additional assessment of simulation outcomes. Finally, 3-dimensional simulations of a fuel plate and cooling channel are presented.

  15. Design and Implementation of an Assessment Model for Students Entering Vocational Education Programs in the State of Colorado. Final Report.

    ERIC Educational Resources Information Center

    Hartley, Nancy K.; And Others

    A project was conducted to develop materials that could be utilized by teachers for informal assessment of the needs of students in vocational programs. Sixteen modules were developed to help teachers identify some of the areas in which students have special learning needs, clarify various levels of abilities, and distinguish the different…

  16. Pharmacodynamic Modeling of Bacillary Elimination Rates and Detection of Bacterial Lipid Bodies in Sputum to Predict and Understand Outcomes in Treatment of Pulmonary Tuberculosis

    PubMed Central

    Sloan, Derek J.; Mwandumba, Henry C.; Garton, Natalie J.; Khoo, Saye H.; Butterworth, Anthony E.; Allain, Theresa J.; Heyderman, Robert S.; Corbett, Elizabeth L.; Barer, Mike R.; Davies, Geraint R.

    2015-01-01

    Background. Antibiotic-tolerant bacterial persistence prevents treatment shortening in drug-susceptible tuberculosis, and accumulation of intracellular lipid bodies has been proposed to identify a persister phenotype of Mycobacterium tuberculosis cells. In Malawi, we modeled bacillary elimination rates (BERs) from sputum cultures and calculated the percentage of lipid body–positive acid-fast bacilli (%LB+AFB) on sputum smears. We assessed whether these putative measurements of persistence predict unfavorable outcomes (treatment failure/relapse). Methods. Adults with pulmonary tuberculosis received standard 6-month therapy. Sputum samples were collected during the first 8 weeks for serial sputum colony counting (SSCC) on agar and time-to-positivity (TTP) measurement in mycobacterial growth indicator tubes. BERs were extracted from nonlinear and linear mixed-effects models, respectively, fitted to these datasets. The %LB+AFB counts were assessed by fluorescence microscopy. Patients were followed until 1 year posttreatment. Individual BERs and %LB+AFB counts were related to final outcomes. Results. One hundred thirty-three patients (56% HIV coinfected) participated, and 15 unfavorable outcomes were reported. These were inversely associated with faster sterilization-phase bacillary elimination from the SSCC model (odds ratio [OR], 0.39; 95% confidence interval [CI], .22–.70) and a faster BER from the TTP model (OR, 0.71; 95% CI, .55–.94). Higher %LB+AFB counts on days 21–28 were recorded in patients who suffered unfavorable final outcomes compared with those who achieved stable cure (P = .008). Conclusions. Modeling BERs predicts final outcome, and high %LB+AFB counts 3–4 weeks into therapy may identify a persister bacterial phenotype. These methods deserve further evaluation as surrogate endpoints for clinical trials. PMID:25778753
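
    A bacillary elimination rate of this kind is, at its core, the slope of log-scale colony counts over time. The study fitted nonlinear and linear mixed-effects models across all patients; the single-patient ordinary least-squares fit below, with made-up numbers, is a deliberately simplified sketch of the idea:

```python
def elimination_rate(weeks, log10_cfu):
    """Ordinary least-squares slope of log10 colony counts vs. time.
    A more negative slope means faster bacillary elimination."""
    n = len(weeks)
    mx = sum(weeks) / n
    my = sum(log10_cfu) / n
    num = sum((x - mx) * (y - my) for x, y in zip(weeks, log10_cfu))
    den = sum((x - mx) ** 2 for x in weeks)
    return num / den

# Hypothetical patient: counts fall about 0.6 log10 CFU per week.
weeks = [0, 2, 4, 6, 8]
log10_cfu = [6.1, 4.8, 3.7, 2.4, 1.3]
print(round(elimination_rate(weeks, log10_cfu), 3))  # -0.6
```

    The mixed-effects machinery in the paper generalises this: each patient gets an individual slope (random effect) around a population mean, and those individual slopes are the BERs related to final outcomes.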

  17. Final ecosystem goods and services enhance societal relevance of contaminated-site remediation

    EPA Science Inventory

    Background/Question/Methods Exposure to environmental stressors can adversely affect both human health and ecological receptors and impacts on the latter influence the community's overall vulnerability. Risk assessment guidance promotes conceptual site models to integrate multip...

  18. Proposing Electronic Health Record Usability Requirements Based on Enriched ISO 9241 Metric Usability Model

    PubMed Central

    Farzandipour, Mehrdad; Riazi, Hossein; Jabali, Monireh Sadeqi

    2018-01-01

    Introduction: System usability assessment is among the important aspects of assessing the quality of clinical information technology, especially where the end users of the system are concerned. This study aims to provide a comprehensive list of system usability requirements. Methods: This research is a descriptive cross-sectional study conducted using the Delphi technique in three phases in 2013. After the experts' opinions were consolidated, the final version of the questionnaire, including 163 items across the three phases, was presented to 40 users of information systems in hospitals. The grading ranged from 0-4. Data analysis was conducted using SPSS software. Those requirements with a mean score of three or higher were finally confirmed. Results: The list of system usability requirements for electronic health records was designed and confirmed in nine areas: suitability for the task (24 items), self-descriptiveness (22 items), controllability (19 items), conformity with user expectations (25 items), error tolerance (21 items), suitability for individualization (7 items), suitability for learning (19 items), visual clarity (18 items) and auditory presentation (8 items). Conclusion: A relatively comprehensive model including useful requirements for using EHRs was presented, which can increase functionality, effectiveness and users' satisfaction. Thus, it is suggested that the present model be adopted by system designers and healthcare institutions to assess those systems. PMID:29719310

  19. Assessing Resilience in the Global Undersea Cable Infrastructure

    DTIC Science & Technology

    2012-06-01

    ABBREVIATIONS ACMA Australian Communications and Media Authority AD Attacker-Defender FSSCC Financial Services Sector Coordinating Council...after a disruption to the value delivery of the system before the disruption. Finally, their article also highlights the critical importance of...Chang et al. (2006), gravity models take their name from Newton’s law of gravitation, and are commonly used by social scientists to model or

  20. Preliminary Assessment of the Flow of Used Electronics, In ...

    EPA Pesticide Factsheets

    Electronic waste (e-waste) is the fastest-growing municipal waste stream in the United States. The improper disposal of e-waste has environmental, economic, and social impacts; thus there is a need for sustainable stewardship of electronics. EPA/ORD has been working to improve our understanding of the quantity and flow of electronic devices from initial purchase to final disposition. Understanding the pathways of used electronics from the consumer to their final disposition would provide insight to decision makers about their impacts and support efforts to encourage improvements in policy, technology, and beneficial use. This report is the first stage of EPA/ORD's efforts to understand the flows of used electronics and e-waste, reviewing the regulatory programs of the selected states and identifying the key lessons learned and best practices that have emerged since their inception. Additionally, a proof-of-concept e-waste flow model has been developed to provide estimates of the quantity of e-waste generated annually at the national level, as well as for selected states. This report documents a preliminary assessment of available data and development of the model, which can be used as a starting point to estimate domestic flows of used electronics from generation, to collection and reuse, to final disposition. The electronics waste flow model can estimate the amount of electronic products entering the end-of-life (EOL) management phase based on unit sales data.
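
    Sales-based flow models of this kind typically convolve historical unit sales with a product-lifespan distribution to estimate units reaching end-of-life each year. The sketch below uses invented sales figures and an invented lifespan distribution purely to show the mechanics (the report's actual model structure and data are not reproduced here):

```python
def eol_units(sales_by_year, lifespan_pmf, year):
    """Units entering end-of-life management in `year`, estimated as past
    sales weighted by the probability of retirement at each device age."""
    return sum(sales_by_year.get(year - age, 0) * p
               for age, p in lifespan_pmf.items())

# Invented inputs: constant sales, devices retired at 3 or 5 years of age.
sales = {y: 100 for y in range(2000, 2010)}
lifespan = {3: 0.5, 5: 0.5}

print(eol_units(sales, lifespan, 2009))  # 100.0: at steady state, EOL matches sales
print(eol_units(sales, lifespan, 2003))  # 50.0: only the 3-year cohort has retired
```

    Partitioning the EOL total into collection, reuse, and disposal streams is then a matter of applying state-level recovery rates to this estimate, which is where the reviewed state program data come in.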

  1. Cognitive predictors of a common multitasking ability: Contributions from working memory, attention control, and fluid intelligence.

    PubMed

    Redick, Thomas S; Shipstead, Zach; Meier, Matthew E; Montroy, Janelle J; Hicks, Kenny L; Unsworth, Nash; Kane, Michael J; Hambrick, D Zachary; Engle, Randall W

    2016-11-01

    Previous research has identified several cognitive abilities that are important for multitasking, but few studies have attempted to measure a general multitasking ability using a diverse set of multitasks. In the final dataset, 534 young adult subjects completed measures of working memory (WM), attention control, fluid intelligence, and multitasking. Correlations, hierarchical regression analyses, confirmatory factor analyses, structural equation models, and relative weight analyses revealed several key findings. First, although the complex tasks used to assess multitasking differed greatly in their task characteristics and demands, a coherent construct specific to multitasking ability was identified. Second, the cognitive ability predictors accounted for substantial variance in the general multitasking construct, with WM and fluid intelligence accounting for the most multitasking variance compared to attention control. Third, the magnitude of the relationships among the cognitive abilities and multitasking varied as a function of the complexity and structure of the various multitasks assessed. Finally, structural equation models based on a multifaceted model of WM indicated that attention control and capacity fully mediated the WM and multitasking relationship. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. The Development of a Conceptual Framework and Tools to Assess Undergraduates' Principled Use of Models in Cellular Biology

    PubMed Central

    Merritt, Brett; Urban-Lurain, Mark; Parker, Joyce

    2010-01-01

    Recent science education reform has been marked by a shift away from a focus on facts toward deep, rich, conceptual understanding. This requires assessment that also focuses on conceptual understanding rather than recall of facts. This study outlines our development of a new assessment framework and tool—a taxonomy— which, unlike existing frameworks and tools, is grounded firmly in a framework that considers the critical role that models play in science. It also provides instructors a resource for assessing students' ability to reason about models that are central to the organization of key scientific concepts. We describe preliminary data arising from the application of our tool to exam questions used by instructors of a large-enrollment cell and molecular biology course over a 5-yr period during which time our framework and the assessment tool were increasingly used. Students were increasingly able to describe and manipulate models of the processes and systems being studied in this course as measured by assessment items. However, their ability to apply these models in new contexts did not improve. Finally, we discuss the implications of our results and the future directions for our research. PMID:21123691

  3. How to Develop and Interpret a Credibility Assessment of Numerical Models for Human Research: NASA-STD-7009 Demystified

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Mulugeta, Lealem; Walton, Marlei; Myers, Jerry G.

    2014-01-01

    In the wake of the Columbia accident, the NASA-STD-7009 [1] credibility assessment was developed as a unifying platform to describe model credibility and the uncertainties in a model's predictions. This standard is now being adapted by NASA's Human Research Program to cover a wide range of numerical models for human research. When used properly, the standard can improve the process of code development by encouraging the use of best practices. It can also give management more insight for making informed decisions through a better understanding of a model's capabilities and limitations. To a newcomer, the abstractions presented in NASA-STD-7009 and the sheer volume of information that must be absorbed can be overwhelming. This talk is aimed at describing the credibility assessment, which is the heart of the standard, in plain terms. It will outline how to develop a credibility assessment under the standard. It will also show how to quickly interpret the graphs and tables that result from the assessment and how to drill down from the top-level view to the foundation of the assessment. Finally, it will highlight some of the resources that are available for further study.

  4. [Evaluation of preliminary grades and credits in nurse training programs].

    PubMed

    Darmann-Finck, Ingrid; Duveneck, Nicole

    2016-01-01

    In the federal state of Bremen, preliminary grades contributed 25% of the written, oral and practical final grades during the period 2009-2014. The evaluation focuses on the effects of preliminary grades on the scale of final grades and the performance of learners, as well as on the assessment of the appropriateness of final grades. A mixed-methods design was employed, consisting of a quasi-experimental study comprising surveys of students and teachers in comparative and model courses, as well as a qualitative study using group discussions. The results confirm that preliminary grades lead to a minimal improvement in the final grades of some exclusively low-achieving students. The assessment of appropriateness hardly changed. From both the learners' and teachers' point of view there is still great dissatisfaction concerning the practical final grades. With regard to learning habits, an increased willingness to learn new skills on the one hand and partly increased performance pressure on the other were demonstrated. On the basis of these results, the authors recommend the regular inclusion of preliminary grades in nursing training. Copyright © 2016. Published by Elsevier GmbH.

  5. Selecting Policy Indicators and Developing Simulation Models for the National School Lunch and Breakfast Programs. Final Report. Special Nutrition Programs Report Series. Special Nutrition Programs Report No. CN-10-PRED

    ERIC Educational Resources Information Center

    Dragoset, Lisa; Gordon, Anne

    2010-01-01

    This report describes work using nationally representative 2005 data from the School Nutrition Dietary Assessment-III (SNDA-III) study to develop a simulation model to predict the potential implications of changes in policies or practices related to school meals and school food environments. The model focuses on three domains of outcomes: (1) the…

  6. Assessing Performance Tradeoffs in Undersea Distributed Sensor Networks

    DTIC Science & Technology

    2006-09-01

    time. We refer to this process as track-before-detect (see [5] for a description), since the final determination of a target presence is not made until...expressions for probability of successful search and probability of false search for modeling the track-before-detect process. We then describe a numerical...random manner (randomly sampled from a uniform distribution). II. SENSOR NETWORK PERFORMANCE MODELS We model the process of track-before-detect by

  7. Population Pharmacokinetic Model of Doxycycline Plasma Concentrations Using Pooled Study Data

    PubMed Central

    Wojciechowski, Jessica; Mudge, Stuart; Upton, Richard N.; Foster, David J. R.

    2017-01-01

    ABSTRACT The literature presently lacks a population pharmacokinetic analysis of doxycycline. This study aimed to develop a population pharmacokinetic model of doxycycline plasma concentrations that could be used to assess the power of bioequivalence between Doryx delayed-release tablets and Doryx MPC. Doxycycline pharmacokinetic data were available from eight phase 1 clinical trials following single/multiple doses of conventional-release doxycycline capsules, Doryx delayed-release tablets, and Doryx MPC under fed and fasted conditions. A population pharmacokinetic model was developed in a stepwise manner using NONMEM, version 7.3. The final covariate model was developed according to a forward inclusion (P < 0.01) and then backward deletion (P < 0.001) procedure. The final model was a two-compartment model with two-transit absorption compartments. Structural covariates in the base model included formulation effects on relative bioavailability (F), absorption lag (ALAG), and the transit absorption rate (KTR) under the fed status. An absorption delay (lag) for the fed status (FTLAG2 = 0.203 h) was also included in the model as a structural covariate. The fed status was observed to decrease F by 10.5%, and the effect of female sex was a 14.4% increase in clearance. The manuscript presents the first population pharmacokinetic model of doxycycline plasma concentrations following oral doxycycline administration. The model was used to assess the power of bioequivalence between Doryx delayed-release tablets and Doryx MPC, and it could potentially be used to critically examine and optimize doxycycline dose regimens. PMID:28052851

  8. Population Pharmacokinetic Model of Doxycycline Plasma Concentrations Using Pooled Study Data.

    PubMed

    Hopkins, Ashley M; Wojciechowski, Jessica; Abuhelwa, Ahmad Y; Mudge, Stuart; Upton, Richard N; Foster, David J R

    2017-03-01

    The literature presently lacks a population pharmacokinetic analysis of doxycycline. This study aimed to develop a population pharmacokinetic model of doxycycline plasma concentrations that could be used to assess the power of bioequivalence between Doryx delayed-release tablets and Doryx MPC. Doxycycline pharmacokinetic data were available from eight phase 1 clinical trials following single/multiple doses of conventional-release doxycycline capsules, Doryx delayed-release tablets, and Doryx MPC under fed and fasted conditions. A population pharmacokinetic model was developed in a stepwise manner using NONMEM, version 7.3. The final covariate model was developed according to a forward inclusion (P < 0.01) and then backward deletion (P < 0.001) procedure. The final model was a two-compartment model with two-transit absorption compartments. Structural covariates in the base model included formulation effects on relative bioavailability (F), absorption lag (ALAG), and the transit absorption rate (KTR) under the fed status. An absorption delay (lag) for the fed status (FTLAG2 = 0.203 h) was also included in the model as a structural covariate. The fed status was observed to decrease F by 10.5%, and the effect of female sex was a 14.4% increase in clearance. The manuscript presents the first population pharmacokinetic model of doxycycline plasma concentrations following oral doxycycline administration. The model was used to assess the power of bioequivalence between Doryx delayed-release tablets and Doryx MPC, and it could potentially be used to critically examine and optimize doxycycline dose regimens. Copyright © 2017 American Society for Microbiology.
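
    The structural model described in these two records (two-compartment disposition fed by two transit absorption compartments) can be sketched as a small ODE system. The rate constants below are arbitrary illustrative values, not the fitted doxycycline parameters, and simple Euler integration stands in for NONMEM's solver:

```python
def simulate(dose=100.0, ktr=2.0, cl=1.0, v=10.0, q=0.5, vp=20.0,
             dt=0.001, t_end=24.0):
    """Two-compartment disposition with two transit absorption compartments,
    integrated by forward Euler. Returns central-compartment concentrations."""
    a1, a2, ac, ap = dose, 0.0, 0.0, 0.0  # transit 1, transit 2, central, peripheral
    conc = []
    for _ in range(int(t_end / dt)):
        da1 = -ktr * a1                                   # drain transit 1
        da2 = ktr * a1 - ktr * a2                         # transit 1 -> transit 2
        dac = ktr * a2 - (cl / v) * ac \
              - (q / v) * ac + (q / vp) * ap              # absorption, elimination, exchange
        dap = (q / v) * ac - (q / vp) * ap                # central <-> peripheral
        a1 += da1 * dt; a2 += da2 * dt
        ac += dac * dt; ap += dap * dt
        conc.append(ac / v)
    return conc

conc = simulate()
cmax = max(conc)
# Concentration rises through the transit chain, peaks, then declines.
```

    The transit chain is what produces the smooth, delayed absorption profile; the fitted covariates (formulation effects on F, ALAG, KTR; the fed-status lag FTLAG2) act by shifting these parameters between subpopulations.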

  9. Stochastic, compartmental, and dynamic modeling of cross-contamination during mechanical smearing of cheeses.

    PubMed

    Aziza, Fanny; Mettler, Eric; Daudin, Jean-Jacques; Sanaa, Moez

    2006-06-01

    Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.
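
    The compartmental exchange at the heart of such a model can be sketched deterministically. The actual model is stochastic, with transfer parameters estimated from industrial smearing-machine data; the pickup and deposit fractions below are invented for illustration:

```python
def smear_batch(initial_loads, pickup=0.1, deposit=0.2):
    """Pass a brush over cheeses in order. At each cheese the brush picks up
    a fraction of the cheese's surface load, then deposits a fraction of its
    own load back, spreading contamination down the batch."""
    brush = 0.0
    final = []
    for load in initial_loads:
        transferred = pickup * load     # cheese -> brush
        load -= transferred
        brush += transferred
        deposited = deposit * brush     # brush -> cheese
        brush -= deposited
        load += deposited
        final.append(load)
    return final

# One contaminated cheese (1000 CFU) at the head of an otherwise clean batch.
loads = smear_batch([1000.0, 0.0, 0.0, 0.0])
print([round(x, 1) for x in loads])  # [920.0, 16.0, 12.8, 10.2]
```

    Downstream cheeses receive geometrically decreasing loads as the brush load depletes. The paper's stochastic version samples the transfer parameters from distributions, so Monte Carlo runs yield the distribution of the number of contaminated products per batch rather than a single deterministic answer.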

  10. Predicting use of effective vegetable parenting practices with the Model of Goal Directed Behavior.

    PubMed

    Diep, Cassandra S; Beltran, Alicia; Chen, Tzu-An; Thompson, Debbe; O'Connor, Teresia; Hughes, Sheryl; Baranowski, Janice; Baranowski, Tom

    2015-06-01

    To model effective vegetable parenting practices using the Model of Goal Directed Vegetable Parenting Practices construct scales. An Internet survey was conducted with parents of pre-school children to assess their agreement with effective vegetable parenting practices and Model of Goal Directed Vegetable Parenting Practices items. Block regression modelling was conducted using the composite score of effective vegetable parenting practices scales as the outcome variable and the Model of Goal Directed Vegetable Parenting Practices constructs as predictors in separate and sequential blocks: demographics, intention, desire (intrinsic motivation), perceived barriers, autonomy, relatedness, self-efficacy, habit, anticipated emotions, perceived behavioural control, attitudes and lastly norms. Backward deletion was employed at the end for any variable not significant at P < 0.05. Houston, TX, USA. Three hundred and seven parents (mostly mothers) of pre-school children. Significant predictors in the final model in order of relationship strength included habit of active child involvement in vegetable selection, habit of positive vegetable communications, respondent not liking vegetables, habit of keeping a positive vegetable environment and perceived behavioural control of having a positive influence on child's vegetable consumption. The final model's adjusted R² was 0.486. This was the first study to test scales from a behavioural model to predict effective vegetable parenting practices. Further research needs to assess these Model of Goal Directed Vegetable Parenting Practices scales for their (i) predictiveness of child consumption of vegetables in longitudinal samples and (ii) utility in guiding design of vegetable parenting practices interventions.

  11. Assessment of groundwater vulnerability by applying the modified DRASTIC model in Beihai City, China.

    PubMed

    Wu, Xiaoyu; Li, Bin; Ma, Chuanming

    2018-05-01

    This study assesses the vulnerability of groundwater to pollution in Beihai City, China, in support of groundwater resource protection. The assessment result not only objectively reflects the potential for groundwater contamination but also provides a scientific basis for the planning and utilization of groundwater resources. The study extends the DRASTIC model with parameters covering both natural and human factors and modifies the ratings of these parameters based on the local environmental conditions of the study area. A weight for each parameter is assigned by the analytic hierarchy process (AHP) to reduce the subjectivity of the vulnerability assessment. The resulting ratings and weights of the modified DRASTIC model (the AHP-DRASTLE model) yield a more realistic assessment of groundwater vulnerability to contamination. A comparison analysis validates the accuracy and rationality of the AHP-DRASTLE model and shows that it suits the particular conditions of the study area, suggesting the method can guide other researchers in assessing groundwater vulnerability to contamination. The final vulnerability map for the AHP-DRASTLE model shows four classes: highest (2%), high (29%), low (55%), and lowest (14%). The map serves as a guide for decision makers on groundwater resource protection and land-use planning at the regional scale.
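The core of a DRASTIC-style index is a weighted sum of parameter ratings, with AHP supplying the weights. A minimal sketch of that computation; the ratings, weights, and class cut-offs below are hypothetical illustrations, not values from the study:

```python
# DRASTIC-style vulnerability index: a weighted sum of parameter ratings.
# Parameter keys follow the DRASTIC acronym; all numbers are illustrative.

def vulnerability_index(ratings, weights):
    """Sum of rating * weight over all parameters."""
    return sum(ratings[p] * weights[p] for p in ratings)

def classify(index, thresholds=(100, 130, 160)):
    """Map an index value to one of four classes (cut-offs are hypothetical)."""
    low, mid, high = thresholds
    if index < low:
        return "lowest"
    if index < mid:
        return "low"
    if index < high:
        return "high"
    return "highest"

# Hypothetical ratings (1-10) for one grid cell and AHP-derived weights (1-5).
ratings = {"D": 7, "R": 6, "A": 8, "S": 5, "T": 9, "I": 4, "C": 6}
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

idx = vulnerability_index(ratings, weights)
print(idx, classify(idx))
```

Mapping this over every grid cell and binning the indices is what produces the four-class vulnerability map described in the abstract.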

  12. Assessing Chronic Illness Care Education (ACIC-E): a tool for tracking educational re-design for improving chronic care education.

    PubMed

    Bowen, Judith L; Provost, Lloyd; Stevens, David P; Johnson, Julie K; Woods, Donna M; Sixta, Connie S; Wagner, Edward H

    2010-09-01

    Recent Breakthrough Series Collaboratives have focused on improving chronic illness care, but few have included academic practices, and none have specifically targeted residency education in parallel with improving clinical care. Tools are available for assessing progress with clinical improvements, but no similar instruments have been developed for monitoring educational improvements for chronic care education. To design a survey to assist teaching practices with identifying curricular gaps in chronic care education and monitor efforts to address those gaps. During a national academic chronic care collaborative, we used an iterative method to develop and pilot test a survey instrument modeled after the Assessing Chronic Illness Care (ACIC). We implemented this instrument, the ACIC-Education, in a second collaborative and assessed the relationship of survey results with reported educational measures. A combined 57 self-selected teams from 37 teaching hospitals enrolled in one of two collaboratives. We used descriptive statistics to report mean ACIC-E scores and educational measurement results, and Pearson's test for correlation between the final ACIC-E score and reported educational measures. A total of 29 teams from the national collaborative and 15 teams from the second collaborative in California completed the final ACIC-E. The instrument measured progress on all sub-scales of the Chronic Care Model. Fourteen California teams (70%) reported using two to six education measures (mean 4.3). The relationship between the final survey results and the number of educational measures reported was weak (R(2) = 0.06, p = 0.376), but improved when a single outlier was removed (R(2) = 0.37, p = 0.022). The ACIC-E instrument proved feasible to complete. Participating teams, on average, recorded modest improvement in all areas measured by the instrument over the duration of the collaboratives. 
The relationship between the final ACIC-E score and the number of educational measures was weak. Further research on its utility and validity is required.
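The outlier effect reported above (R2 rising from 0.06 to 0.37 after removing a single team) is easy to reproduce: one discordant point can mask an otherwise clear correlation. A sketch with invented data; the numbers are not from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: number of educational measures (x) vs. final score (y).
x = [2, 3, 4, 5, 6, 2, 3, 4, 5, 6]
y = [5.0, 5.5, 6.0, 6.5, 7.0, 5.2, 5.6, 6.1, 6.4, 7.1]

# One outlier: a team reporting many measures but a very low final score.
x_out, y_out = x + [6], y + [1.0]

r2_with = pearson_r(x_out, y_out) ** 2
r2_without = pearson_r(x, y) ** 2
print(round(r2_with, 3), round(r2_without, 3))
```

With the outlier included the R2 is near zero; removing it recovers a strong linear relationship, mirroring the pattern the authors report.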

  13. High cycle fatigue crack modeling and analysis for deck truss flooring connection details : final report.

    DOT National Transportation Integrated Search

    1997-07-01

    The Oregon Department of Transportation is responsible for many steel deck truss bridges containing connection details that are fatigue prone. A typical bridge, the Winchester Bridge in Roseburg, Oregon, was analyzed to assess the loading conditions,...

  14. Surrogate Safety Assessment Model and Validation : Final Report

    DOT National Transportation Integrated Search

    2008-06-01

    Safety of traffic facilities is most often measured by counting the number (and severity) of crashes that occur. It is not possible to apply such a measurement technique to traffic facility designs that have not yet been built or deployed in the real...

  15. Assessment of the Crashworthiness of Existing Urban Rail Vehicles. Volume 3. Train-Collision Model Users Manual.

    DOT National Transportation Integrated Search

    1975-11-01

    The crashworthiness of existing urban rail vehicles (passenger cars) and the feasibility of improvements in this area were investigated. Both rail-car structural configurations and impact absorption devices were studied. This final report issued unde...

  16. Transportation Network Data Requirements for Assessing Criticality for Resiliency and Adaptation Planning

    DOT National Transportation Integrated Search

    2017-11-01

    This report is one of two NCST Research Report documents produced as part of a project to advance the technical modeling tools for resiliency and adaptation planning, especially those used for criticality rankings. The official final technical report...

  17. Using Explanatory Item Response Models to Evaluate Complex Scientific Tasks Designed for the Next Generation Science Standards

    NASA Astrophysics Data System (ADS)

    Chiu, Tina

    This dissertation includes three studies that analyze a new set of assessment tasks developed by the Learning Progressions in Middle School Science (LPS) Project. These assessment tasks were designed to measure science content knowledge on the structure of matter domain and scientific argumentation, while following the goals from the Next Generation Science Standards (NGSS). The three studies focus on the evidence available for the success of this design and its implementation, generally labelled as "validity" evidence. I use explanatory item response models (EIRMs) as the overarching framework to investigate these assessment tasks. These models can be useful when gathering validity evidence for assessments as they can help explain student learning and group differences. In the first study, I explore the dimensionality of the LPS assessment by comparing the fit of unidimensional, between-item multidimensional, and Rasch testlet models to see which is most appropriate for this data. By applying multidimensional item response models, multiple relationships can be investigated, and in turn, allow for a more substantive look into the assessment tasks. The second study focuses on person predictors through latent regression and differential item functioning (DIF) models. Latent regression models show the influence of certain person characteristics on item responses, while DIF models test whether one group is differentially affected by specific assessment items, after conditioning on latent ability. Finally, the last study applies the linear logistic test model (LLTM) to investigate whether item features can help explain differences in item difficulties.
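The models named above build on the Rasch model, in which the probability of a correct response depends on the gap between person ability and item difficulty; the LLTM additionally decomposes item difficulty into a weighted sum of item features. A minimal sketch with hypothetical features and weights (not estimates from the LPS data):

```python
import math

def rasch_p(theta, b):
    """Rasch model: P(correct) for person ability theta, item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def lltm_difficulty(features, weights):
    """LLTM: item difficulty as a weighted sum of item features."""
    return sum(f * w for f, w in zip(features, weights))

# Hypothetical item features: (uses argumentation, number of reasoning steps),
# with invented feature weights (the LLTM's eta parameters).
weights = (0.8, 0.3)
item = (1, 2)                       # an argumentation item with 2 reasoning steps
b = lltm_difficulty(item, weights)  # 0.8*1 + 0.3*2 = 1.4

print(round(b, 1), round(rasch_p(1.4, b), 2))
```

A person whose ability exactly matches the item difficulty has a 0.5 chance of success; the LLTM's contribution is explaining *why* b takes the value it does in terms of the item's design features.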

  18. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement. Part 2; Structural Analysis Technologies and Modeling Practices

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.

    2004-01-01

    A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short-term and long-term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.

  19. Accounting for dropout in xenografted tumour efficacy studies: integrated endpoint analysis, reduced bias and better use of animals.

    PubMed

    Martin, Emma C; Aarons, Leon; Yates, James W T

    2016-07-01

    Xenograft studies are commonly used to assess the efficacy of new compounds and characterise their dose-response relationship. Analysis often involves comparing the final tumour sizes across dose groups. This can cause bias, as often in xenograft studies a tumour burden limit (TBL) is imposed for ethical reasons, leading to the animals with the largest tumours being excluded from the final analysis. This means the average tumour size, particularly in the control group, is underestimated, leading to an underestimate of the treatment effect. Four methods to account for dropout due to the TBL are proposed, which use all the available data instead of only final observations: modelling, pattern mixture models, treating dropouts as censored using the M3 method and joint modelling of tumour growth and dropout. The methods were applied to both a simulated data set and a real example. All four proposed methods led to an improvement in the estimate of treatment effect in the simulated data. The joint modelling method performed most strongly, with the censoring method also providing a good estimate of the treatment effect, but with higher uncertainty. In the real data example, the dose-response estimated using the censoring and joint modelling methods was higher than the very flat curve estimated from average final measurements. Accounting for dropout using the proposed censoring or joint modelling methods allows the treatment effect to be recovered in studies where it may have been obscured due to dropout caused by the TBL.
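The bias the authors describe is straightforward to demonstrate by simulation: once animals whose tumours exceed the limit are excluded, the "final tumour size" mean is computed over slow-growing tumours only. A sketch under assumed growth parameters (the limit, growth rates, and group size are all invented):

```python
import random

random.seed(1)

TBL = 2000.0   # tumour burden limit (mm^3); value is hypothetical
DAYS = 28      # study length in days; hypothetical
N = 200        # animals in the simulated control group

def final_volume(rate):
    """Exponential growth from a 100 mm^3 implant over DAYS days."""
    v = 100.0
    for _ in range(DAYS):
        v *= 1.0 + rate
    return v

# Animal-to-animal variation in daily growth rate (mean and spread invented).
volumes = [final_volume(random.gauss(0.10, 0.03)) for _ in range(N)]

# Naive endpoint analysis: animals whose tumours exceeded the TBL were
# withdrawn, so only slow-growing tumours reach the final measurement and
# the control-group mean is underestimated.
surviving = [v for v in volumes if v <= TBL]

true_mean = sum(volumes) / len(volumes)
naive_mean = sum(surviving) / len(surviving)
print(round(true_mean), round(naive_mean))
```

The naive mean is always below the true mean whenever any animal is dropped, which is why an analysis of final sizes alone understates the control burden and hence the treatment effect.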

  20. Segmenting Bone Parts for Bone Age Assessment using Point Distribution Model and Contour Modelling

    NASA Astrophysics Data System (ADS)

    Kaur, Amandeep; Singh Mann, Kulwinder, Dr.

    2018-01-01

    Bone age assessment (BAA) is a task performed on radiographs by paediatricians in hospitals to predict final adult height and to diagnose growth disorders by monitoring skeletal development. In building an automatic bone age assessment system, the first routine step is pre-processing of the bone X-ray images so that a feature row can be constructed. In this research paper, an enhanced point distribution algorithm using contours has been implemented to segment bone parts according to the well-established bone age assessment procedure; the segmentation supports construction of the feature row and, in turn, of an automatic bone age assessment system. The implemented segmentation algorithm shows a high degree of accuracy, in terms of recall and precision, in segmenting bone parts from left-hand X-rays.

  1. New models to predict depth of infiltration in endometrial carcinoma based on transvaginal sonography.

    PubMed

    De Smet, F; De Brabanter, J; Van den Bosch, T; Pochet, N; Amant, F; Van Holsbeke, C; Moerman, P; De Moor, B; Vergote, I; Timmerman, D

    2006-06-01

    Preoperative knowledge of the depth of myometrial infiltration is important in patients with endometrial carcinoma. This study aimed at assessing the value of histopathological parameters obtained from an endometrial biopsy (Pipelle de Cornier; results available preoperatively) and ultrasound measurements obtained after transvaginal sonography with color Doppler imaging in the preoperative prediction of the depth of myometrial invasion, as determined by the final histopathological examination of the hysterectomy specimen (the gold standard). We first collected ultrasound and histopathological data from 97 consecutive women with endometrial carcinoma and divided them into two groups according to surgical stage (Stages Ia and Ib vs. Stages Ic and higher). The areas (AUC) under the receiver-operating characteristics curves of the subjective assessment of depth of invasion by an experienced gynecologist and of the individual ultrasound parameters were calculated. Subsequently, we used these variables to train a logistic regression model and least squares support vector machines (LS-SVM) with linear and RBF (radial basis function) kernels. Finally, these models were validated prospectively on data from 76 new patients in order to make a preoperative prediction of the depth of invasion. Of all ultrasound parameters, the ratio of the endometrial and uterine volumes had the largest AUC (78%), while that of the subjective assessment was 79%. The AUCs of the blood flow indices were low (range, 51-64%). Stepwise logistic regression selected the degree of differentiation, the number of fibroids, the endometrial thickness and the volume of the tumor. Compared with the AUC of the subjective assessment (72%), prospective evaluation of the mathematical models resulted in a higher AUC for the LS-SVM model with an RBF kernel (77%), but this difference was not significant. 
Single morphological parameters do not improve the predictive power when compared with the subjective assessment of depth of myometrial invasion of endometrial cancer, and blood flow indices do not contribute to the prediction of stage. In this study an LS-SVM model with an RBF kernel gave the best prediction; while this might be more reliable than subjective assessment, confirmation by larger prospective studies is required. Copyright 2006 ISUOG. Published by John Wiley & Sons, Ltd.
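The AUC figures quoted above have a direct probabilistic reading: the chance that a randomly chosen invaded case receives a higher model score than a randomly chosen non-invaded one. A pure-Python sketch of that rank interpretation, with invented scores (not data from the study):

```python
def auc(scores_pos, scores_neg):
    """Probability that a positive case outscores a negative one
    (ties count as half), i.e. the area under the ROC curve."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model outputs: predicted probability of deep invasion for
# truly deep (positive) and truly superficial (negative) tumours.
deep = [0.9, 0.8, 0.75, 0.6, 0.4]
superficial = [0.7, 0.5, 0.35, 0.3, 0.2]

print(round(auc(deep, superficial), 2))
```

An AUC of 0.77, as reported for the LS-SVM model, therefore means the model ranks a deeply invaded case above a superficial one about 77% of the time.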

  2. BASINs and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & Non-point Sources (BASINS) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT). The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments of the potential effects of climate change on streamflow and water quality. This report presents a series of short, illustrative case studies using the BASINS and WEPP climate assessment tools.

  3. Nevada Test and Training Range Depleted Uranium Target Disposal Environmental Assessment

    DTIC Science & Technology

    2005-03-01

    to establish the probability and scope of such transport. Long-Term Fate of Depleted Uranium at Aberdeen and Yuma Proving Grounds Phase II: Human...1990. Long-Term Fate of Depleted Uranium at Aberdeen and Yuma Proving Grounds Final Report, Phase 1: Geochemical Transport and Modeling. Los...of Depleted Uranium at Aberdeen and Yuma Proving Grounds , Phase II: Human Health and Ecological Risk Assessments. Los Alamos National Laboratory

  4. Love Canal Emergency Declaration Area habitability study. Volume 2. Air assessment: indicator chemicals. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Environmental studies were conducted to provide data that could be used by the Commissioner of Health for the State of New York in determining whether the Emergency Declaration Area (EDA) surrounding the Love Canal hazardous-waste site is habitable. An air assessment was conducted for Love Canal Indicator Chemicals. Homes throughout the EDA were sampled using the Trace Atmospheric Gas Analyzer Model 6000E.

  5. Multi-Level Cultural Models

    DTIC Science & Technology

    2014-11-05

    usable simulations. This procedure was to be tested using real-world data collected from open-source venues. The final system would support rapid...assess social change. Construct is an agent-based dynamic-network simulation system designed to allow the user to assess the spread of information and...protest or violence. Technical Challenges Addressed - Re-use: Most agent-based simulations (ABM) in use today are one-off. In contrast, we

  6. The influences of implementing state-mandated science assessment on teacher practice

    NASA Astrophysics Data System (ADS)

    Katzmann, Jason Matthew

    Four high school Biology teachers, two novice and two experienced, participated in a year-and-a-half case study. By utilizing a naturalistic paradigm, the four individuals were studied in their natural environment, their classrooms. Data sources included three semi-structured interviews, classroom observation field notes, and classroom artifacts. Through cross-case analysis and a constant comparative methodology, coding nodes were combined and refined, resulting in the final themes for discussion. The following research question was investigated: what is the impact of high-stakes testing on high school Biology teachers' instructional planning, instructional practices and classroom assessments? Seven final themes were realized: Assessment, CSAP, Planning, Pressure, Standards, Teaching and Time. Each theme was developed and discussed utilizing each participant's voice. Trustworthiness of this study was established via five avenues: triangulation of data sources, credibility, transferability, dependability and confirmability. A model of the influences of high-stakes testing on teacher practice was developed to describe the seven themes (Figure 5). This model serves as an illustration of the complex nature of teacher practice and the influences upon it. The four participants in this study were influenced by high-stakes assessment. It influenced their instructional decisions, assessment practices, use of time, planning decisions and decreased the amount of inquiry that occurred in the classroom. Implications of this research and future research directions are described.

  7. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System (LCCIM)--A Managerial Overview. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; Baran, H. Anthony

    This report gives a managerial overview of the Life Cycle Cost Impact Modeling System (LCCIM), which was designed to provide the Air Force with an in-house capability of assessing the life cycle cost impact of weapon system design alternatives. LCCIM consists of computer programs and the analyses which the user must perform to generate input data.…

  8. Developing statistical wildlife habitat relationships for assessing cumulative effects of fuels treatments: Final Report for Joint Fire Science Program Project

    Treesearch

    Samuel A. Cushman; Kevin S. McKelvey

    2006-01-01

    The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...

  9. The development of a survey instrument for community health improvement.

    PubMed Central

    Bazos, D A; Weeks, W B; Fisher, E S; DeBlois, H A; Hamilton, E; Young, M J

    2001-01-01

    OBJECTIVE: To develop a survey instrument that could be used both to guide and evaluate community health improvement efforts. DATA SOURCES/STUDY SETTING: A randomized telephone survey was administered to a sample of about 250 residents in two communities in Lehigh Valley, Pennsylvania in the fall of 1997. METHODS: The survey instrument was developed by health professionals representing diverse health care organizations. This group worked collaboratively over a period of two years to (1) select a conceptual model of health as a foundation for the survey; (2) review relevant literature to identify indicators that adequately measured the health constructs within the chosen model; (3) develop new indicators where important constructs lacked specific measures; and (4) pilot test the final survey to assess the reliability and validity of the instrument. PRINCIPAL FINDINGS: The Evans and Stoddart Field Model of the Determinants of Health and Well-Being was chosen as the conceptual model within which to develop the survey. The Field Model depicts nine domains important to the origins and production of health and provides a comprehensive framework from which to launch community health improvement efforts. From more than 500 potential indicators we identified 118 survey questions that reflected the multiple determinants of health as conceptualized by this model. Sources from which indicators were selected include the Behavior Risk Factor Surveillance Survey, the National Health Interview Survey, the Consumer Assessment of Health Plans Survey, and the SF-12 Summary Scales. The work group developed 27 new survey questions for constructs for which we could not locate adequate indicators. Twenty-five questions in the final instrument can be compared to nationally published norms or benchmarks. The final instrument was pilot tested in 1997 in two communities. Administration time averaged 22 minutes with a response rate of 66 percent. 
Reliability of new survey questions was adequate. Face validity was supported by previous findings from qualitative and quantitative studies. CONCLUSIONS: We developed, pilot tested, and validated a survey instrument designed to provide more comprehensive and timely data to communities for community health assessments. This instrument allows communities to identify and measure critical domains of health that have previously not been captured in a single instrument. PMID:11508639

  10. Humanistic Intentionality in Clinical Collaboration

    ERIC Educational Resources Information Center

    Gold, Joshua

    2016-01-01

    This conceptual paper introduces a clinical collaboration model for counselors founded in the principles of humanistic counseling and wellness. The article offers applications for client life self-assessment and the discovery of personally relevant social resources. Finally, implications of this approach for graduate training, clinical service and…

  11. Momentum and Energy Assessments with NASA and Other Model and Data Assimilation Systems

    NASA Technical Reports Server (NTRS)

    Salstein, David; Nelson, Peter; Hu, Wen-Jie

    2001-01-01

    Support from the NASA Global Modeling and Analysis Program has been used for the following research objectives: 1) the study of aspects of dynamics of torques and angular momentum based on the Goddard GEOS and other analyses; 2) the study of how models participating in the second Atmospheric Model Intercomparison Project (AMIP-2) have success in simulating certain large-scale quantities; 3) the study of the energetics and momentum cycle from certain runs from the Goddard Laboratory for Atmospheres and other models as well; 4) the assessment of changes in diabatic heating and related energetics in the community climate model (CCM3); 5) the analysis of modes of climate of the atmosphere, especially the Arctic and North Atlantic Oscillations. Further information on these endeavors will be provided in published works and the Final Report of the project.

  12. The development of an audit technique to assess the quality of safety barrier management.

    PubMed

    Guldenmund, Frank; Hale, Andrew; Goossens, Louis; Betten, Jeroen; Duijm, Nijs Jan

    2006-03-31

    This paper describes the development of a management model to control barriers devised to prevent major hazard scenarios. Additionally, an audit technique is explained that assesses the quality of such a management system. The final purpose of the audit technique is to quantify those aspects of the management system that have a direct impact on the reliability and effectiveness of the barriers and, hence, the probability of the scenarios involved. First, an outline of the management model is given and its elements are explained. Then, the development of the audit technique is described. Because the audit technique uses actual major hazard scenarios and barriers within these as its focus, the technique achieves a concreteness and clarity that many other techniques often lack. However, this strength is also its limitation, since the full safety management system is not covered with the technique. Finally, some preliminary experiences obtained from several test sites are compiled and discussed.

  13. KENNEDY SPACE CENTER, FLA. - The Stafford-Covey Return to Flight Task Group (SCTG) inspects debris in the Columbia Debris Hangar.

    NASA Image and Video Library

    2003-08-05

    KENNEDY SPACE CENTER, FLA. - The Stafford-Covey Return to Flight Task Group (SCTG) inspects debris in the Columbia Debris Hangar. At right is the model of the left wing that has been used during recovery operations. Chairing the task group are Richard O. Covey, former Space Shuttle commander, and Thomas P. Stafford (third from right, foreground), Apollo commander. Chartered by NASA Administrator Sean O’Keefe, the task group will perform an independent assessment of NASA’s implementation of the final recommendations by the Columbia Accident Investigation Board.

  14. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Misspecification Under Weibull Lifetimes.

    PubMed

    Pal, Suvra; Balakrishnan, Narayanaswamy

    2018-05-01

    In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of the cure rate. Finally, we analyze a well-known dataset on melanoma with the model and the inferential method developed here.
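The Box-Cox transformation that indexes this family of cure rate models interpolates between standard formulations as the parameter lambda varies (lambda = 1 and the lambda -> 0 log limit correspond to the two classical cure model forms). A sketch of the transformation itself; the cure-rate likelihood and the EM steps are omitted:

```python
import math

def box_cox(y, lam):
    """Box-Cox transformation: (y**lam - 1)/lam, with the log limit at lam = 0."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

# lam = 1 leaves the scale linear (up to a shift); lam = 0 is the log transform.
print(box_cox(4.0, 1.0), box_cox(4.0, 0.5), round(box_cox(math.e, 0.0), 3))
```

In the cure rate setting, varying lambda lets the same likelihood machinery (and the same EM algorithm) cover a continuum of models, which is what makes misspecification of lambda worth studying.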

  15. Seasonal patterns of dengue fever and associated climate factors in 4 provinces in Vietnam from 1994 to 2013.

    PubMed

    Lee, Hu Suk; Nguyen-Viet, Hung; Nam, Vu Sinh; Lee, Mihye; Won, Sungho; Duc, Phuc Pham; Grace, Delia

    2017-03-20

    In Vietnam, dengue fever (DF) is still a leading cause of hospitalization. The main objective of this study was to evaluate the seasonality of DF and its association with climate factors (temperature and precipitation) in the four provinces where the highest incidence rates were observed from 1994 to 2013 in Vietnam. Incidence rates (per 100,000) were calculated on a monthly basis during the study period. The seasonal-decomposition procedure based on loess (STL) was used to assess the trend and seasonality of DF. In addition, a seasonal cycle subseries (SCS) plot and a univariate negative binomial regression (NBR) model were used to evaluate the monthly variability. Lastly, a generalized estimating equation (GEE) was used to assess the relationship between monthly incidence rates and weather factors (temperature and precipitation). Increased incidence rates were observed in the second half of each year (from May through December), which is the rainy season in each province. In Hanoi, the final model showed that a 1 °C rise in temperature corresponded to a 13% increase in the monthly incidence rate of DF. In Khanh Hoa, the final model showed that a 1 °C increase in temperature corresponded to a 17% increase, while a 100 mm increase in precipitation corresponded to an 11% increase in the DF incidence rate. For Ho Chi Minh City, none of the variables was significant in the model. In An Giang, the final model showed that a 100 mm increase in precipitation in the preceding and same months corresponded to increases of 30% and 22%, respectively, in the DF incidence rate. Our findings provide insight into the seasonal pattern of DF and its associated climate risk factors.
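Percentage effects such as "13% per 1 °C" come from exponentiating a log-link regression coefficient, as in the negative binomial and GEE models used here. A small sketch of the conversion; the 13% figure is taken from the abstract, while the 3 °C extrapolation is purely illustrative:

```python
import math

def percent_change_per_unit(beta, units=1.0):
    """Multiplicative effect of a `units` increase in a predictor under a
    log link: exp(beta * units), expressed as a percentage change."""
    return (math.exp(beta * units) - 1.0) * 100.0

# A reported "13% increase per 1 degree C" corresponds to an incidence rate
# ratio exp(beta) = 1.13, i.e. beta = ln(1.13) on the log-rate scale.
beta_temp = math.log(1.13)

# Effect of a 1 degree and a 3 degree rise (effects multiply, not add).
print(round(percent_change_per_unit(beta_temp, 1), 1))
print(round(percent_change_per_unit(beta_temp, 3), 1))
```

Note that effects compound multiplicatively: a 3 °C rise implies 1.13³ ≈ 1.44, i.e. about a 44% increase, not 3 × 13% = 39%.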

  16. HARMONIZING HEALTH TECHNOLOGY ASSESSMENT PRACTICES IN UNIVERSITY HOSPITALS: TO WHAT EXTENT IS THE MINI-HTA MODEL SUITABLE IN THE FRENCH CONTEXT?

    PubMed

    Martelli, Nicolas; Devaux, Capucine; van den Brink, Hélène; Billaux, Mathilde; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2017-01-01

    The number of new medical devices for individual use that are launched annually exceeds the assessment capacity of the French national health technology assessment (HTA) agency. This has resulted in hospitals, and particularly university hospitals (UHs), developing hospital-based HTA initiatives to support their decisions for purchasing innovative devices. However, the methodologies used in such hospitals have no common basis. The aim of this study was to assess a mini-HTA model as a potential solution to harmonize HTA methodology in French UHs. A systematic review was conducted on Medline, Embase, Health Technology Assessment database, and Google Scholar to identify published articles reporting the use of mini-HTA tools and decision support-like models. A survey was also carried out in eighteen French UHs to identify in-house decision support tools. Finally, topics evaluated in the Danish mini-HTA model and in French UHs were compared using Jaccard similarity coefficients. Our findings showed differences between topics evaluated in French UHs and those assessed in decision support models from the literature. Only five topics among the thirteen most evaluated in French UHs were similar to those assessed in the Danish mini-HTA model. The organizational and ethical/social impacts were rarely explored among the surveyed models used in French UHs when introducing new medical devices. Before its widespread and harmonized use in French UHs, the mini-HTA model would first require adaptations to the French context.
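The topic-list comparison above uses the Jaccard similarity coefficient, |A ∩ B| / |A ∪ B|. A sketch of the computation with hypothetical topic sets (these are illustrative labels, not the actual items from the Danish model or the French UH survey):

```python
def jaccard(a, b):
    """Jaccard similarity coefficient between two sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a or b else 0.0

# Hypothetical topic lists (not the actual study items).
danish_mini_hta = {"safety", "clinical effect", "cost", "organisation",
                   "ethics", "training"}
french_uh = {"safety", "clinical effect", "cost", "budget impact",
             "maintenance"}

print(round(jaccard(danish_mini_hta, french_uh), 2))
```

A low coefficient, as found in the study, signals that the two assessment traditions share only a minority of evaluation topics and that the mini-HTA model would need adaptation before harmonized use.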

  17. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science, based on direct observation of apical toxicity outcomes in whole-organism toxicity tests, to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of the computational prediction models needed to support the next generation of chemical safety assessment.

  18. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  19. Final Design Documentation for the Wartime Personnel Assessment Model (WARPAM) (Version 1.0)

    DTIC Science & Technology

    1991-03-25

    Bldg 401B, Ft. Benjamin Harrison, IN 46216-5000. Figure 2: WARPAM operational architecture. WARPAM is programmed in FORTRAN 77, except for the CRC model, which is ... to enter directly into a specific model and utilize data currently in the system. The modular architecture of WARPAM is depicted in Figure 3

  20. Comparing GIS-based habitat models for applications in EIA and SEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gontier, Mikael, E-mail: gontier@kth.s; Moertberg, Ulla, E-mail: mortberg@kth.s; Balfors, Berit, E-mail: balfors@kth.s

    Land use changes, urbanisation and infrastructure developments in particular, cause fragmentation of natural habitats and threaten biodiversity. Tools and measures must be adapted to assess and remedy the potential effects on biodiversity caused by human activities and developments. Within physical planning, environmental impact assessment (EIA) and strategic environmental assessment (SEA) play important roles in the prediction and assessment of biodiversity-related impacts from planned developments. However, adapted prediction tools to forecast and quantify potential impacts on biodiversity components are lacking. This study tested and compared four different GIS-based habitat models and assessed their relevance for applications in environmental assessment. The models were implemented in the Stockholm region in central Sweden and applied to data on the crested tit (Parus cristatus), a sedentary bird species of coniferous forest. All four models performed well and allowed the distribution of suitable habitats for the crested tit in the Stockholm region to be predicted. The models were also used to predict and quantify habitat loss for two regional development scenarios. The study highlighted the importance of model selection in impact prediction. Criteria that are relevant for the choice of model for predicting impacts on biodiversity were identified and discussed. Finally, the importance of environmental assessment for the preservation of biodiversity within the general frame of biodiversity conservation is emphasised.

  1. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage in order to reduce the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third party damage are identified; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated by the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage of natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
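    The two-stage procedure the abstract describes, AHP weighting followed by fuzzy comprehensive evaluation, can be sketched as follows. This is a minimal illustration with invented numbers: the pairwise comparison matrix, membership degrees, and three-level remark set are placeholders, not data from the paper.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for three third-party-damage
    # risk factors (illustrative values only).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # AHP weights: principal eigenvector of A, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = eigvecs[:, k].real
    w = w / w.sum()

    # Consistency ratio CR = (lambda_max - n) / (n - 1) / RI, with RI(3) = 0.58;
    # CR < 0.1 is the usual acceptance threshold.
    n = A.shape[0]
    CR = (eigvals[k].real - n) / (n - 1) / 0.58

    # Fuzzy comprehensive evaluation: each row of R gives one factor's
    # membership degrees over the remark set (low, medium, high risk).
    R = np.array([
        [0.1, 0.3, 0.6],
        [0.3, 0.5, 0.2],
        [0.6, 0.3, 0.1],
    ])
    B = w @ R           # overall membership vector over the remark set
    level = B.argmax()  # maximum-membership principle picks the remark level
    ```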

  2. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples, as compared with the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the model's performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
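    Step 2 of the framework, assessing the model's performance in the validation sample, typically means quantifying discrimination and calibration. A generic sketch on simulated data follows; the variable names and coefficients are illustrative, not the deep venous thrombosis model from the study.

    ```python
    import numpy as np

    # Simulated external validation sample and a "previously developed"
    # logistic prediction model (beta is a placeholder coefficient vector).
    rng = np.random.default_rng(0)
    X_val = rng.normal(size=(500, 2))
    beta = np.array([1.0, -0.5])
    lp = X_val @ beta                      # linear predictor of the model
    p_hat = 1 / (1 + np.exp(-lp))          # predicted risks
    y_val = rng.binomial(1, p_hat)         # outcomes in the validation sample

    def c_statistic(p, y):
        """Probability that a random event receives a higher prediction
        than a random non-event (equivalent to the AUC)."""
        pos, neg = p[y == 1], p[y == 0]
        diff = pos[:, None] - neg[None, :]
        return (diff > 0).mean() + 0.5 * (diff == 0).mean()

    auc = c_statistic(p_hat, y_val)        # discrimination
    citl = y_val.mean() - p_hat.mean()     # calibration-in-the-large
    ```

    In a well-calibrated setting, calibration-in-the-large is near zero; a markedly non-zero value would suggest the model needs updating for the validation setting.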

  3. A Person-Centered Approach to Financial Capacity Assessment: Preliminary Development of a New Rating Scale

    PubMed Central

    Lichtenberg, Peter A.; Stoltman, Jonathan; Ficker, Lisa J.; Iris, Madelyn; Mast, Benjamin

    2014-01-01

    Financial exploitation and financial capacity issues often overlap when a gerontologist assesses whether an older adult’s financial decision is an autonomous, capable choice. Our goal is to describe a new conceptual model for assessing financial decisions using principles of person-centered approaches and to introduce a new instrument, the Lichtenberg Financial Decision Rating Scale (LFDRS). We created a conceptual model, convened meetings of experts from various disciplines to critique the model and provide input on content and structure, and select final items. We then videotaped administration of the LFDRS to five older adults and had 10 experts provide independent ratings. The LFDRS demonstrated good to excellent inter-rater agreement. The LFDRS is a new tool that allows gerontologists to systematically gather information about a specific financial decision and the decisional abilities in question. PMID:25866438

  4. A Person-Centered Approach to Financial Capacity Assessment: Preliminary Development of a New Rating Scale.

    PubMed

    Lichtenberg, Peter A; Stoltman, Jonathan; Ficker, Lisa J; Iris, Madelyn; Mast, Benjamin

    2015-01-01

    Financial exploitation and financial capacity issues often overlap when a gerontologist assesses whether an older adult's financial decision is an autonomous, capable choice. Our goal is to describe a new conceptual model for assessing financial decisions using principles of person-centered approaches and to introduce a new instrument, the Lichtenberg Financial Decision Rating Scale (LFDRS). We created a conceptual model, convened meetings of experts from various disciplines to critique the model and provide input on content and structure, and select final items. We then videotaped administration of the LFDRS to five older adults and had 10 experts provide independent ratings. The LFDRS demonstrated good to excellent inter-rater agreement. The LFDRS is a new tool that allows gerontologists to systematically gather information about a specific financial decision and the decisional abilities in question.

  5. Rasch family models in e-learning: analyzing architectural sketching with a digital pen.

    PubMed

    Scalise, Kathleen; Cheng, Nancy Yen-Wen; Oskui, Nargas

    2009-01-01

    Since architecture students studying design drawing are usually assessed qualitatively on the basis of their final products, the challenges and stages of their learning have remained masked. To clarify the challenges in design drawing, we have been using the BEAR Assessment System and Rasch family models to measure levels of understanding for individuals and groups, in order to correct pedagogical assumptions and tune teaching materials. This chapter discusses the analysis of 81 drawings created by architectural students to solve a space layout problem, collected and analyzed with digital pen-and-paper technology. The approach allows us to map developmental performance criteria and perceive achievement overlaps in learning domains assumed separate, and then re-conceptualize a three-part framework to represent learning in architectural drawing. Results and measurement evidence from the assessment and Rasch modeling are discussed.

  6. Cognitive and affective influences on perceived risk of ovarian cancer†

    PubMed Central

    Peipins, Lucy A.; McCarty, Frances; Hawkins, Nikki A.; Rodriguez, Juan L.; Scholl, Lawrence E.; Leadbetter, Steven

    2015-01-01

    Introduction Studies suggest that both affective and cognitive processes are involved in the perception of vulnerability to cancer and that affect has an early influence in this assessment of risk. We constructed a path model based on a conceptual framework of heuristic reasoning (affect, resemblance, and availability) coupled with cognitive processes involved in developing personal models of cancer causation. Methods From an eligible cohort of 16 700 women in a managed care organization, we randomly selected 2524 women at high, elevated, and average risk of ovarian cancer and administered a questionnaire to test our model (response rate 76.3%). Path analysis delineated the relationships between personal and cognitive characteristics (number of relatives with cancer, age, ideas about cancer causation, perceived resemblance to an affected friend or relative, and ovarian cancer knowledge) and emotional constructs (closeness to an affected relative or friend, time spent processing the cancer experience, and cancer worry) on perceived risk of ovarian cancer. Results Our final model fit the data well (root mean square error of approximation (RMSEA) = 0.028, comparative fit index (CFI) = 0.99, normed fit index (NFI) = 0.98). This final model (1) demonstrated the nature and direction of relationships between cognitive characteristics and perceived risk; (2) showed that time spent processing the cancer experience was associated with cancer worry; and (3) showed that cancer worry moderately influenced perceived risk. Discussion Our results highlight the important role that family cancer experience plays in cancer worry and show how cancer experience translates into personal risk perceptions. This understanding informs the discordance between medical or objective risk assessment and personal risk assessment. PMID:24916837
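    The fit indices reported for the path model can be computed from the model and baseline chi-square statistics using the standard textbook formulas. A small sketch with illustrative values (not the study's actual chi-squares):

    ```python
    import math

    def fit_indices(chi2_m, df_m, chi2_b, df_b, n):
        """Standard SEM/path-model fit indices from the fitted model (m)
        and independence/baseline model (b) chi-square statistics."""
        # RMSEA: badness of fit per degree of freedom, scaled by sample size.
        rmsea = math.sqrt(max(chi2_m - df_m, 0) / (df_m * (n - 1)))
        # CFI: improvement in noncentrality relative to the baseline model.
        d_m = max(chi2_m - df_m, 0)
        d_b = max(chi2_b - df_b, d_m)
        cfi = 1 - d_m / d_b if d_b > 0 else 1.0
        # NFI: proportional reduction in chi-square versus the baseline.
        nfi = (chi2_b - chi2_m) / chi2_b
        return rmsea, cfi, nfi

    # Illustrative values for a well-fitting model with n = 2524 respondents.
    rmsea, cfi, nfi = fit_indices(chi2_m=12.0, df_m=8, chi2_b=400.0, df_b=15, n=2524)
    ```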

  7. Waste-to-energy: A review of life cycle assessment and its extension methods.

    PubMed

    Zhou, Zhaozhi; Tang, Yuanjun; Chi, Yong; Ni, Mingjiang; Buekens, Alfons

    2018-01-01

    This article presents a comprehensive review of evaluation tools based on life cycle thinking, as applied to waste-to-energy. Habitually, life cycle assessment is adopted to assess the environmental burdens associated with waste-to-energy initiatives. Based on this framework, several extension methods have been developed to focus on specific aspects: exergetic life cycle assessment for reducing resource depletion, life cycle costing for evaluating economic burden, and social life cycle assessment for recording social impacts. Additionally, the environment-energy-economy model integrates both the life cycle assessment and life cycle costing methods and judges these three features simultaneously for sustainable waste-to-energy conversion. Life cycle assessment of waste-to-energy is well developed, with concrete data inventories and sensitivity analyses, although data and model uncertainty are unavoidable. Compared with life cycle assessment, only a few evaluations of waste-to-energy techniques have been conducted using the extension methods, and their methodology and application need further development. Finally, this article succinctly summarises some recommendations for further research.
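    The core arithmetic behind the life cycle assessments reviewed here is the characterization step: multiplying an emission inventory by characterization factors. A minimal sketch, with placeholder inventory numbers and the commonly used AR5 GWP100 climate factors:

    ```python
    # Emission inventory for a hypothetical waste-to-energy plant, in kg of
    # pollutant per tonne of waste treated (illustrative placeholders only).
    emissions = {"CO2": 950.0, "CH4": 2.0, "N2O": 0.05}

    # Characterization factors for climate change: kg CO2-eq per kg emitted
    # (IPCC AR5 100-year global warming potentials).
    gwp100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

    # Characterized impact score: inventory x factors, summed per category.
    climate_impact = sum(m * gwp100[s] for s, m in emissions.items())
    # kg CO2-eq per tonne of waste treated
    ```

    Life cycle costing and social life cycle assessment follow the same pattern, with cost or social-indicator coefficients in place of environmental characterization factors.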

  8. Revisiting the generation and interpretation of climate models experiments for adaptation decision-making (Invited)

    NASA Astrophysics Data System (ADS)

    Ranger, N.; Millner, A.; Niehoerster, F.

    2010-12-01

    Traditionally, climate change risk assessments have followed a roughly four-stage linear ‘chain’, moving from socioeconomic projections to climate projections, to primary impacts, and finally to economic and social impact assessment. Adaptation decisions are then made on the basis of these outputs. The escalation of uncertainty through this chain is well known, resulting in an ‘explosion’ of uncertainties in the final risk and adaptation assessment. The space of plausible future risk scenarios is growing ever wider with the application of new techniques which aim to explore uncertainty ever more deeply, such as those used in the recent ‘probabilistic’ UK Climate Projections 2009 and the stochastic integrated assessment models, for example PAGE2002. This explosion of uncertainty can make decision-making problematic, particularly given that the uncertainty information communicated cannot be treated as strictly probabilistic and therefore is not an easy fit with standard decision-making under uncertainty approaches. Additional problems can arise from the fact that the uncertainty estimated for different components of the ‘chain’ is rarely directly comparable or combinable. Here, we explore the challenges and limitations of using current projections for adaptation decision-making. We report the findings of a recent report completed for the UK Adaptation Sub-Committee on approaches to deal with these challenges and make robust adaptation decisions today. To illustrate these approaches, we take a number of illustrative case studies, including a case of adaptation to hurricane risk on the US Gulf Coast. This is a particularly interesting case as it involves urgent adaptation of long-lived infrastructure but requires interpreting highly uncertain climate change science and modelling, i.e. projections of Atlantic basin hurricane activity. An approach we outline is reversing the linear chain of assessments to put the economics and decision-making first.
Such an approach forces one to focus on the information of greatest value for the specific decision. We suggest that such an approach will help to accommodate the uncertainties in the chain and facilitate robust decision-making. Initial findings of these case studies will be presented with the aim of raising open questions and promoting discussion of the methodology. Finally, we reflect on the implications for the design of climate model experiments.

  9. Analysing the accuracy of machine learning techniques to develop an integrated influent time series model: case study of a sewage treatment plant, Malaysia.

    PubMed

    Ansari, Mozafar; Othman, Faridah; Abunama, Taher; El-Shafie, Ahmed

    2018-04-01

    The function of a sewage treatment plant is to treat the sewage to acceptable standards before it is discharged into the receiving waters. To design and operate such plants, it is necessary to measure and predict the influent flow rate. In this research, the influent flow rate of a sewage treatment plant (STP) was modelled and predicted by autoregressive integrated moving average (ARIMA), nonlinear autoregressive network (NAR) and support vector machine (SVM) regression time series algorithms. To evaluate the models' accuracy, the root mean square error (RMSE) and coefficient of determination (R²) were calculated as initial assessment measures, while relative error (RE), peak flow criterion (PFC) and low flow criterion (LFC) were calculated as final evaluation measures to demonstrate the detailed accuracy of the selected models. An integrated model was developed based on the individual models' prediction ability for low, average and peak flow. An initial assessment of the results showed that the ARIMA model was the least accurate and the NAR model the most accurate. The RE results also showed that the SVM model's frequency of errors above 10% or below −10% was greater than the NAR model's. The influent was also forecasted up to 44 weeks ahead by both models. The graphical results indicate that the NAR model made better predictions than the SVM model. The final evaluation of NAR and SVM demonstrated that SVM made better predictions at peak flow, while NAR fit well for low and average inflow ranges. The integrated model developed includes the NAR model for low and average influent and the SVM model for peak inflow.
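    The initial assessment measures named in the abstract (RMSE, R², and per-sample relative error) can be sketched generically as follows; the observed and predicted inflow values are synthetic, not data from the Malaysian plant.

    ```python
    import numpy as np

    def rmse(y, y_hat):
        """Root mean square error between observed and predicted series."""
        return float(np.sqrt(np.mean((y - y_hat) ** 2)))

    def r_squared(y, y_hat):
        """Coefficient of determination: 1 - SS_res / SS_tot."""
        ss_res = np.sum((y - y_hat) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return float(1 - ss_res / ss_tot)

    y = np.array([10.0, 12.0, 14.0, 16.0, 18.0])      # observed inflow
    y_hat = np.array([10.5, 11.5, 14.0, 16.5, 17.5])  # model predictions

    # Relative error per sample (%), used for the >10% / < -10% screening.
    re = (y_hat - y) / y * 100
    ```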

  10. A review of alternative financing methods for roadway projects in small urban and rural areas of Texas : final report.

    DOT National Transportation Integrated Search

    2016-09-01

    In 2014, the Texas A&M Transportation Institute (TTI) published a report titled Public-Private Investment Models for Roadway Infrastructure. This report provided a balanced, objective assessment of the benefits and limitations of transportation publi...

  11. Development of a decision model for selection of appropriate timely delivery techniques for highway projects : final report, April 2009.

    DOT National Transportation Integrated Search

    2009-04-01

    The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice,...

  12. Taiwan industrial cooperation program technology transfer for low-level radioactive waste final disposal - phase I.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knowlton, Robert G.; Cochran, John Russell; Arnold, Bill Walter

    2007-01-01

    Sandia National Laboratories and the Institute of Nuclear Energy Research, Taiwan have collaborated in a technology transfer program related to low-level radioactive waste (LLW) disposal in Taiwan. Phase I of this program included regulatory analysis of LLW final disposal, development of LLW disposal performance assessment capabilities, and preliminary performance assessments of two potential disposal sites. Performance objectives were based on regulations in Taiwan and comparisons to those in the United States. Probabilistic performance assessment models were constructed based on limited site data using software including GoldSim, BLT-MS, FEHM, and HELP. These software codes provided the probabilistic framework, container degradation, waste-form leaching, groundwater flow, radionuclide transport, and cover infiltration simulation capabilities in the performance assessment. Preliminary performance assessment analyses were conducted for a near-surface disposal system and a mined cavern disposal system at two representative sites in Taiwan. Results of example calculations indicate peak simulated concentrations to a receptor within a few hundred years of LLW disposal, primarily from highly soluble, non-sorbing radionuclides.

  13. Southern California Edison Grid Integration Evaluation: Cooperative Research and Development Final Report, CRADA Number CRD-10-376

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry

    2015-07-09

    The objective of this project is to use field verification to improve DOE’s ability to model and understand the impacts of, as well as develop solutions for, high penetration PV deployments in electrical utility distribution systems. The Participant will work with NREL to assess the existing distribution system at SCE facilities and assess adding additional PV systems into the electric power system.

  14. Modeling an internal gear pump

    NASA Astrophysics Data System (ADS)

    Chen, Zongbin; Xu, Rongwu; He, Lin; Liao, Jian

    2018-05-01

    Considering the nature and characteristics of construction waste piles, this paper analyzed the factors affecting the stability of the slope of construction waste piles and established a system of assessment indexes for the slope failure risks of construction waste piles. Based on the basic principles and methods of fuzzy mathematics, the factor set and the remark set were established. The membership grade of continuous factor indexes was determined using the "ridge row distribution" function, while that of the discrete factor indexes was determined by the Delphi Method. For the weight of factors, the subjective weight was determined by the Analytic Hierarchy Process (AHP) and the objective weight by the entropy weight method, and a distance function was introduced to determine the combination coefficient. This paper established a fuzzy comprehensive assessment model of slope failure risks of construction waste piles, and assessed pile slopes in the two dimensions of hazard and vulnerability. The root mean square of the hazard assessment result and the vulnerability assessment result was the final assessment result. The paper then used a construction waste pile slope as an example for analysis, assessed the risks of the four stages of a landfill, verified the assessment model, and analyzed the slope's failure risks and preventive measures against sliding.
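    The entropy weight method used above for the objective weights, and its combination with the subjective AHP weights, can be sketched as follows. The index matrix, subjective weight vector, and fixed combination coefficient are placeholders; the paper itself derives the coefficient from a distance function.

    ```python
    import numpy as np

    # Normalized index matrix: rows = assessment objects, columns = factor
    # indexes (illustrative numbers only, all entries must be positive).
    X = np.array([
        [0.8, 0.2, 0.5],
        [0.6, 0.7, 0.4],
        [0.9, 0.4, 0.6],
    ])

    P = X / X.sum(axis=0)                          # share of each object per index
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy of each index, in [0, 1]
    w_obj = (1 - E) / (1 - E).sum()                # objective (entropy) weights

    # Combine with subjective AHP weights; alpha = 0.5 is a placeholder for
    # the distance-function-derived combination coefficient.
    w_subj = np.array([0.5, 0.3, 0.2])
    alpha = 0.5
    w = alpha * w_subj + (1 - alpha) * w_obj
    ```

    Indexes that vary little across objects carry high entropy and so receive low objective weight, which is the rationale for combining entropy weights with the subjective AHP judgment.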

  15. How adverse outcome pathways can aid the development and ...

    EPA Pesticide Factsheets

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science, based on direct observation of apical toxicity outcomes in whole organism toxicity tests, to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work

  16. Dataset of two experiments of the application of gamified peer assessment model into online learning environment MeuTutor.

    PubMed

    Tenório, Thyago; Bittencourt, Ig Ibert; Isotani, Seiji; Pedro, Alan; Ospina, Patrícia; Tenório, Daniel

    2017-06-01

    In this dataset, we present the collected data of two experiments on the application of a gamified peer assessment model in the online learning environment MeuTutor, to allow comparison of the obtained results with other proposed models. MeuTutor is an intelligent tutoring system that aims to monitor the learning of students in a personalized way, ensuring quality education and improving the performance of its members (Tenório et al., 2016) [1]. The first experiment evaluated the effectiveness of the peer assessment model through metrics such as final grade (result), time to correct the activities, and associated costs. The second experiment evaluated the influence of gamification on the peer assessment model, analyzing metrics such as number of accesses (logins), number of performed activities, and number of performed corrections. In this article, we present in table form, for each metric: the raw data of each treatment; the summarized data; the results of applying the Shapiro-Wilk normality test; and the results of applying the t-test and/or Wilcoxon statistical tests. The data presented in this article are related to the article entitled "A gamified peer assessment model for on-line learning environments in a competitive context" (Tenório et al., 2016) [1].
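    The analysis pipeline described, a Shapiro-Wilk normality check followed by a parametric or non-parametric two-sample comparison, can be sketched with scipy.stats. The grade data here are synthetic, not from the MeuTutor dataset, and the unpaired Mann-Whitney test stands in for the Wilcoxon variant used in the article.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic final grades for a gamified group and a control group.
    rng = np.random.default_rng(42)
    gamified = rng.normal(loc=75, scale=8, size=30)
    control = rng.normal(loc=70, scale=8, size=30)

    def compare(a, b, alpha=0.05):
        """Shapiro-Wilk on each group; if both look normal use the t-test,
        otherwise fall back to the rank-based Mann-Whitney test."""
        if stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha:
            return "t-test", stats.ttest_ind(a, b).pvalue
        return "mann-whitney", stats.mannwhitneyu(a, b).pvalue

    name, p = compare(gamified, control)
    ```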

  17. Increasing URM Undergraduate Student Success through Assessment-Driven Interventions: A Multiyear Study Using Freshman-Level General Biology as a Model System

    PubMed Central

    Carmichael, Mary C.; St. Clair, Candace; Edwards, Andrea M.; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale

    2016-01-01

    Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom’s taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. PMID:27543637

  18. Assessment of nonequilibrium radiation computation methods for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra

    1993-01-01

    The present understanding of shock-layer radiation in the low density regime, as appropriate to hypersonic vehicles, is surveyed. Based on the relative importance of electron excitation and radiation transport, the hypersonic flows are divided into three groups: weakly ionized, moderately ionized, and highly ionized flows. In the light of this division, the existing laboratory and flight data are scrutinized. Finally, an assessment of the nonequilibrium radiation computation methods for the three regimes in hypersonic flows is presented. The assessment is conducted by comparing experimental data against the values predicted by the physical model.

  19. Scalar versus fermionic top partner interpretations of tt̄ + E_T^miss searches at the LHC

    NASA Astrophysics Data System (ADS)

    Kraml, Sabine; Laa, Ursula; Panizzi, Luca; Prager, Hugo

    2016-11-01

    We assess how different ATLAS and CMS searches for supersymmetry in the tt̄ + E_T^miss final state at Run 1 of the LHC constrain scenarios with a fermionic top partner and a dark matter candidate. We find that the efficiencies of these searches in all-hadronic, 1-lepton and 2-lepton channels are quite similar for scalar and fermionic top partners. Therefore, in general, efficiency maps for stop-neutralino simplified models can also be applied to fermionic top-partner models, provided the narrow width approximation holds in the latter. Owing to the much higher production cross-sections of heavy top quarks as compared to stops, masses up to m_T ≈ 850 GeV can be excluded from the Run 1 stop searches. Since the simplified-model results published by ATLAS and CMS do not extend to such high masses, we provide our own efficiency maps obtained with CheckMATE and MadAnalysis 5 for these searches. Finally, we also discuss how generic gluino/squark searches in multi-jet final states constrain heavy top partner production.

  20. A comparison of self-oscillating phonation models

    NASA Astrophysics Data System (ADS)

    McPhail, Michael; Campo, Elizabeth; Walters, Gage; Krane, Michael

    2017-11-01

    This talk presents a comparison of self-oscillating models of phonation. The goal is to assess how well synthetic rubber vocal folds reproduce the gross behavior of phonation. Data from molded rubber folds and a variety of excised mammalian larynges were collected from the literature and from the authors' physical model. Gross trends are discussed and a simple scaling is presented that appears to collapse these data. Finally, comparisons between molded rubber folds and excised larynges are highlighted. The authors acknowledge support from NIH DC R01005642-11.

  1. Risk assessment of pesticides and other stressors in bees: Principles, data gaps and perspectives from the European Food Safety Authority.

    PubMed

    Rortais, Agnès; Arnold, Gérard; Dorne, Jean-Lou; More, Simon J; Sperandio, Giorgio; Streissl, Franz; Szentes, Csaba; Verdonck, Frank

    2017-06-01

    Current approaches to risk assessment in bees do not take into account co-exposures from multiple stressors. The European Food Safety Authority (EFSA) is deploying resources and efforts to move towards a holistic risk assessment approach of multiple stressors in bees. This paper describes the general principles of pesticide risk assessment in bees, including recent developments at EFSA dealing with risk assessment of single and multiple pesticide residues and biological hazards. The EFSA Guidance Document on the risk assessment of plant protection products in bees highlights the need for the inclusion of an uncertainty analysis, other routes of exposure and multiple stressors such as chemical mixtures and biological agents. The EFSA risk assessment on the survival, spread and establishment of the small hive beetle, Aethina tumida, an invasive alien species, is presented, with potential insights for other bee pests such as the Asian hornet, Vespa velutina. Furthermore, data gaps are identified at each step of the risk assessment, and recommendations are made for future research that could be supported under the framework of Horizon 2020. Recent work conducted at EFSA is also presented, under the overarching MUST-B project ("EU efforts towards the development of a holistic approach for the risk assessment on MUltiple STressors in Bees"), comprising a toolbox for harmonised data collection under field conditions and a mechanistic model to assess effects from pesticides and other stressors such as biological agents and beekeeping management practices, at the colony level and in a spatially complex landscape. Future perspectives at EFSA include the development of a data model to collate high quality data to calibrate and validate the model to be used as a regulatory tool. Finally, the evidence collected within the framework of MUST-B will support EFSA's activities on the development of a holistic approach to the risk assessment of multiple stressors in bees.
In conclusion, EFSA calls for collaborative action at the EU level to establish a common and open access database to serve multiple purposes and different stakeholders. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Assessment and application of Reynolds stress closure models to high-speed compressible flows

    NASA Technical Reports Server (NTRS)

    Gatski, T. B.; Sarkar, S.; Speziale, C. G.; Balakrishnan, L.; Abid, R.; Anderson, E. C.

    1990-01-01

    The paper presents results from the development of higher-order closure models for the phenomenological modeling of high-speed compressible flows. The work includes the introduction of an improved pressure-strain correlation model applicable in both the low- and high-speed regimes, as well as modifications to the isotropic dissipation rate to account for dilatational effects. Finally, the question of stiffness commonly associated with the solution of two-equation and Reynolds stress transport equations in wall-bounded flows is examined, and ways of relaxing these restrictions are discussed.

  3. Risk assessment model for development of advanced age-related macular degeneration.

    PubMed

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
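    The C statistic used to evaluate the model is Harrell's concordance index: among comparable patient pairs, the fraction in which the higher predicted risk corresponds to the earlier event. A minimal sketch of the computation (illustrative only, not the authors' code; the toy data are invented):

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's C: among comparable pairs, the fraction in which the
    subject with the higher predicted risk had the event earlier."""
    concordant, comparable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue
        first, second = (i, j) if times[i] < times[j] else (j, i)
        if not events[first]:
            continue  # earlier subject censored: ordering is unknown
        comparable += 1
        if risk_scores[first] > risk_scores[second]:
            concordant += 1.0
        elif risk_scores[first] == risk_scores[second]:
            concordant += 0.5  # ties count half
    return concordant / comparable

# invented toy data: higher score should mean earlier event
times = [2, 4, 6, 8, 10]        # years to event or censoring
events = [1, 1, 0, 1, 0]        # 1 = advanced AMD observed, 0 = censored
scores = [0.9, 0.7, 0.6, 0.4, 0.1]
print(concordance_index(times, events, scores))  # → 1.0 (perfectly concordant)
```

A C statistic of 0.872, as reported, means 87.2% of such comparable pairs are ordered correctly by the model.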

  4. Finalizing host range determination of a weed biological control pathogen with BLUPs and damage assessment

    USDA-ARS?s Scientific Manuscript database

    Colletotrichum gloeosporioides f. sp. salsolae (Penz.) Penz. & Sacc. in Penz. (CGS) is a facultative parasitic fungus being evaluated as a classical biological control agent of Russian thistle or tumbleweed (Salsola tragus L.). In initial host range determination tests, Henderson’s mixed model equat...

  5. Learning Groups: The Effects of Group Diversity on The Quality of Group Reflection

    ERIC Educational Resources Information Center

    Adelopo, Ismail; Asante, Joseph; Dart, Eleanor; Rufai, Ibrahim

    2017-01-01

    This study explores the quality of reflection, and how group diversity affects group reflection by final-year accounting and finance undergraduates using Mezirow's [(1991). "Transformative dimensions of adult learning." San Francisco, CA: Jossey-Bass] reflection model. Group work and reflective writing are now common assessment features…

  6. Leisure Service Career Programs Model. Final Report.

    ERIC Educational Resources Information Center

    Twining, Marilyn

    This report identifies leisure career occupations, determines the occupational outlook, and develops primary core competencies as well as specialized, optional competencies for entry level employment. The main method of inquiry is described as a needs assessment based on an audit at Moraine Valley Community College, two previous studies by the…

  7. 75 FR 17155 - Preparation of an Environmental Assessment (EA) for Proposed Outer Continental Shelf (OCS) Oil...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-05

    ... all requirements of NEPA, the Coastal Zone Management Act, Outer Continental Shelf Lands Act, and... consistent with each affected state's federally approved Coastal Zone Management program. Finally, the MMS...-circulation modeling, ecological effects of oil and gas activities, and hurricane impacts on coastal...

  8. Development of PCK for Novice and Experienced University Physics Instructors: A Case Study

    ERIC Educational Resources Information Center

    Jang, Syh-Jong; Tsai, Meng-Fang; Chen, Ho-Yuan

    2013-01-01

    The current study assessed and compared university students' perceptions of a novice and an experienced physics instructor's Pedagogical Content Knowledge (PCK). Two college physics instructors and 116 students voluntarily participated in this study. The research model comprised three workshops, mid-term and final evaluations and instructor…

  9. Assessing Anticalcification Treatments in Bioprosthetic Tissue by Using the New Zealand Rabbit Intramuscular Model

    PubMed Central

    Wright, Gregory A; Faught, Joelle M; Olin, Jane M

    2009-01-01

    The objective of this work was to demonstrate that the New Zealand White (NZW) rabbit intramuscular model can be used for detecting calcification in bioprosthetic tissue and to compare the calcification in the rabbit to that of native human valves. The rabbit model was compared with the commonly used Sprague–Dawley rat subcutaneous model. Eighteen rabbits and 18 rats were used to assess calcification in bioprosthetic tissue over time (7, 14, 30, and 90 d). The explanted rabbit and rat tissue discs were measured for calcium by using atomic absorption and Raman spectroscopy. Calcium deposits on the human valve explants were assessed by using Raman spectroscopy. The results showed that the NZW rabbit model is robust for detecting calcification in a shorter duration (14 d), with fewer infection complications, more space to implant tissue groups (thereby reducing the number of animals used), and a more metabolically and mechanically dynamic environment than the rat subcutaneous model. The human explanted valves and rabbit explanted tissue both showed Raman peaks at 960 cm−1, which is representative of hydroxyapatite. Hydroxyapatite is the final calcium and phosphate species in the calcification of bioprosthetic heart valves and rabbit intramuscular implants. The NZW rabbit intramuscular model is an effective model for assessing calcification in bioprosthetic tissue. PMID:19619417

  10. Initial Northwest Power Act Power Sales Contracts : Final Environmental Impact Statement. Volume 2, Appendices A--L.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    United States. Bonneville Power Administration.

    1992-01-01

    This report consists of appendices A-L of the final environmental impact statement for the Bonneville Power Administration. The appendices provide information on the following: Ninth Circuit Court opinion in Forelaws on Board v. Johnson; guide to Northwest Power Act contracts; guide to hydro operations; glossary; affected environment supporting documentation; environmental impacts of generic resource types; information on models used; technical information on analysis; public involvement activities; bibliography; Pacific Northwest Electric Power Planning and Conservation Act; and biological assessment. (CBS)

  11. Integrating local pastoral knowledge, participatory mapping, and species distribution modeling for risk assessment of invasive rubber vine (Cryptostegia grandiflora) in Ethiopia’s Afar region

    USGS Publications Warehouse

    Luizza, Matthew; Wakie, Tewodros; Evangelista, Paul; Jarnevich, Catherine S.

    2016-01-01

    The threats posed by invasive plants span ecosystems and economies worldwide. Local knowledge of biological invasions has proven beneficial for invasive species research, but to date no work has integrated this knowledge with species distribution modeling for invasion risk assessments. In this study, we integrated pastoral knowledge with Maxent modeling to assess the suitable habitat and potential impacts of invasive Cryptostegia grandiflora Roxb. ex R.Br. (rubber vine) in Ethiopia’s Afar region. We conducted focus groups with seven villages across the Amibara and Awash-Fentale districts. Pastoral knowledge revealed the growing threat of rubber vine, which to date has received limited attention in Ethiopia, and whose presence in Afar was previously unknown to our team. Rubber vine occurrence points were collected in the field with pastoralists and processed in Maxent with MODIS-derived vegetation indices, topographic data, and anthropogenic variables. We tested model fit using a jackknife procedure and validated the final model with an independent occurrence data set collected through participatory mapping activities with pastoralists. A Multivariate Environmental Similarity Surface analysis revealed areas with novel environmental conditions for future targeted surveys. Model performance was evaluated using area under the receiver-operating characteristic curve (AUC) and showed good fit across the jackknife models (average AUC = 0.80) and the final model (test AUC = 0.96). Our results reveal the growing threat rubber vine poses to Afar, with suitable habitat extending downstream of its current known location in the middle Awash River basin. Local pastoral knowledge provided important context for its rapid expansion due to acute changes in seasonality and habitat alteration, in addition to threats posed to numerous endemic tree species that provide critical provisioning ecosystem services. This work demonstrates the utility of integrating local ecological knowledge with species distribution modeling for early detection and targeted surveying of recently established invasive species.

  12. Influence of non-homogeneous mixing on final epidemic size in a meta-population model.

    PubMed

    Cui, Jingan; Zhang, Yanan; Feng, Zhilan

    2018-06-18

    In meta-population models for infectious diseases, the basic reproduction number [Formula: see text] can be as much as 70% larger under preferential mixing than under homogeneous mixing [J.W. Glasser, Z. Feng, S.B. Omer, P.J. Smith, and L.E. Rodewald, The effect of heterogeneity in uptake of the measles, mumps, and rubella vaccine on the potential for outbreaks of measles: A modelling study, Lancet ID 16 (2016), pp. 599-605. doi: 10.1016/S1473-3099(16)00004-9]. This suggests that realistic mixing can be an important factor to consider in order for the models to provide a reliable assessment of intervention strategies. The influence of mixing is more significant when the population is highly heterogeneous. In this paper, another quantity, the final epidemic size ([Formula: see text]) of an outbreak, is considered to examine the influence of mixing and population heterogeneity. A final size relation is derived for a meta-population model accounting for general mixing. The results show that [Formula: see text] can be influenced by the pattern of mixing in a significant way. Another interesting finding is that heterogeneity in various sub-population characteristics may have opposite effects on [Formula: see text] and [Formula: see text].
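    In the special case of a single homogeneously mixing population, the final size relation that the paper generalizes reduces to the classical implicit equation z = 1 − exp(−R₀z), where z is the final fraction infected. A sketch of solving it by fixed-point iteration (an illustration of the single-population case only, not the paper's meta-population derivation):

```python
import math

def final_size(R0, tol=1e-12, max_iter=1000):
    """Solve z = 1 - exp(-R0 * z) by fixed-point iteration; z is the
    attack rate (final fraction infected) under homogeneous mixing."""
    z = 0.5 if R0 > 1 else 0.0  # below threshold there is no major outbreak
    for _ in range(max_iter):
        z_new = 1.0 - math.exp(-R0 * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z

print(round(final_size(2.0), 4))  # → 0.7968: roughly 80% infected at R0 = 2
```

The iteration converges because the map z ↦ 1 − exp(−R₀z) is a contraction near the nontrivial root when R₀ > 1.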

  13. Prediction of Tissue Outcome and Assessment of Treatment Effect in Acute Ischemic Stroke Using Deep Learning.

    PubMed

    Nielsen, Anne; Hansen, Mikkel Bo; Tietze, Anna; Mouridsen, Kim

    2018-06-01

    Treatment options for patients with acute ischemic stroke depend on the volume of salvageable tissue. This volume assessment is currently based on fixed thresholds and single imaging modalities, limiting accuracy. We wished to develop and validate a predictive model capable of automatically identifying and combining acute imaging features to accurately predict final lesion volume. Using acute magnetic resonance imaging, we developed and trained a deep convolutional neural network (CNNdeep) to predict final imaging outcome. A total of 222 patients were included, of whom 187 were treated with rtPA (recombinant tissue-type plasminogen activator). The performance of CNNdeep was compared with a shallow CNN based on the perfusion-weighted imaging biomarker Tmax (CNNTmax), a shallow CNN based on a combination of 9 different biomarkers (CNNshallow), a generalized linear model, and thresholding of the diffusion-weighted imaging biomarker apparent diffusion coefficient (ADC) at 600×10⁻⁶ mm²/s (ADCthres). To assess whether CNNdeep is capable of differentiating outcomes with and without intravenous rtPA, patients not receiving intravenous rtPA were included to train CNNdeep,-rtPA to assess a treatment effect. The networks' performances were evaluated using visual inspection, area under the receiver operating characteristic curve (AUC), and contrast. CNNdeep yields significantly better performance in predicting final outcome (AUC=0.88±0.12) than the generalized linear model (AUC=0.78±0.12; P=0.005), CNNTmax (AUC=0.72±0.14; P<0.003), and ADCthres (AUC=0.66±0.13; P<0.0001), and a substantially better performance than CNNshallow (AUC=0.85±0.11; P=0.063). Measured by contrast, CNNdeep improves the predictions significantly, showing superiority to all other methods (P≤0.003). CNNdeep also seems able to differentiate outcomes based on treatment strategy, with the volume of final infarct being significantly different (P=0.048). The considerable improvement in prediction accuracy over the current state of the art increases the potential for automated decision support in providing recommendations for personalized treatment plans.
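    The AUC used above to compare the networks equals, by the Mann-Whitney identity, the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one. A minimal sketch with invented toy data (not the study's evaluation code):

```python
def auroc(labels, scores):
    """AUC via the Mann-Whitney identity: probability that a random
    positive case scores higher than a random negative case (ties = 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# invented toy data: 1 = positive outcome, 0 = negative outcome
labels = [1, 1, 0, 0, 1]
scores = [0.9, 0.6, 0.6, 0.2, 0.3]
print(auroc(labels, scores))  # → 0.75
```

An AUC of 0.5 corresponds to chance-level ranking; the reported 0.88 means CNNdeep ranks a random lesion case above a random non-lesion case 88% of the time.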

  14. 50 CFR 11.17 - Payment of final assessment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... PLANTS CIVIL PROCEDURES Assessment Procedure § 11.17 Payment of final assessment. When a final... request the Attorney General to institute a civil action in the U.S. District Court to collect the penalty. ...

  15. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for fewer spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
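    The permutation approach described can be sketched for a single meta-regression covariate: shuffle the covariate, refit the slope, and count how often the permuted slope is at least as extreme as the observed one. This is a generic illustration rather than the study's code; the toy data and the add-one correction are assumptions:

```python
import random

def permutation_pvalue(x, y, n_perm=10000, seed=0):
    """Two-sided permutation P value for a simple regression slope:
    shuffle the covariate, refit, and count slopes at least as extreme."""
    def slope(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sxx = sum((a - mx) ** 2 for a in xs)
        return sxy / sxx

    rng = random.Random(seed)
    observed = abs(slope(x, y))
    xs = list(x)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(xs)
        if abs(slope(xs, y)) >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)  # add-one keeps P > 0

# invented toy data: methodological quality score vs effect size
quality = [1, 2, 3, 4, 5, 6]
effect = [2.1, 1.9, 1.5, 1.2, 0.8, 0.7]
print(permutation_pvalue(quality, effect))  # small P: strong monotone association
```

With few trials per covariate, the permutation null distribution replaces the normality assumption that the Background paragraph flags as fragile.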

  16. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In an earlier work, it was shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security risk bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier developed SRFT model is modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scale. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method and to compare it with the earlier work.
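    Centre-of-gravity defuzzification of a trapezoidal fuzzy number is one common way to turn a fuzzy score back into a crisp value; a sketch of the generic formula (the paper's exact scales and defuzzification choice are not reproduced here):

```python
def trapezoid_centroid(a, b, c, d):
    """Centre-of-gravity defuzzification of a trapezoidal fuzzy number
    (a, b, c, d): membership rises on [a, b], is 1 on [b, c], falls on [c, d]."""
    if a == d:  # degenerate trapezoid: a crisp singleton
        return float(a)
    num = (c ** 2 + d ** 2 + c * d) - (a ** 2 + b ** 2 + a * b)
    den = 3.0 * ((c + d) - (a + b))
    return num / den

print(trapezoid_centroid(0, 1, 2, 3))  # → 1.5 (symmetric trapezoid: its midpoint)
```

For a symmetric trapezoid the centroid is the midpoint of the support, which provides a quick sanity check on the formula.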

  17. Desert Dust Properties, Modelling, and Monitoring

    NASA Technical Reports Server (NTRS)

    Kaskaoutis, Dimitris G.; Kahn, Ralph A.; Gupta, Pawan; Jayaraman, Achuthan; Bartzokas, Aristides

    2013-01-01

    This paper is just the three-page introduction to a Special Issue of Advances in Meteorology focusing on desert dust. It provides a paragraph each on 13 accepted papers, most relating to the use of satellite data to assess attributes or distribution of airborne desert dust. As guest Associate Editors of this issue, we organized the papers into a systematic whole, beginning with large-scale transport and seasonal behavior, then to regional dust transport, transport history, and climate impacts, first in the Mediterranean region, then India and central Asia, and finally focusing on transport model assessment and the use of lidar as a technique to constrain dust spatial-temporal distribution.

  18. [Work motivation -- assessment instruments and their relevance for medical care].

    PubMed

    Fiedler, Rolf G; Ranft, Andreas; Greitemann, Bernhard; Heuft, Gereon

    2005-11-01

    The relevance of work motivation for medical research and healthcare, in particular rehabilitation, is described. Four diagnostic instruments in the German language are introduced that can assess work motivation using a scale system: AVEM, JDS, LMI and FBTM. Their possible application and potential usage for the clinical area are discussed. Apart from the FBTM, none of these instruments can be directly used as a general instrument in a normal medical clinical setting. Finally, a current model of work motivation (the compensatory model of work motivation and volition) is presented that contains basic concepts judged to be important for future research questions concerning the development of motivation diagnostic instruments.

  19. A multicriteria decision making approach based on fuzzy theory and credibility mechanism for logistics center location selection.

    PubMed

    Wang, Bowen; Xiong, Haitao; Jiang, Chengrui

    2014-01-01

    As a hot topic in supply chain management, fuzzy methods have been widely used in logistics center location selection to improve the reliability and suitability of the selection with respect to the impacts of both qualitative and quantitative factors. However, existing approaches do not consider the consistency and historical assessment accuracy of experts in previous decisions. This paper therefore proposes a multicriteria decision making model based on the credibility of decision makers, introducing a priority mechanism for consistency and historical assessment accuracy into the fuzzy multicriteria decision making approach. In this way, only decision makers who pass the credibility check are qualified to perform the further assessment. Finally, a practical example is analyzed to illustrate how to use the model. The result shows that the fuzzy multicriteria decision making model based on the credibility mechanism can improve the reliability and suitability of site selection for the logistics center.

  20. A Multicriteria Decision Making Approach Based on Fuzzy Theory and Credibility Mechanism for Logistics Center Location Selection

    PubMed Central

    Wang, Bowen; Jiang, Chengrui

    2014-01-01

    As a hot topic in supply chain management, fuzzy methods have been widely used in logistics center location selection to improve the reliability and suitability of the selection with respect to the impacts of both qualitative and quantitative factors. However, existing approaches do not consider the consistency and historical assessment accuracy of experts in previous decisions. This paper therefore proposes a multicriteria decision making model based on the credibility of decision makers, introducing a priority mechanism for consistency and historical assessment accuracy into the fuzzy multicriteria decision making approach. In this way, only decision makers who pass the credibility check are qualified to perform the further assessment. Finally, a practical example is analyzed to illustrate how to use the model. The result shows that the fuzzy multicriteria decision making model based on the credibility mechanism can improve the reliability and suitability of site selection for the logistics center. PMID:25215319

  1. Regression Model Term Selection for the Analysis of Strain-Gage Balance Calibration Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    The paper discusses the selection of regression model terms for the analysis of wind tunnel strain-gage balance calibration data. Different function class combinations are presented that may be used to analyze calibration data using either a non-iterative or an iterative method. The role of the intercept term in a regression model of calibration data is reviewed. In addition, useful algorithms and metrics originating from linear algebra and statistics are recommended that will help an analyst (i) to identify and avoid both linear and near-linear dependencies between regression model terms and (ii) to make sure that the selected regression model of the calibration data uses only statistically significant terms. Three different tests are suggested that may be used to objectively assess the predictive capability of the final regression model of the calibration data. These tests use both the original data points and regression model independent confirmation points. Finally, data from a simplified manual calibration of the Ames MK40 balance is used to illustrate the application of some of the metrics and tests to a realistic calibration data set.
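    One standard linear-algebra metric for detecting near-linear dependencies between regression model terms is the condition number of the regressor matrix, the ratio of its extreme singular values. This is a generic illustration of the idea, not the authors' specific algorithm; the synthetic columns are invented:

```python
import numpy as np

def condition_number(X):
    """Ratio of the largest to smallest singular value of the regressor
    matrix; large values signal near-linear dependence between terms."""
    s = np.linalg.svd(X, compute_uv=False)
    return s[0] / s[-1]

rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)
X_good = np.column_stack([np.ones(50), x1, x2])             # independent terms
X_bad = np.column_stack([np.ones(50), x1, x1 + 1e-6 * x2])  # near-duplicate term
print(condition_number(X_good))  # modest
print(condition_number(X_bad))   # very large: flags the dependency
```

A model builder would drop or re-parameterize terms that drive the condition number up before testing the remaining terms for statistical significance.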

  2. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    PubMed

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health-care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain Theoretical Domains Framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the Chi-Square goodness of fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.

  3. Assessment of IT solutions used in the Hungarian income tax microsimulation system

    NASA Astrophysics Data System (ADS)

    Molnar, I.; Hardhienata, S.

    2017-01-01

    This paper focuses on the use of information technology (IT) in diverse microsimulation studies and presents state-of-the-art solutions in the traditional application field of personal income tax simulation. The aim of the paper is to promote solutions that can improve the efficiency and quality of microsimulation model implementation, assess their applicability, and help shift attention from model implementation and data analysis towards experiment design and model use. First, the authors briefly discuss the relevant characteristics of the microsimulation application field and the managerial decision-making problem. After examining the salient problems, advanced IT solutions, such as meta-databases and service-oriented architecture, are presented. The authors show how the selected technologies can be applied to support data-driven, behavior-driven and even agent-based personal income tax microsimulation model development. Finally, examples are presented and references made to the Hungarian Income Tax Simulator (HITS) models and their results. The paper concludes with a summary of the IT assessment and application-related remarks dedicated to an Indonesian Income Tax Microsimulation Model.

  4. 76 FR 41323 - Notice of Availability of the Final Environmental Assessment for the Metropolitan Branch Trail

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Environmental Assessment for the Metropolitan Branch Trail AGENCY: Federal Highway Administration, District of.... ACTION: Notice of availability of the Final Environmental Assessment for the Metropolitan Branch Trail... availability of the Final Environmental Assessment (EA) for the Metropolitan Branch Trail Project, pursuant to...

  5. Skill and independence weighting for multi-model assessments

    DOE PAGES

    Sanderson, Benjamin M.; Wehner, Michael; Knutti, Reto

    2017-06-28

    We present a weighting strategy for use with the CMIP5 multi-model archive in the fourth National Climate Assessment, which considers both skill in the climatological performance of models over North America and the inter-dependency of models arising from common parameterizations or tuning practices. The method exploits information relating to the climatological mean state of a number of projection-relevant variables as well as metrics representing long-term statistics of weather extremes. The weights, once computed, can be used to compute weighted means and significance information from an ensemble containing multiple initial-condition members from potentially co-dependent models of varying skill. Two parameters in the algorithm determine the degree to which model climatological skill and model uniqueness are rewarded; these parameters are explored and final values are defended for the assessment. The influence of model weighting on projected temperature and precipitation changes is found to be moderate, partly due to a compensating effect between model skill and uniqueness. However, more aggressive skill weighting and weighting by targeted metrics are found to have a more significant effect on inferred ensemble confidence in future patterns of change for a given projection.
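    The combination of skill and uniqueness can be sketched as a skill term down-weighted by similarity to the other ensemble members, with two radius parameters controlling how strongly each property is rewarded. This is a simplified reading of such a scheme, not the paper's exact CMIP5 implementation; the distance values and parameter names below are invented:

```python
import math

def model_weights(skill_dist, pairwise_dist, D_q, D_u):
    """Weight each model by skill / (1 + similarity to the other models).
    skill_dist[i]: distance of model i from observations (small = skilful).
    pairwise_dist[i][j]: inter-model distance (small = co-dependent).
    D_q, D_u: radii setting how strongly skill / uniqueness are rewarded."""
    n = len(skill_dist)
    raw = []
    for i in range(n):
        skill = math.exp(-((skill_dist[i] / D_q) ** 2))
        similarity = sum(math.exp(-((pairwise_dist[i][j] / D_u) ** 2))
                         for j in range(n) if j != i)
        raw.append(skill / (1.0 + similarity))
    total = sum(raw)
    return [w / total for w in raw]  # normalized weights, sum to 1

# three equally skilful models; models 0 and 1 are near-duplicates
dists = [[0.0, 0.1, 5.0],
         [0.1, 0.0, 5.0],
         [5.0, 5.0, 0.0]]
print(model_weights([1.0, 1.0, 1.0], dists, D_q=2.0, D_u=1.0))
```

The near-duplicate pair splits what would otherwise be a double share of weight, which is the compensating effect between skill and uniqueness the abstract mentions.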

  6. Contact Versus Non-Contact Measurement of a Helicopter Main Rotor Composite Blade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luczak, Marcin; Dziedziech, Kajetan; Peeters, Bart

    2010-05-28

    The dynamic characterization of lightweight structures is particularly complex as the impact of the weight of sensors and instrumentation (cables, mounting of exciters...) can distort the results. Varying mass loading or constraint effects between partial measurements may determine several errors on the final conclusions. Frequency shifts can lead to erroneous interpretations of the dynamics parameters. Typically these errors remain limited to a few percent. Inconsistent data sets however can result in major processing errors, with all related consequences towards applications based on the consistency assumption, such as global modal parameter identification, model-based damage detection and FRF-based matrix inversion in substructuring, load identification and transfer path analysis [1]. This paper addresses the subject of accuracy in the context of the measurement of the dynamic properties of a particular lightweight structure. It presents a comprehensive comparative study between the use of accelerometer, laser vibrometer (scanning LDV) and PU-probe (acoustic particle velocity and pressure) measurements to measure the structural responses, with as final aim the comparison of modal model quality assessment. The object of the investigation is a composite material blade from the main rotor of a helicopter. The presented results are part of an extensive test campaign performed with application of SIMO, MIMO, random and harmonic excitation, and the use of the mentioned contact and non-contact measurement techniques. The advantages and disadvantages of the applied instrumentation are discussed. Presented are real-life measurement problems related to the different set up conditions. Finally an analysis of estimated models is made in view of assessing the applicability of the various measurement approaches for successful fault detection based on modal parameters observation as well as in uncertain non-deterministic numerical model updating.

  7. Contact Versus Non-Contact Measurement of a Helicopter Main Rotor Composite Blade

    NASA Astrophysics Data System (ADS)

    Luczak, Marcin; Dziedziech, Kajetan; Vivolo, Marianna; Desmet, Wim; Peeters, Bart; Van der Auweraer, Herman

    2010-05-01

    The dynamic characterization of lightweight structures is particularly complex as the impact of the weight of sensors and instrumentation (cables, mounting of exciters…) can distort the results. Varying mass loading or constraint effects between partial measurements may determine several errors on the final conclusions. Frequency shifts can lead to erroneous interpretations of the dynamics parameters. Typically these errors remain limited to a few percent. Inconsistent data sets however can result in major processing errors, with all related consequences towards applications based on the consistency assumption, such as global modal parameter identification, model-based damage detection and FRF-based matrix inversion in substructuring, load identification and transfer path analysis [1]. This paper addresses the subject of accuracy in the context of the measurement of the dynamic properties of a particular lightweight structure. It presents a comprehensive comparative study between the use of accelerometer, laser vibrometer (scanning LDV) and PU-probe (acoustic particle velocity and pressure) measurements to measure the structural responses, with as final aim the comparison of modal model quality assessment. The object of the investigation is a composite material blade from the main rotor of a helicopter. The presented results are part of an extensive test campaign performed with application of SIMO, MIMO, random and harmonic excitation, and the use of the mentioned contact and non-contact measurement techniques. The advantages and disadvantages of the applied instrumentation are discussed. Presented are real-life measurement problems related to the different set up conditions. Finally an analysis of estimated models is made in view of assessing the applicability of the various measurement approaches for successful fault detection based on modal parameters observation as well as in uncertain non-deterministic numerical model updating.

  8. Predicting the chance of live birth for women undergoing IVF: a novel pretreatment counselling tool.

    PubMed

    Dhillon, R K; McLernon, D J; Smith, P P; Fishel, S; Dowell, K; Deeks, J J; Bhattacharya, S; Coomarasamy, A

    2016-01-01

    Which pretreatment patient variables affect live birth rates following assisted conception? The predictors in the final multivariable logistic regression model significantly associated with reduced chances of IVF/ICSI success were increasing age (particularly above 36 years), tubal factor infertility, unexplained infertility and Asian or Black ethnicity. The two most widely recognized prediction models for live birth following IVF were developed on data from 1991 to 2007, pre-dating significant changes in clinical practice. These existing IVF outcome prediction models do not incorporate key pretreatment predictors, such as BMI, ethnicity and ovarian reserve, which are now readily available. In this cohort study, a model to predict live birth was derived using data from 9915 women who underwent IVF/ICSI treatment at any CARE (Centres for Assisted Reproduction) clinic from 2008 to 2012. Model validation was performed on data from 2723 women who underwent treatment in 2013. The primary outcome was live birth, defined as any birth event in which at least one baby was born alive and survived for more than 1 month. Data were collected from 12 fertility clinics within the CARE consortium in the UK. Multivariable logistic regression was used to develop the model. Discriminatory ability was assessed using the area under the receiver operating characteristic (AUROC) curve, and calibration was assessed using calibration-in-the-large and the calibration slope test. The predictors in the final model were female age, BMI, ethnicity, antral follicle count (AFC), previous live birth, previous miscarriage, and cause and duration of infertility. The AUROC curves for the final model and the validation cohort were 0.62 (95% confidence interval (CI) 0.61-0.63) and 0.62 (95% CI 0.60-0.64), respectively. 
Calibration-in-the-large showed a systematic over-estimation of the predicted probability of live birth (intercept (95% CI) = -0.168 (-0.252 to -0.084), P < 0.001). However, the calibration slope test was not significant (slope (95% CI) = 1.129 (0.893-1.365), P = 0.28). Because the calibration-in-the-large test was significant, we recalibrated the final model; the recalibrated model showed much-improved calibration. Our model cannot account for factors such as smoking and alcohol that can affect IVF/ICSI outcome, and it is somewhat restricted to the ethnic distribution and outcomes of the UK population. We were unable to account for socioeconomic status, and because 75% of the population paid privately for their treatment, the results may not generalize to all socioeconomic backgrounds. In addition, patients and clinicians should understand that this model is designed for use before treatment begins and does not include variables that become available (oocyte, embryo and endometrial) as treatment progresses. Finally, this model is limited to use prior to the first cycle only. To our knowledge, this is the first study to present a novel, up-to-date model encompassing three readily available prognostic factors: female BMI, ovarian reserve and ethnicity, which have not previously been used in prediction models for IVF outcome. Following geographical validation, the model can be used to build a user-friendly interface to aid decision-making for couples and their clinicians. Thereafter, a feasibility study of its implementation could focus on patient acceptability and quality of decision-making. None. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
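    The calibration checks described in this record can be sketched numerically: calibration-in-the-large is the intercept of a logistic model fitted to the outcomes with the logit of the predicted probabilities as a fixed offset, and recalibration shifts the predictions by that intercept. The data-generating setup, the +0.2 logit shift and all function names below are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def calibration_in_the_large(y, p, lo=-5.0, hi=5.0, tol=1e-10):
    """Intercept 'a' of a logistic model y ~ a + offset(logit(p)).
    a < 0 indicates the predictions systematically over-estimate the event rate."""
    lp = logit(p)
    # MLE condition for an intercept-only offset model: sum(y) == sum(sigmoid(a + lp)).
    # The left-hand side below is monotonically increasing in a, so bisection works.
    f = lambda a: sigmoid(a + lp).sum() - y.sum()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def recalibrate(p, a):
    """Shift the predictions by the estimated intercept on the logit scale."""
    return sigmoid(logit(p) + a)
```

    By construction, the recalibrated predictions reproduce the observed event rate exactly, which is what "much-improved calibration" after intercept recalibration amounts to.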

  9. TOPICAL REVIEW: Modelling the interaction of electromagnetic fields (10 MHz-10 GHz) with the human body: methods and applications

    NASA Astrophysics Data System (ADS)

    Hand, J. W.

    2008-08-01

    Numerical modelling of the interaction between electromagnetic fields (EMFs) and the dielectrically inhomogeneous human body provides a unique way of assessing the resulting spatial distributions of internal electric fields, currents and rate of energy deposition. Knowledge of these parameters is of importance in understanding such interactions and is a prerequisite when assessing EMF exposure or when assessing or optimizing therapeutic or diagnostic medical applications that employ EMFs. In this review, computational methods that provide this information through full time-dependent solutions of Maxwell's equations are summarized briefly. This is followed by an overview of safety- and medical-related applications where modelling has contributed significantly to development and understanding of the techniques involved. In particular, applications in the areas of mobile communications, magnetic resonance imaging, hyperthermal therapy and microwave radiometry are highlighted. Finally, examples of modelling the potentially new medical applications of recent technologies such as ultra-wideband microwaves are discussed.

  10. An interactive web application for visualizing climate data

    USGS Publications Warehouse

    Alder, J.; Hostetler, S.; Williams, D.

    2013-01-01

    Massive volumes of data are being created as modeling centers from around the world finalize their submission of climate simulations for the Coupled Model Intercomparison Project, phase 5 (CMIP5), in preparation for the forthcoming Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). Scientists, resource managers, and other potential users of climate data are faced with the daunting task of analyzing, distilling, and summarizing this unprecedented wealth of climate information.

  11. An Interactive Web Application for Visualizing Climate Data

    NASA Astrophysics Data System (ADS)

    Alder, J.; Hostetler, S.; Williams, D.

    2013-05-01

    Massive volumes of data are being created as modeling centers from around the world finalize their submission of climate simulations for the Coupled Model Intercomparison Project, phase 5 (CMIP5), in preparation for the forthcoming Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). Scientists, resource managers, and other potential users of climate data are faced with the daunting task of analyzing, distilling, and summarizing this unprecedented wealth of climate information.

  12. A methodology for overall consequence modeling in chemical industry.

    PubMed

    Arunraj, N S; Maiti, J

    2009-09-30

    Risk assessment in the chemical process industry is very important for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, commonly used consequence estimation methods involve either time-consuming, complex mathematical models or simple summation of losses without considering all the consequence factors, which degrades the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses within a reasonable time, to improve the decisive value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is first developed to assess the overall consequence, considering all the important components of the major losses. Second, a methodology is developed for calculating each of the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction, and the result obtained with the proposed consequence assessment scheme is compared with those from existing methodologies.
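    The normalize-and-combine step described in this record can be sketched as a weighted sum of losses scaled to [0, 1]. The loss categories match those named in the abstract, but the figures, normalization bases and weights below are purely illustrative assumptions:

```python
# Hypothetical losses (in currency units) for one scenario, the maximum credible
# loss used as the normalization base for each category, and illustrative weights.
losses   = {"production": 2.0e5, "assets": 5.0e5, "health_safety": 1.2e6, "environment": 3.0e5}
max_loss = {"production": 1.0e6, "assets": 2.0e6, "health_safety": 5.0e6, "environment": 1.0e6}
weights  = {"production": 0.15, "assets": 0.15, "health_safety": 0.45, "environment": 0.25}

def overall_consequence(losses, max_loss, weights):
    """Normalize each loss to [0, 1] and combine with weights summing to 1,
    so the overall consequence is itself a dimensionless score in [0, 1]."""
    total = 0.0
    for category, value in losses.items():
        total += weights[category] * min(value / max_loss[category], 1.0)
    return total
```

    Normalizing before aggregation is what lets incommensurable losses (production downtime versus environmental damage) be combined into a single decisive value.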

  13. Development and weighting of a life cycle assessment screening model

    NASA Astrophysics Data System (ADS)

    Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard

    2004-02-01

    Nearly all life cycle assessment tools available today are high-priced, comprehensive, quantitative models requiring a significant amount of data collection and input. In addition, most of the available software packages require a great deal of training time to learn to operate. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development and design teams and environmental specialists need a simplified tool that allows qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health and safety factors, based on site- or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site- or product-specific variables. The user can then evaluate various design changes and their apparent impact on, or improvement of, the environment, health and safety, compliance cost and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return on investment, and recycling are evaluated. The flexibility of the model format is discussed to demonstrate its applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores is compared to other population input results.

  14. Consequences of Switching to Blended Learning: The Grenoble Medical School Key Elements.

    PubMed

    Houssein, Sahal; Di Marco, Lionel; Schwebel, Carole; Luengo, Vanda; Morand, Patrice; Gillois, Pierre

    2018-01-01

    In 2006, the Grenoble-Alpes University Medical School decided to switch the first-year learning paradigm to a blended learning model based on a flipped classroom with a continuous dual assessment system providing personal follow-up. We report a descriptive analysis of the two pedagogical models. The innovative blended learning model is divided into 5-week learning sequences, starting with a series of knowledge capsules, followed by Interactive On Line Questions, Interactive On Site Training and an Explanation Meeting. The fourth and final steps are the dual assessment system, which prepares for the final contest, and the personal weekly follow-up. The data were extracted from the information systems over 17 years during which the same learning model was applied. With the same student workload, the hourly knowledge/skills ratio decreased to approximately 50% with the blended learning model. The teachers' workload increased significantly in the first year (+70%) and then decreased each year (reaching -20%). Furthermore, the type of education has also changed for the teacher, from an initial hourly knowledge/skill ratio of 3 to a ratio of 1/3 with the new model after a few years. The institution also needed to resize the classroom from a large amphitheatre to small interactive learning spaces. A significant initial effort is required to establish this model, both for the teachers and for the institution, which have different needs and costs. However, the satisfaction rates and the demand from medical and paramedical learners for extension to other curricula indicate that this model provides the enhanced learning paradigm of the future.

  15. Final June Revisions Rule Significant Contribution Assessment TSD

    EPA Pesticide Factsheets

    This Technical Support Document (TSD) presents quantitative assessments of the relationship between the final February revisions to the Transport Rule, the final June revisions rule, and the original analysis conducted for the final Transport Rule.

  16. Proposal of an environmental performance index to assess solid waste treatment technologies.

    PubMed

    Coelho, Hosmanny Mauro Goulart; Lange, Liséte Celina; Coelho, Lineker Max Goulart

    2012-07-01

    Although concern with sustainable development and environmental protection has grown considerably in recent years, the majority of decision-making models and tools are still either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management, beyond waste energy recovery and disposal. To address the lack of models and tools aimed at waste treatment and final disposal, a new concept is proposed: Cleaner Treatment, based on the Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI), which assesses the environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators that consist of operational parameters. The weights of the indicators were established by the Delphi method and Brazilian environmental laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, the index was validated by applying the CTI to data from 10 waste-to-energy plants. The sensitivity analysis and validation results indicate that the summation model is the more suitable aggregation method. With the summation method, CTI results were above 0.5 (on a scale from 0 to 1) for most facilities evaluated. This study thus demonstrates that the CTI is a simple and robust tool for assessing and comparing the environmental performance of different treatment plants, and an excellent quantitative tool to support Cleaner Treatment implementation. Copyright © 2012 Elsevier Ltd. All rights reserved.
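    The two aggregation methods the record compares (summation versus product of weighted indicators) can be sketched as a weighted arithmetic mean versus a weighted geometric mean. The indicator scores and weights below are illustrative assumptions, not the paper's Delphi-derived values:

```python
import numpy as np

# Hypothetical indicator scores (already scaled to [0, 1]) and weights for one
# waste-to-energy plant; weights sum to 1.
scores  = np.array([0.8, 0.6, 0.9, 0.5])
weights = np.array([0.3, 0.2, 0.3, 0.2])

def cti_sum(scores, weights):
    """Additive aggregation: weighted arithmetic mean of the indicators."""
    return float(np.dot(weights, scores))

def cti_product(scores, weights):
    """Multiplicative aggregation: weighted geometric mean of the indicators.
    A single near-zero indicator drags the whole index down, which makes the
    product form far less forgiving of one poor operational parameter."""
    return float(np.prod(scores ** weights))
```

    The two forms differ most when indicators are uneven: by the AM-GM inequality the product score never exceeds the summation score, one reason sensitivity analysis is needed to choose between them.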

  17. Application of the Ecosystem Assessment Model to Lake Norman: A cooling lake in North Carolina: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porcella, D.B.; Bowie, G.L.; Campbell, C.L.

    The Ecosystem Assessment Model (EAM) of the Cooling Lake Assessment Methodology was applied to the extensive ecological field data collected at Lake Norman, North Carolina by Duke Power Company to evaluate its capability to simulate lake ecosystems and the ecological effects of steam electric power plants. The EAM provided simulations over a five-year verification period that behaved as expected based on a one-year calibration. Major state variables of interest to utilities and regulatory agencies are: temperature, dissolved oxygen, and fish community variables. In qualitative terms, temperature simulation was very accurate, dissolved oxygen simulation was accurate, and fish prediction was reasonably accurate. The need for more accurate fisheries data collected at monthly intervals and non-destructive sampling techniques was identified.

  18. IRIS Toxicological Review of Ammonia Noncancer Inhalation (Final Report)

    EPA Science Inventory

    EPA has finalized the Integrated Risk Information System (IRIS) Assessment of Ammonia (Noncancer Inhalation). This assessment addresses the potential noncancer human health effects from long-term inhalation exposure to ammonia. Now final, this assessment will update the ...

  19. Emissions and dispersion modeling system (EDMS). Its development and application at airports and airbases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, M.T.; Segal, H.M.

    1994-06-01

    A new complex source microcomputer model has been developed for use at civil airports and Air Force bases. This paper describes both the key features of this model and its application in evaluating the air quality impact of new construction projects at three airports: one in the United States and two in Canada. The single EDMS model replaces the numerous models previously required to assess the air quality impact of pollution sources at airports. EDMS also employs a commercial data base to reduce the time and manpower required to accurately assess and document the air quality impact of airfield operations. On July 20, 1993, the U.S. Environmental Protection Agency (EPA) issued the final rule (Federal Register, 7/20/93, page 38816) to add new models to the Guideline on Air Quality Models. At that time EDMS was incorporated into the Guideline as an Appendix A model. 12 refs., 4 figs., 1 tab.

  20. GIS-Based Suitability Model for Assessment of Forest Biomass Energy Potential in a Region of Portugal

    NASA Astrophysics Data System (ADS)

    Quinta-Nova, Luis; Fernandez, Paulo; Pedro, Nuno

    2017-12-01

    This work focuses on developing a decision support system based on multicriteria spatial analysis to assess the potential for generating biomass residues from forestry sources in a region of Portugal (Beira Baixa). A set of environmental, economic and social criteria was defined, evaluated and weighted in the context of Saaty's analytic hierarchies, and the best alternatives were obtained by applying the Analytic Hierarchy Process (AHP). The model was applied to the central region of Portugal, where forest and agriculture are the most representative land uses. Finally, a sensitivity analysis of the set of factors and their associated weights was performed to test the robustness of the model. The proposed evaluation model provides a valuable reference for decision makers in establishing a standardized means of selecting the optimal location for new biomass plants.
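    The AHP weighting step the record relies on can be sketched as extracting the principal eigenvector of a pairwise comparison matrix and checking Saaty's consistency ratio. The 3x3 matrix below (three criteria compared on the 1-9 scale) is an illustrative assumption, not the study's actual comparisons:

```python
import numpy as np

# Hypothetical pairwise comparisons for three criteria (e.g. environmental,
# economic, social): A[i, j] = importance of criterion i relative to j.
A = np.array([[1.0,  3.0,  5.0],
              [1/3., 1.0,  3.0],
              [1/5., 1/3., 1.0]])

def ahp_weights(A):
    """Criterion weights = normalized principal eigenvector of A.
    Returns (weights, lambda_max); lambda_max feeds the consistency check."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)          # principal (largest real) eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

def consistency_ratio(lmax, n, ri={3: 0.58, 4: 0.90, 5: 1.12}):
    """Saaty's CR = CI / RI, with CI = (lambda_max - n) / (n - 1);
    CR < 0.1 is the conventional threshold for acceptable consistency."""
    ci = (lmax - n) / (n - 1)
    return ci / ri[n]
```

    For a perfectly consistent matrix, lambda_max equals n and the CR is zero; the gap between lambda_max and n measures how much the pairwise judgments contradict each other.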

  1. An Update of the Analytical Groundwater Modeling to Assess Water Resource Impacts at the Afton Solar Energy Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, John J.; Greer, Christopher B.; Carr, Adrianne E.

    2014-10-01

    The purpose of this study is to update a one-dimensional analytical groundwater flow model to examine the influence of potential groundwater withdrawal in support of utility-scale solar energy development at the Afton Solar Energy Zone (SEZ) as a part of the Bureau of Land Management’s (BLM’s) Solar Energy Program. This report describes the modeling for assessing the drawdown associated with SEZ groundwater pumping rates for a 20-year duration considering three categories of water demand (high, medium, and low) based on technology-specific considerations. The 2012 modeling effort published in the Final Programmatic Environmental Impact Statement for Solar Energy Development in Six Southwestern States (Solar PEIS; BLM and DOE 2012) has been refined based on additional information described below in an expanded hydrogeologic discussion.

  2. Phthalates and Cumulative Risk Assessment (NAS Final ...

    EPA Pesticide Factsheets

    On December 18, 2008, the National Academy of Sciences' National Research Council released a final report, requested and sponsored by the EPA, entitled Phthalates and Cumulative Risk Assessment: The Task Ahead. Risk assessment has become a dominant public policy tool for making choices, based on limited resources, to protect public health and the environment. It has been instrumental to the mission of the U.S. Environmental Protection Agency (EPA) as well as other federal agencies in evaluating public health concerns, informing regulatory and technological decisions, prioritizing research needs and funding, and in developing approaches for cost-benefit analysis. People are exposed to a variety of chemicals throughout their daily lives. To protect public health, regulators use risk assessments to examine the effects of chemical exposures. This book provides guidance for assessing the risk of phthalates, chemicals found in many consumer products that have been shown to affect the development of the male reproductive system of laboratory animals. Because people are exposed to multiple phthalates and other chemicals that affect male reproductive development, a cumulative risk assessment should be conducted that evaluates the combined effects of exposure to all these chemicals. The book suggests an approach for cumulative risk assessment that can serve as a model for evaluating the health risks of other types of chemicals.

  3. A multi-objective optimization model for hub network design under uncertainty: An inexact rough-interval fuzzy approach

    NASA Astrophysics Data System (ADS)

    Niakan, F.; Vahdani, B.; Mohammadi, M.

    2015-12-01

    This article proposes a multi-objective mixed-integer model to optimize the location of hubs within a hub network design problem under uncertainty. The objectives are to minimize the maximum accumulated travel time; to minimize the total costs, including transportation, fuel consumption and greenhouse emission costs; and to maximize the minimum service reliability. In the proposed model, it is assumed that two nodes can be connected by several types of arcs, which differ in capacity, transportation mode, travel time, and transportation and construction costs. Moreover, determining the capacity of the hubs is part of the decision-making procedure, and balancing requirements are imposed on the network. To solve the model, a hybrid solution approach is utilized based on inexact programming, interval-valued fuzzy programming and rough interval programming. Furthermore, a hybrid multi-objective metaheuristic algorithm, namely multi-objective invasive weed optimization (MOIWO), is developed for the problem. Finally, various computational experiments are carried out to assess the proposed model and solution approaches.

  4. Blending Individual and Group Assessment: A Model for Measuring Student Performance

    ERIC Educational Resources Information Center

    Reiser, Elana

    2017-01-01

    Two sections of a college discrete mathematics class were taught using cooperative learning techniques throughout the semester. The 33 students attending these sections were randomly assigned into groups of three. Their final examination consisted of an individual and group blended examination where students worked in their groups and discussed…

  5. A Model Program for the Retention of High-Risk, Postsecondary Vocational Students. Final Report.

    ERIC Educational Resources Information Center

    Cossatot Vocational Technical School, DeQueen, AR.

    An orientation, testing, and counseling program to improve the dropout rate of at-risk postsecondary vocational students in Arkansas was developed and field tested at Cossatot Vocational Technical School, DeQueen, Arkansas. Project activities included: (1) improving prevocational exploration and assessment through the development of a…

  6. Best Practices in Documenting Workforce Success of College Graduates: Final Report

    ERIC Educational Resources Information Center

    Lessne, Deborah S.

    2004-01-01

    This report documents the Connecticut Department of Higher Education's efforts to investigate and document best practices in assessing student achievement as measured by workforce success. That effort is part of a five-state project to Define Best Practices for Responsible Accountability Models in Higher Education funded by a U.S. Department of…

  7. United States Air Force Training Line Simulator. Final Report.

    ERIC Educational Resources Information Center

    Nauta, Franz; Pierce, Michael B.

    This report describes the technical aspects and potential applications of a computer-based model simulating the flow of airmen through basic training and entry-level technical training. The objective of the simulation is to assess the impacts of alternative recruit classification and training policies under a wide variety of assumptions regarding…

  8. Teaching Sociology Students to Become Qualitative-Researchers Using an Internship Model of Learner-Support

    ERIC Educational Resources Information Center

    Tolich, Martin; Scarth, Bonnie; Shephard, Kerry

    2015-01-01

    This article examines the experiences of final year undergraduate sociology students enrolled in an internship course where they researched a local community project, mostly in small groups, for a client. A sociology lecturer supervised their projects. Course-related outcomes were assessed using conventional university procedures but a research…

  9. Information Literacy in Oman's Higher Education: A Descriptive-Inferential Approach

    ERIC Educational Resources Information Center

    Al-Aufi, Ali; Al-Azri, Hamed

    2013-01-01

    This study aims to identify the current status of information literacy among the students at Sultan Qaboos University in their final year through using the Big6 model for solving information problems. The study utilizes self-assessment survey approach, with the questionnaire as a tool for data collection. It surveyed undergraduate students of…

  10. Using Microcomputer-Based Logistics Models to Enhance Supportability Assessment for the USAF Productivity, Reliability, Availability and Maintainability (PRAM) Program Office: A Tailored Approach

    DTIC Science & Technology

    1989-09-01

    goes on to discuss how the innovation process should function within an organization, including five specific phases for successfully managing innovation: the recognition of opportunity; idea formulation; product definition; prototype solution; and finally, technology utilization and diffusion

  11. Integrated Stable Isotope - Reactive Transport Model Approach for Assessment of Chlorinated Solvent Degradation

    DTIC Science & Technology

    2016-06-16

    immediate electron donor. In the microcosm, H2 was produced by fermentation of lactate. It was previously reported that H2 and water undergo fast...being tied to the H isotope composition of the fermentation substrate (see Kuder et al., 2013 for more information). ESTCP Final Report

  12. Simulating Radionuclide Migrations of Low-level Wastes in Nearshore Environment

    NASA Astrophysics Data System (ADS)

    Lu, C. C.; Li, M. H.; Chen, J. S.; Yeh, G. T.

    2016-12-01

    Tunnel disposal into nearshore mountains was tentatively selected as one of the final disposal options for low-level wastes in Taiwan. Safety assessment of radionuclide migration in the far field may involve geosphere processes under coastal environments and into the nearshore ocean. In this study the 3-D HYDROGEOCHEM 5.6 numerical model was used to simulate groundwater flow and radionuclide transport with decay chains. The domain of interest at the surface includes nearby watersheds delineated from digital elevation models, together with the nearshore seabed. Depths of up to 800 m below the ground surface and 400 m below the seabed were considered in the simulations; the disposal site was located 200 m below the surface. Release rates of radionuclides from the near field were estimated by analytical solutions of radionuclide diffusion with decay out of the engineered barriers. Far-field safety assessments were performed starting from the release of radionuclides out of the engineered barriers, over a time scale of 10,000 years. Sensitivity analyses of geosphere and transport parameters were performed to improve our understanding of the safety of final disposal of low-level waste in nearshore environments.

  13. Coupling Processes Between Atmospheric Chemistry and Climate

    NASA Technical Reports Server (NTRS)

    Ko, Malcolm K. W.; Weisenstein, Debra; Rodriguez, Jose; Danilin, Michael; Scott, Courtney; Shia, Run-Lie; Eluszkiewicz, Junusz; Sze, Nien-Dak

    1999-01-01

    This is the final report. The overall objective of this project is to improve the understanding of coupling processes among atmospheric chemistry, aerosols and climate, all important for quantitative assessments of global change. Among our priorities are changes in ozone and stratospheric sulfate aerosol, with emphasis on how ozone in the lower stratosphere would respond to natural or anthropogenic changes. The work emphasizes two important aspects: (1) AER's continued participation in the preparation of, and provision of scientific input for, various scientific reports connected with the assessment of stratospheric ozone and climate, including participation in model intercomparison exercises as well as preparation of national and international reports; and (2) continued development of the AER three-wave interactive model to address how the transport circulation will change as ozone and the thermal properties of the atmosphere change, and to assess how these new findings will affect our confidence in the ozone assessment results.

  14. IRIS Toxicological Review of Benzo[a]pyrene (Final Report)

    EPA Science Inventory

    EPA has finalized the Integrated Risk Information System (IRIS) assessment of benzo[a]pyrene. This assessment addresses the potential cancer and noncancer human health effects from long-term exposure to benzo[a]pyrene. Now final, this assessment will update the toxicological info...

  15. IRIS Toxicological Review of Trimethylbenzenes (Final Report)

    EPA Science Inventory

    EPA has finalized the Integrated Risk Information System (IRIS) Assessment of Trimethylbenzenes (TMBs). This assessment addresses the potential non-cancer and cancer human health effects from long-term exposure to TMBs. Now final, this assessment will be the first IRIS a...

  16. How Reliable is Bayesian Model Averaging Under Noisy Data? Statistical Assessment and Implications for Robust Model Selection

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang

    2014-05-01

    Bayesian model averaging ranks the predictive capabilities of alternative conceptual models based on Bayes' theorem. The individual models are weighted with their posterior probability to be the best one in the considered set of models. Finally, their predictions are combined into a robust weighted average and the predictive uncertainty can be quantified. This rigorous procedure does, however, not yet account for possible instabilities due to measurement noise in the calibration data set. This is a major drawback, since posterior model weights may suffer a lack of robustness related to the uncertainty in noisy data, which may compromise the reliability of model ranking. We present a new statistical concept to account for measurement noise as a source of uncertainty for the weights in Bayesian model averaging. Our suggested upgrade reflects the limited information content of data for the purpose of model selection. It allows us to assess the significance of the determined posterior model weights, the confidence in model selection, and the accuracy of the quantified predictive uncertainty. Our approach rests on a brute-force Monte Carlo framework. We determine the robustness of model weights against measurement noise by repeatedly perturbing the observed data with random realizations of measurement error. Then, we analyze the induced variability in posterior model weights and introduce this "weighting variance" as an additional term into the overall prediction uncertainty analysis scheme. We further determine the theoretical upper limit in performance of the model set which is imposed by measurement noise. As an extension to the merely relative model ranking, this analysis provides a measure of absolute model performance. To finally decide whether better data or longer time series are needed to ensure a robust basis for model selection, we resample the measurement time series and assess the convergence of model weights for increasing time series length. 
We illustrate our suggested approach with an application to model selection between different soil-plant models, following up on a study by Wöhling et al. (2013). Results show that measurement noise compromises the reliability of model ranking and causes a significant amount of weighting uncertainty if the calibration data time series is not long enough to compensate for its noisiness. This additional contribution to the overall predictive uncertainty is neglected without our approach. Thus, we strongly advocate including our suggested upgrade in the Bayesian model averaging routine.
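    The perturbation idea above can be sketched numerically. The following is a minimal illustration with two made-up linear models and Gaussian noise; the models, data, and noise level are assumptions for demonstration, not the soil-plant models of the study:

```python
# Brute-force Monte Carlo check of BMA weight robustness against measurement
# noise. All model/data choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def posterior_weights(residuals, sigma):
    """BMA weights from Gaussian likelihoods with equal prior model probability."""
    log_lik = np.array([-0.5 * np.sum((r / sigma) ** 2) for r in residuals])
    log_lik -= log_lik.max()               # stabilize the exponent
    w = np.exp(log_lik)
    return w / w.sum()

# Two hypothetical competing models predicting the same noisy observable.
t = np.linspace(0, 10, 40)
sigma = 2.0                                 # assumed measurement-noise std
data = 2.0 * t + rng.normal(0, sigma, t.size)
predictions = [2.0 * t, 1.8 * t + 1.0]      # model 1 vs model 2

# Baseline weights from the observed data.
w_obs = posterior_weights([data - p for p in predictions], sigma)

# Re-perturb the data with fresh noise realizations and record the induced
# scatter ("weighting variance") in the posterior model weights.
weights = []
for _ in range(500):
    perturbed = data + rng.normal(0, sigma, t.size)
    weights.append(posterior_weights([perturbed - p for p in predictions], sigma))
weights = np.array(weights)
weighting_variance = weights.var(axis=0)
print(w_obs, weighting_variance)
```

    A large weighting variance signals that the ranking is driven by noise rather than by genuine differences in model skill.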

  17. Backward-stochastic-differential-equation approach to modeling of gene expression

    NASA Astrophysics Data System (ADS)

    Shamarova, Evelina; Chertovskih, Roman; Ramos, Alexandre F.; Aguiar, Paulo

    2017-03-01

    In this article, we introduce a backward method to model stochastic gene expression and protein-level dynamics. The protein amount is regarded as a diffusion process and is described by a backward stochastic differential equation (BSDE). Unlike many other SDE techniques proposed in the literature, the BSDE method is backward in time; that is, instead of initial conditions it requires the specification of end-point ("final") conditions, in addition to the model parametrization. To validate our approach we employ Gillespie's stochastic simulation algorithm (SSA) to generate (forward) benchmark data, according to predefined gene network models. Numerical simulations show that the BSDE method is able to correctly infer the protein-level distributions that preceded a known final condition, obtained originally from the forward SSA. This makes the BSDE method a powerful systems biology tool for time-reversed simulations, allowing, for example, the assessment of the biological conditions (e.g., protein concentrations) that preceded an experimentally measured event of interest (e.g., mitosis, apoptosis, etc.).
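    The forward benchmark-generation step can be illustrated with a minimal Gillespie SSA for a hypothetical constitutive gene (protein produced at rate k, degraded at rate g·n); the rates and horizon below are assumptions, not the paper's gene network models:

```python
# Minimal Gillespie stochastic simulation algorithm (SSA) for a birth-death
# protein model, the kind of forward benchmark the BSDE method is validated
# against. Parameters are illustrative.
import numpy as np

def gillespie_birth_death(k=10.0, g=0.1, n0=0, t_end=50.0, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end:
        birth, death = k, g * n
        total = birth + death
        t += rng.exponential(1.0 / total)     # waiting time to next reaction
        if rng.random() < birth / total:      # choose which reaction fires
            n += 1
        else:
            n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
# At stationarity the protein count is Poisson-distributed with mean k/g.
print(counts[-1])
```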

  18. Backward-stochastic-differential-equation approach to modeling of gene expression.

    PubMed

    Shamarova, Evelina; Chertovskih, Roman; Ramos, Alexandre F; Aguiar, Paulo

    2017-03-01

    In this article, we introduce a backward method to model stochastic gene expression and protein-level dynamics. The protein amount is regarded as a diffusion process and is described by a backward stochastic differential equation (BSDE). Unlike many other SDE techniques proposed in the literature, the BSDE method is backward in time; that is, instead of initial conditions it requires the specification of end-point ("final") conditions, in addition to the model parametrization. To validate our approach we employ Gillespie's stochastic simulation algorithm (SSA) to generate (forward) benchmark data, according to predefined gene network models. Numerical simulations show that the BSDE method is able to correctly infer the protein-level distributions that preceded a known final condition, obtained originally from the forward SSA. This makes the BSDE method a powerful systems biology tool for time-reversed simulations, allowing, for example, the assessment of the biological conditions (e.g., protein concentrations) that preceded an experimentally measured event of interest (e.g., mitosis, apoptosis, etc.).

  19. Mid-term Results of Total Knee Arthroplasty Using PFC Sigma RP-F.

    PubMed

    Kim, Jun-Young; Cheon, Sang-Ho; Kyung, Hee-Soo

    2012-12-01

    We compared the mid-term results after total knee arthroplasty (TKA) using the PFC Sigma RP-F mobile model with the PFC Sigma PS fixed model. We analyzed 45 knees that underwent TKA with the PFC Sigma RP-F (study group) and 45 knees with the PFC Sigma PS (control group). The mean follow-up period was 65 months (range, 60-69 months). The evaluation system of the American Knee Society was used for clinical and radiological assessment. The maximal knee flexion angle was also assessed. The mean maximum flexion angle in the study group (135°) was greater than that in the control group (125°) at the early postoperative and final follow-up periods (p=0.033). The range of motion (ROM) in the study group recovered earlier, at 6 months postoperatively, and ROM gain was improved to a greater extent at the final follow-up (p=0.039). The knee score, function score, and radiographic evaluation showed no differences between the two groups (p>0.05) at the final follow-up. Two cases of radiolucency (in the posterior condyle and medial tibial plateau) and one case of patellar elongation were seen in the study group. The PFC Sigma RP-F mobile system appears to facilitate a greater maximum flexion angle and ROM gain, with two cases of radiolucent lines.

  20. Development of a Remote Accessibility Assessment System through three-dimensional reconstruction technology.

    PubMed

    Kim, Jong Bae; Brienza, David M

    2006-01-01

    A Remote Accessibility Assessment System (RAAS) that uses three-dimensional (3-D) reconstruction technology is being developed; it enables clinicians to assess the wheelchair accessibility of users' built environments from a remote location. The RAAS uses commercial software to construct 3-D virtualized environments from photographs. We developed custom screening algorithms and instruments for analyzing accessibility. Characteristics of the camera and 3-D reconstruction software chosen for the system significantly affect its overall reliability. In this study, we performed an accuracy assessment to verify that commercial hardware and software can construct accurate 3-D models, by analyzing the accuracy of dimensional measurements in a virtual environment and by comparing dimensional measurements from 3-D models created with four cameras/settings. Based on these two analyses, we were able to specify a consumer-grade digital camera and PhotoModeler (EOS Systems, Inc, Vancouver, Canada) software for this system. Finally, we performed a feasibility analysis of the system in an actual environment to evaluate its ability to assess the accessibility of a wheelchair user's typical built environment. The field test resulted in an accurate accessibility assessment and thus validated our system.

  1. IRIS Toxicological Review of Trimethylbenzenes (Final Report ...

    EPA Pesticide Factsheets

    EPA has finalized the Integrated Risk Information System (IRIS) Assessment of Trimethylbenzenes (TMBs). This assessment addresses the potential non-cancer and cancer human health effects from long-term exposure to TMBs. Now final, this assessment will be the first IRIS assessment for TMBs that may be used by EPA’s program and regional offices to inform decisions to protect human health. The IRIS Toxicological Review of Trimethylbenzenes was originally released for a 60-day public comment period on June 29, 2012. EPA revised the toxicological review in response to the public comments received and released the finalized TMB assessment.

  2. Psychometrics of the preschooler physical activity parenting practices instrument among a Latino sample.

    PubMed

    O'Connor, Teresia M; Cerin, Ester; Hughes, Sheryl O; Robles, Jessica; Thompson, Deborah I; Mendoza, Jason A; Baranowski, Tom; Lee, Rebecca E

    2014-01-15

    Latino preschoolers (3-5 year old children) have among the highest rates of obesity. Low levels of physical activity (PA) are a risk factor for obesity. Characterizing what Latino parents do to encourage or discourage their preschooler to be physically active can help inform interventions to increase their PA. The objective was therefore to develop and assess the psychometrics of a new instrument, the Preschooler Physical Activity Parenting Practices (PPAPP), among a Latino sample, to assess parenting practices used to encourage or discourage PA among preschool-aged children. This was a cross-sectional study of 240 Latino parents who reported the frequency of using PA parenting practices. 95% of respondents were mothers; 42% had more than a high school education. Child mean age was 4.5 (±0.9) years (52% male). Test-retest reliability was assessed in 20% of the sample 2 weeks later. We assessed the fit of a priori models using confirmatory factor analyses (CFA). In a separate sub-sample (35%), preschool-aged children wore accelerometers to assess associations between their PA and the PPAPP subscales. The a priori models showed poor fit to the data. A modified factor structure for encouraging PPAPP had one multiple-item scale, engagement (15 items), and two single items (have outdoor toys; not enroll in sport, reverse coded). The final factor structure for discouraging PPAPP had 4 subscales: promote inactive transport (3 items), promote screen time (3 items), psychological control (4 items) and restricting for safety (4 items). Test-retest reliability (ICC) for the two scales ranged from 0.56 to 0.85. Cronbach's alphas ranged from 0.5 to 0.9. Several sub-factors correlated in the expected direction with children's objectively measured PA. The final models for encouraging and discouraging PPAPP had moderate to good fit, with moderate to excellent test-retest reliabilities.
The PPAPP should be further evaluated to better assess its associations with children's PA and offers a new tool for measuring PPAPP among Latino families with preschool-aged children.
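    The internal-consistency statistic reported for the PPAPP subscales can be sketched as follows; the 4-item Likert response matrix is invented for illustration:

```python
# Cronbach's alpha for a multi-item scale: k/(k-1) * (1 - sum of item
# variances / variance of the scale total). The response data are made up.
import numpy as np

def cronbach_alpha(items):
    """items: (respondents, k items) array of Likert-type scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(responses)
print(round(alpha, 2))
```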

  3. Understanding the Patient Perspective of Seizure Severity in Epilepsy: Development of a Conceptual Model.

    PubMed

    Borghs, Simon; Tomaszewski, Erin L; Halling, Katarina; de la Loge, Christine

    2016-10-01

    For patients with uncontrolled epilepsy, the severity and postictal sequelae of seizures might be more impactful than their frequency. Seizure severity is often assessed using patient-reported outcome (PRO) instruments; however, evidence of content validity for existing instruments is lacking. Our aim was to understand the real-life experiences of patients with uncontrolled epilepsy. A preliminary conceptual model was developed. The model was refined through (1) a targeted literature review of qualitative research on seizure severity; (2) interviews with four clinical epilepsy experts to evaluate identified concepts; and (3) qualitative interviews with patients with uncontrolled epilepsy, gathering descriptions of symptoms and impacts of epilepsy, focusing on how patients experience and describe "seizure severity." Findings were summarized in a final conceptual model of seizure severity in epilepsy. Twenty-five patients (12 who experienced primary generalized tonic-clonic seizures and 13 who experienced partial-onset seizures) expressed 42 different symptoms and 26 different impacts related to seizures. The final conceptual model contained a wide range of concepts related to seizure frequency, symptoms, and duration. Our model identified several new concepts that characterize the patient experience of seizure severity. A seizure severity PRO instrument should cover a wide range of seizure symptoms alongside frequency and duration of seizures. This qualitative work reinforces the notion that measuring seizure frequency is insufficient and that seizure severity is important in defining the patient's experience of epilepsy. This model could be used to assess the content validity of existing PRO instruments, or could support the development of a new one.

  4. Endobronchial ultrasound-guided transbronchial needle aspiration: performance of biomedical scientists on rapid on-site evaluation and preliminary diagnosis.

    PubMed

    Schacht, M J; Toustrup, C B; Madsen, L B; Martiny, M S; Larsen, B B; Simonsen, J T

    2016-10-01

    Rapid on-site evaluation (ROSE) of endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA), followed by a subsequent preliminary adequacy assessment and a preliminary diagnosis, was performed at Aarhus University Hospital by biomedical scientists (BMS). The aim of this study was to evaluate the accuracy of the BMS-rendered ROSE adequacy assessment, preliminary adequacy assessment and preliminary diagnosis as compared with the cytopathologist-rendered final adequacy assessment and final diagnosis. The BMS-rendered assessments for 717 sites from 319 consecutive patients over a 4-month period were compared with the cytopathologist-rendered assessments. Comparisons of adequacy and preliminary diagnoses were based on the inter-observer Cohen's Kappa coefficient with a 95% confidence interval (CI). Strong correlations between the ROSE and final adequacy assessments [Kappa coefficient of 0.90 (CI: 0.85-0.96)] and between the preliminary and final adequacy assessments [Kappa coefficient of 0.93 (CI: 0.87-0.99)] were found. As for the correlation between the preliminary and final diagnoses, the Kappa coefficient was 0.99 (CI: 0.98-1). Both the ROSE and preliminary adequacy assessments, as well as the preliminary diagnoses, all performed by BMS, were highly accurate when compared with the final assessment by the cytopathologist. © 2016 John Wiley & Sons Ltd.
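    The agreement statistic used above corrects observed agreement for chance agreement. A minimal sketch, with hypothetical rating vectors rather than the study's data:

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance
# agreement). The two rater vectors below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

bms = ["adequate"] * 45 + ["inadequate"] * 5
cyto = ["adequate"] * 43 + ["inadequate"] * 7   # disagrees on 2 of 50 sites
kappa = cohens_kappa(bms, cyto)
print(round(kappa, 2))
```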

  5. Screening level risk assessment model for chemical fate and effects in the environment.

    PubMed

    Arnot, Jon A; Mackay, Don; Webster, Eva; Southwood, Jeanette M

    2006-04-01

    A screening level risk assessment model is developed and described to assess and prioritize chemicals by estimating environmental fate and transport, bioaccumulation, and exposure to humans and wildlife for a unit emission rate. The most sensitive risk endpoint is identified and a critical emission rate is then calculated as a result of that endpoint being reached. Finally, this estimated critical emission rate is compared with the estimated actual emission rate as a risk assessment factor. This "back-tracking" process avoids the use of highly uncertain emission rate data as model input. The application of the model is demonstrated in detail for three diverse chemicals and in less detail for a group of 70 chemicals drawn from the Canadian Domestic Substances List. The simple Level II and the more complex Level III fate calculations are used to "bin" substances into categories of similar probable risk. The essential role of the model is to synthesize information on chemical and environmental properties within a consistent mass balance framework to yield an overall estimate of screening level risk with respect to the defined endpoint. The approach may be useful to identify and prioritize those chemicals of commerce that are of greatest potential concern and require more comprehensive modeling and monitoring evaluations in actual regional environments and food webs.
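    The "back-tracking" step described above can be sketched with illustrative numbers; the endpoints, thresholds, and unit responses below are invented, not outputs of the authors' fate model:

```python
# Back-tracking a critical emission rate: run the fate model with a unit
# emission, find the most sensitive risk endpoint, scale to the emission rate
# at which that endpoint is reached, and compare with the actual emission.
unit_emission = 1.0                        # kg/h, nominal model input

# Hypothetical endpoint responses per unit emission (fate-model outputs):
response_per_unit = {
    "human_intake_mg_per_kg_day": 2.0e-6,
    "fish_conc_mg_per_kg": 5.0e-4,
}
# Threshold at which each endpoint is deemed reached:
thresholds = {
    "human_intake_mg_per_kg_day": 1.0e-3,  # e.g. a reference dose
    "fish_conc_mg_per_kg": 1.0e-1,
}

# Critical emission rate per endpoint; the most sensitive endpoint governs.
critical = {k: thresholds[k] / response_per_unit[k] * unit_emission
            for k in thresholds}
most_sensitive = min(critical, key=critical.get)
critical_emission = critical[most_sensitive]

actual_emission = 40.0                     # kg/h, estimated actual rate
risk_assessment_factor = actual_emission / critical_emission
print(most_sensitive, critical_emission, round(risk_assessment_factor, 2))
```

    Because the comparison with the actual emission rate happens only at the last step, the highly uncertain emission estimate never enters the fate calculation itself.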

  6. DOE SBIR Phase II Final Technical Report - Assessing Climate Change Effects on Wind Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteman, Cameron; Capps, Scott

    Specialized Vertum Partners software tools were prototyped, tested and commercialized to allow wind energy stakeholders to assess the uncertainties of climate change on wind power production and distribution. This project resulted in three commercially proven products and a marketing tool. The first was a Weather Research and Forecasting Model (WRF) based resource evaluation system. The second was a web-based service providing global 10m wind data from multiple sources to wind industry subscription customers. The third product addressed the needs of our utility clients looking at climate change effects on electricity distribution. For this we collaborated on the Santa Ana Wildfire Threat Index (SAWTi), which was released publicly last quarter. Finally, to promote these products and educate potential users, we released “Gust or Bust”, a graphic-novel styled marketing publication.

  7. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  8. An experimental study of wall adaptation and interference assessment using Cauchy integral formula

    NASA Technical Reports Server (NTRS)

    Murthy, A. V.

    1991-01-01

    This paper summarizes the results of an experimental study of combined wall adaptation and residual interference assessment using the Cauchy integral formula. The experiments were conducted on a supercritical airfoil model in the Langley 0.3-m Transonic Cryogenic Tunnel solid flexible wall test section. The ratio of model chord to test section height was about 0.7. The method worked satisfactorily in reducing the blockage interference and demonstrated the need to correct for blockage effects at high model incidences in order to determine high-lift characteristics correctly. The studies show that the method has the potential to reduce the residual interference to considerably low levels. However, corrections for blockage and upwash velocity gradients may still be required for the final adapted wall shapes.

  9. Rasch Model Analysis with the BICAL Computer Program

    DTIC Science & Technology

    1976-09-01

    ...and persons which lead to measures that persist from trial to trial. The measurement model is essential in this process because it provides a framework... and his students. Section two derives the estimating equations for the Bernoulli (i.e., one trial per task) form and then generalizes to the Binomial form (several trials per task). Finally, goodness-of-fit tests are presented for assessing the adequacy of the calibration.
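    The Bernoulli and Binomial forms mentioned in the record can be written out explicitly: the Rasch model gives the probability that a person of ability b succeeds on an item of difficulty d as a logistic function of (b - d), and the Binomial form sums m such trials per task. Parameter values below are illustrative:

```python
# Rasch measurement model: success probability is logistic in (ability -
# difficulty); the Binomial form aggregates m trials of the same task.
import math

def rasch_p(ability, difficulty):
    """Bernoulli (one trial per task) Rasch success probability."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def rasch_binomial_mean(ability, difficulty, m):
    """Expected number of successes over m trials of the same task."""
    return m * rasch_p(ability, difficulty)

p = rasch_p(ability=1.0, difficulty=0.0)
print(round(p, 3), rasch_binomial_mean(1.0, 0.0, 10))
```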

  10. A Consensus Model: Shifting assessment practices in dietetics tertiary education.

    PubMed

    Bacon, Rachel; Kellett, Jane; Dart, Janeane; Knight-Agarwal, Cathy; Mete, Rebecca; Ash, Susan; Palermo, Claire

    2018-02-21

    The aim of this research was to evaluate a Consensus Model for competency-based assessment. An evaluative case study was used to allow a holistic examination of a constructivist-interpretivist programmatic model of assessment. Using a modified Delphi process, the competence of all 29 students enrolled in the final year of a Master of Nutrition and Dietetics course was assessed by a panel (with expertise in competency-based assessment and with industry and academic representation) from a course e-portfolio (that included the judgements of student performance made by worksite educators) and a panel interview. Data were triangulated with assessments from a capstone internship. Qualitative descriptive studies with worksite educators (focus groups n = 4, n = 5, n = 8) and students (personal interviews n = 29) explored stakeholder experiences, analysed using thematic analysis. Panel consensus was achieved for all cases by the third round and corroborated by internship outcomes. For 34% of students this differed from the 'interpretations' of their performance made by their worksite educator/s. Emerging qualitative themes from stakeholder data found that the model: (i) supported sustainable assessment practices; (ii) shifted the power relationship between students and worksite educators and (iii) provided a fair method to assess competence. To maximise benefits, more refinement, resources and training are required. This research questions competency-based assessment practices based on discrete placement units and supports a constructivist-interpretivist programmatic approach where evidence across a whole course of study is considered by a panel of assessors. © 2018 Dietitians Association of Australia.

  11. Final Revisions Rule Significant Contribution Assessment TSD

    EPA Pesticide Factsheets

    This Technical Support Document (TSD) presents quantitative assessments of the relationship between final revisions to the Transport Rule and the original analysis conducted for the final Transport Rule.

  12. Using phenomenological models for forecasting the 2015 Ebola challenge.

    PubMed

    Pell, Bruce; Kuang, Yang; Viboud, Cecile; Chowell, Gerardo

    2018-03-01

    The rising number of novel pathogens threatening the human population has motivated the application of mathematical modeling for forecasting the trajectory and size of epidemics. We summarize the real-time forecasting results of the logistic equation during the 2015 Ebola challenge, which focused on predicting synthetic data derived from a detailed individual-based model of Ebola transmission dynamics and control. We also carry out a post-challenge comparison of two simple phenomenological models. In particular, we systematically compare the logistic growth model and a recently introduced generalized Richards model (GRM) that captures a range of early epidemic growth profiles, from sub-exponential to exponential growth. Specifically, we assess the performance of each model for estimating the reproduction number, generate short-term forecasts of the epidemic trajectory, and predict the final epidemic size. During the challenge the logistic equation consistently underestimated the final epidemic size, peak timing and the number of cases at peak timing, with an average mean absolute percentage error (MAPE) of 0.49, 0.36 and 0.40, respectively. Post-challenge, the GRM, which has the flexibility to reproduce a range of epidemic growth profiles from early sub-exponential to exponential growth dynamics, outperformed the logistic growth model in ascertaining the final epidemic size as more incidence data was made available, while the logistic model underestimated the final epidemic size even with an increasing amount of data on the evolving epidemic. Incidence forecasts provided by the generalized Richards model performed better across all scenarios and time points than the logistic growth model, with mean RMS decreasing from 78.00 (logistic) to 60.80 (GRM).
Both models provided reasonable predictions of the effective reproduction number, but the GRM slightly outperformed the logistic growth model with a MAPE of 0.08 compared to 0.10, averaged across all scenarios and time points. Our findings further support the consideration of transmission models that incorporate flexible early epidemic growth profiles in the forecasting toolkit. Such models are particularly useful for quickly evaluating a developing infectious disease outbreak using only case incidence time series of the early phase of an infectious disease outbreak. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
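    The logistic final-size estimation step can be sketched on synthetic data; the series and parameters below are assumptions, not the challenge data. The GRM adds exponents that relax the early growth profile and requires numerical integration, so only the closed-form logistic is fitted here:

```python
# Fit the closed-form logistic cumulative-incidence curve to a noisy synthetic
# epidemic and read off the final epidemic size K. All values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, c0):
    """Closed-form solution of dC/dt = r*C*(1 - C/K) with C(0) = c0."""
    return K / (1.0 + ((K - c0) / c0) * np.exp(-r * t))

rng = np.random.default_rng(1)
t = np.arange(0, 60, 1.0)
true_K, true_r, true_c0 = 1000.0, 0.2, 10.0
cases = logistic(t, true_K, true_r, true_c0) + rng.normal(0, 5.0, t.size)

# Least-squares fit; K_hat is the predicted final epidemic size.
(K_hat, r_hat, c0_hat), _ = curve_fit(
    logistic, t, cases, p0=[500.0, 0.1, 5.0], maxfev=10000)
print(round(K_hat), round(r_hat, 2))
```

    In the challenge setting the fit would be repeated as incidence data accumulate, which is where the logistic model's systematic underestimation of K before the inflection point becomes visible.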

  13. The quantitative modelling of human spatial habitability

    NASA Technical Reports Server (NTRS)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  14. An Exploratory Study: Assessment of Modeled Dioxin ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios. This report investigates the potential dioxin exposure to artists/hobbyists who use ball clay to make pottery and related products. Dermal, inhalation, and ingestion exposures to clay were measured at the ceramics art department of Ohio State University in Columbus, OH. The exposure estimates were based on measured levels of clay in the studio air, deposited on surrogate food samples and on the skin of the artists. The purpose of this report is to describe an exploratory investigation of potential dioxin exposures to artists/hobbyists who use ball clay to make pottery and related products.

  15. A geometric nonlinear degenerated shell element using a mixed formulation with independently assumed strain fields. Final Report; Ph.D. Thesis, 1989

    NASA Technical Reports Server (NTRS)

    Graf, Wiley E.

    1991-01-01

    A mixed formulation is chosen to overcome deficiencies of the standard displacement-based shell model. Element development is traced from the incremental variational principle on through to the final set of equilibrium equations. Particular attention is paid to developing specific guidelines for selecting the optimal set of strain parameters. A discussion of constraint index concepts and their predictive capability related to locking is included. Performance characteristics of the elements are assessed in a wide variety of linear and nonlinear plate/shell problems. Despite limiting the study to geometric nonlinear analysis, a substantial amount of additional insight concerning the finite element modeling of thin plate/shell structures is provided. For example, in nonlinear analysis, given the same mesh and load step size, mixed elements converge in fewer iterations than equivalent displacement-based models. It is also demonstrated that, in mixed formulations, lower order elements are preferred. Additionally, meshes used to obtain accurate linear solutions do not necessarily converge to the correct nonlinear solution. Finally, a new form of locking was identified associated with employing elements designed for biaxial bending in uniaxial bending applications.

  16. The OARSI Histopathology Initiative - Recommendations for Histological Assessments of Osteoarthritis in the Guinea Pig

    PubMed Central

    Kraus, Virginia B; Huebner, Janet L.; DeGroot, Jeroen; Bendele, Alison

    2010-01-01

    Objective This review focuses on the criteria for assessing osteoarthritis (OA) in the guinea pig at the macroscopic and microscopic levels, and recommends particular assessment criteria to assist standardization in the conduct and reporting of preclinical trials in guinea pig models of OA. Methods A review was conducted of all OA studies from 1958 until the present that utilized the guinea pig. The PubMed database was originally searched August 1, 2006 using the following search terms: guinea pig and osteoarthritis. We continued to check the database periodically throughout the process of preparing this chapter and the final search was conducted January 7, 2009. Additional studies were found in a review of abstracts from the OsteoArthritis Research Society International (OARSI) conferences, Orthopaedic Research Society (ORS) conferences, and literature related to histology in other preclinical models of OA reviewed for relevant references. Studies that described or used systems for guinea pig joint scoring on a macroscopic, microscopic, or ultrastructural basis were included in the final comprehensive summary and review. General recommendations regarding methods of OA assessment in the guinea pig were derived on the basis of a comparison across studies and an inter-rater reliability assessment of the recommended scoring system. Results A histochemical-histological scoring system (based on one first introduced by H. Mankin) is recommended for semi-quantitative histological assessment of OA in the guinea pig, due to its already widespread adoption, ease of use, similarity to scoring systems used for OA in humans, its achievable high inter-rater reliability, and its demonstrated correlation with synovial fluid biomarker concentrations. Specific recommendations are also provided for histological scoring of synovitis and scoring of macroscopic lesions of OA.
Conclusions As summarized herein, a wealth of tools exists to aid in both the semi-quantitative and quantitative assessment of OA in the guinea pig and provides a means of comprehensively characterizing the whole joint organ. In an ongoing effort at standardization, we recommend specific criteria for assessing the guinea pig model of OA as part of an OARSI initiative, termed herein the OARSI-HISTOgp recommendations. PMID:20864022

  17. Definition of 1992 Technology Aircraft Noise Levels and the Methodology for Assessing Airplane Noise Impact of Component Noise Reduction Concepts

    NASA Technical Reports Server (NTRS)

    Kumasaka, Henry A.; Martinez, Michael M.; Weir, Donald S.

    1996-01-01

    This report describes the methodology for assessing the impact of component noise reduction on total airplane system noise. The methodology is intended to be applied to the results of individual study elements of the NASA-Advanced Subsonic Technology (AST) Noise Reduction Program, which will address the development of noise reduction concepts for specific components. Program progress will be assessed in terms of noise reduction achieved, relative to baseline levels representative of 1992 technology airplane/engine design and performance. In this report, the 1992 technology reference levels are defined for assessment models based on four airplane sizes - an average business jet and three commercial transports: a small twin, a medium sized twin, and a large quad. Study results indicate that component changes defined as program final goals for nacelle treatment and engine/airframe source noise reduction would achieve from 6-7 EPNdB reduction of total airplane noise at FAR 36 Stage 3 noise certification conditions for all of the airplane noise assessment models.
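    The way component-level reductions map to total airplane noise can be illustrated by decibel (energy) summation: sources combine on an energy basis, so a reduction applied to one component moves the total by less than the component change. The component levels below are hypothetical, not the report's 1992 reference levels:

```python
# Energy addition of component noise levels: total = 10*log10(sum of 10^(L/10)).
# Component levels are invented for illustration.
import math

def db_sum(levels):
    """Combine component noise levels (dB) by energy addition."""
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels))

baseline = {"fan": 95.0, "jet": 93.0, "airframe": 90.0}
reduced = {"fan": 89.0, "jet": 88.0, "airframe": 88.0}  # after noise-reduction concepts

delta = db_sum(baseline.values()) - db_sum(reduced.values())
print(round(delta, 1))
```

    Note that although each component dropped by 2-6 dB, the total falls by an intermediate amount, which is why system-level assessment against the baseline is needed to credit component progress.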

  18. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process.
Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
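    The integration step can be illustrated with a minimal sketch of simple model averaging; the model names and estimates below are invented for the sketch, not taken from the hake assessment:

    ```python
    import numpy as np

    def model_average(estimates, weights=None):
        """Average point estimates across candidate models.

        estimates : dict mapping model name -> estimate (e.g. spawning biomass)
        weights   : optional dict of model weights; equal weights if omitted
        """
        names = list(estimates)
        values = np.array([estimates[n] for n in names], dtype=float)
        if weights is None:
            w = np.full(len(names), 1.0 / len(names))
        else:
            w = np.array([weights[n] for n in names], dtype=float)
            w = w / w.sum()          # normalize so the weights sum to 1
        return float(values @ w)

    # hypothetical fits from three candidate models with different assumptions
    fits = {"M1_low_M": 41000.0, "M2_high_M": 52000.0, "M3_ricker": 47500.0}
    print(model_average(fits))       # equal-weight average of the three models
    ```

    In practice the weights could also reflect model plausibility (e.g. information criteria) rather than being equal.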

  19. 75 FR 38809 - Marquette Board of Light and Power; Notice of Availability of Final Environmental Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2589-057--Michigan] Marquette Board of Light and Power; Notice of Availability of Final Environmental Assessment June 28, 2010... prepared a Final Environmental Assessment (FEA) regarding Marquette Board of Light and Power's plan to...

  20. Integrated Science Assessment (ISA) for Sulfur Oxides – Health Criteria (Final Report, Sep 2008)

    EPA Science Inventory

    EPA announced the availability of the final report, Integrated Science Assessment (ISA) for Sulfur Oxides – Health Criteria. This report represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scien...

  1. A comparison between the multimedia fate and exposure models CalTOX and uniform system for evaluation of substances adapted for life-cycle assessment based on the population intake fraction of toxic pollutants.

    PubMed

    Huijbregts, Mark A J; Geelen, Loes M J; Hertwich, Edgar G; McKone, Thomas E; van de Meent, Dik

    2005-02-01

    In life-cycle assessment (LCA) and comparative risk assessment, potential human exposure to toxic pollutants can be expressed as the population intake fraction (iF), which represents the fraction of the quantity emitted that enters the human population. To assess the influence of model differences on the calculation of the population iF, ingestion and inhalation iFs of 365 substances emitted to air, freshwater, and soil were calculated with two commonly applied multimedia fate and exposure models: CalTOX and the uniform system for evaluation of substances adapted for life-cycle assessment (USES-LCA). The model comparison showed that differences in the iFs due to model choices were lowest after emission to air and highest after emission to soil. Inhalation iFs were more sensitive to model differences than ingestion iFs. The choice of a continental seawater compartment, vertical stratification of the soil compartment, rain and no-rain scenarios, and drinking-water purification largely explains the relevant model differences found in population iFs. Furthermore, pH correction of chemical properties and aerosol-associated deposition on plants appeared to be important for dissociative organics and metals emitted to air, respectively. Finally, it was found that quantitative structure-activity relationship estimates for superhydrophobics may introduce considerable uncertainty in the calculation of population intake fractions.
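    The intake fraction concept can be written as a one-line calculation; the numbers below are illustrative only, not values from the CalTOX/USES-LCA comparison:

    ```python
    # Minimal sketch of the population intake fraction (iF): the dimensionless
    # fraction of an emitted mass that is taken in by the exposed population.
    def intake_fraction(population, per_capita_intake_rate, emission_rate):
        """iF = (population x per-capita intake rate) / emission rate.

        per_capita_intake_rate and emission_rate must share mass/time units.
        """
        return population * per_capita_intake_rate / emission_rate

    # e.g. 1e6 people each inhaling 2e-9 kg/day of a pollutant emitted at 1 kg/day
    print(intake_fraction(1e6, 2e-9, 1.0))  # -> 0.002
    ```

    A multimedia fate model supplies the per-capita intake rate by tracing the emission through air, water and soil to the exposure pathways.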

  2. Test of a foraging-bioenergetics model to evaluate growth dynamics of endangered pallid sturgeon (Scaphirhynchus albus)

    USGS Publications Warehouse

    Deslauriers, David; Heironimus, Laura B.; Chipps, Steven R.

    2016-01-01

    Factors affecting feeding and growth of early life stages of the federally endangered pallid sturgeon (Scaphirhynchus albus) are not fully understood, owing to their scarcity in the wild. In this study, we evaluated the performance of a combined foraging-bioenergetics model as a tool for assessing growth of age-0 pallid sturgeon in the Missouri River. In the laboratory, three size classes of sturgeon larvae (18–44 mm; 0.027–0.329 g) were grown for 7 to 14 days under differing temperature (14–24 °C) and prey density (0–9 Chironomidae larvae/d) regimes. After accounting for effects of water temperature and prey density on fish activity, we compared observed final weight, final length, and number of prey consumed to values generated from the foraging-bioenergetics model. When confronted with an independent dataset, the combined model provided reliable estimates (within 13% of observations) of fish growth and prey consumption, underscoring the usefulness of the modeling approach for evaluating growth dynamics of larval fish when empirical data are lacking.

  3. Environmental Assessment: Proposed U.S. Air Force Military Family Housing Privatization Initiative Patrick Air Force Base, Florida

    DTIC Science & Technology

    2008-11-03

    Australian pine. Cormorants resting on pilings in Banana River (near North Housing Area). FINAL Environmental Assessment for Proposed Privatization...the North and Central Housing areas are privatized. FINAL Environmental Assessment for Proposed Privatization of Military Housing PAFB...areas, although numerous groundwater wells are located immediately adjacent to both sites (Figure 3-1). FINAL Environmental Assessment for Proposed

  4. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    PubMed Central

    Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2017-01-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170

  5. The structure of DSM-IV-TR personality disorder diagnoses in NESARC: a reanalysis.

    PubMed

    Trull, Timothy J; Vergés, Alvaro; Wood, Phillip K; Sher, Kenneth J

    2013-12-01

    Cox, Clara, Worobec, and Grant (2012) recently presented results from a series of analyses aimed at identifying the factor structure underlying the DSM-IV-TR (APA, 2000) personality diagnoses assessed in the large NESARC study. Cox et al. (2012) concluded that the best fitting model was one that modeled three lower-order factors (the three clusters of PDs as outlined by DSM-IV-TR), which in turn loaded on a single PD higher-order factor. Our reanalyses of the NESARC Wave 1 and Wave 2 data for personality disorder diagnoses revealed that the best fitting model was that of a general PD factor that spans each of the ten DSM-IV PD diagnoses, and our reanalyses do not support the three-cluster hierarchical structure outlined by Cox et al. (2012) and DSM-IV-TR. Finally, we note the importance of modeling the Wave 2 assessment method factor in analyses of NESARC PD data.

  6. Numerical Assessment of Rockbursting.

    DTIC Science & Technology

    1987-05-27

    static equilibrium, nonlinear elasticity, strain-softening material, unstable propagation of pre-existing cracks, and finally surface...structure of LINOS, which is common to most of the large finite element codes, the library of element and material subroutines can be easily expanded... material model subroutines, are tested by comparing finite element results with analytical or numerical results derived for hypo-elastic and

  7. Personality Measurement with Mentally Retarded and Other Sub-Cultural Adults. Final Report.

    ERIC Educational Resources Information Center

    Eber, Herbert W.

    Two 160-item experimental forms of a multidimensional personality test to assess the vocational potential of clients of limited literacy (third-grade reading level) were developed and administered to clients at rehabilitation centers and at centers for the retarded. Using the 16 Personality Factors Test as a model, items were constructed to do the…

  8. ASSESSING THE IMPACT OF HUMAN PON1 POLYMORPHISMS: SENSITIVITY AND MONTE CARLO ANALYSES USING A PHYSIOLOGICALLY BASED PHARMACOKINETIC/ PHARMACODYNAMIC (PBPK/PD) MODEL FOR CHLORPYRIFOS. (R828608)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  9. Vocational Assessment for Special Needs Individuals. Project Final Report, Phase I, 1979-1980.

    ERIC Educational Resources Information Center

    Stodden, Robert A.

    The present state of the art of vocational evaluation is largely manifested within the field of vocational rehabilitation, and as a result, the concepts, instrumentation, and strategies do not lend themselves readily to an educational setting. This project attempted to bridge the gap between the conceptualized model of vocational evaluation within…

  10. Final Environmental Assessment for the Beddown and Flight Operations of Unmanned Aircraft Systems at Grand Forks Air Force Base, North Dakota

    DTIC Science & Technology

    2008-08-01

    Medical Group provides dental and medical services to military personnel and their families on GFAFB. EA for the Beddown and Flight Operations of...for System Safety Moulton, Carey L. 1990. Air Force Procedure For Predicting Aircraft Noise Around Airbases: Noise Exposure Model (NOISEMAP

  11. EUCLID IN RETROSPECT, 1967 CONFERENCE BULLETIN.

    ERIC Educational Resources Information Center

    Euclid English Demonstration Center, OH.

    A PROJECT ENGLISH GRANT IN 1962 ESTABLISHED THE EUCLID ENGLISH DEMONSTRATION CENTER (EEDC) TO DEVELOP AND MAKE AVAILABLE ON A NATIONAL BASIS A MODEL ENGLISH CURRICULUM. THE SIX PAPERS OF THIS EEDC FINAL REPORT FOCUS ON THE WORK OF THE CENTER, BUT ALSO ASSESS AND COMMENT MORE BROADLY UPON MANY OF THE PROBLEMS OF ENGLISH TEACHING TODAY.…

  12. The New Jersey Department of Higher Education's Health Manpower Planning Contract. Final Comprehensive Report.

    ERIC Educational Resources Information Center

    Tomson, Jon; Vasilenko, Patricia

    This report of the New Jersey Department of Higher Education's Health Manpower Planning Contract describes project activities and assessment methods and models, presents data on employment supply and demand, and provides samples of questionnaires and forms that were used. The department was required under contract to: survey and evaluate pertinent…

  13. Analysis and Approach to the Development of an Advanced Multimedia Instructional System. Volume I. Final Report.

    ERIC Educational Resources Information Center

    Rhode, William E.; And Others

    In order to examine the possibilities for an advanced multimedia instructional system, a review and assessment of current instructional media was undertaken in terms of a functional description, instructional flexibility, support requirements, and costs. Following this, a model of an individual instructional system was developed as a basis for…

  14. Criteria of Effectiveness for Network Delivery of Citizens Information through Libraries. Final Report.

    ERIC Educational Resources Information Center

    Chen, Ching-chih; Hernon, Peter

    This two-part publication reports on a study of consumer information delivery by library and non-library networks, which involved an extensive literature review, a telephone survey of 620 library networks, the development of an assessment model for the effectiveness of network information delivery, the development of an in-depth guide for…

  15. Warm prestress effects in fracture-margin assessment of PWR-RPVs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shum, D.K.M.

    This paper examines various issues that would impact the incorporation of warm prestress (WPS) effects in the fracture-margin assessment of reactor pressure vessels (RPVs). By way of an example problem, possible beneficial effects of including type-I WPS in the assessment of an RPV subjected to a small break loss of coolant accident are described. In addition, the need to consider possible loss of constraint effects when interpreting available small specimen WPS-enhanced fracture toughness data is demonstrated through two- and three-dimensional local crack-tip field analyses of a compact tension specimen. Finally, a hybrid correlative-predictive model of WPS based on J-Q theory and the Ritchie-Knott-Rice model is applied to a small-scale yielding boundary-layer formulation to investigate near crack-tip fields under varying degrees of loading and unloading.

  16. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focused on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore assesses the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is also attributed to the different input parameters using a variance-based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers to make better informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters contribute most to uncertainty in the final estimate and therefore deserve additional attention in further research.
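    The Monte Carlo propagation and variance-attribution idea can be sketched as follows, with entirely invented input distributions and a placeholder depth-damage relation (not the study's parameterization):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative Monte Carlo propagation for a monetary flood-damage estimate:
    # damage = exposed_value * damage_factor(depth), with uncertain inputs.
    n = 100_000
    depth = rng.normal(1.5, 0.4, n)             # flood depth [m], uncertain
    value = rng.normal(2.0e6, 3.0e5, n)         # exposed asset value, uncertain
    damage_factor = np.clip(0.3 * depth, 0, 1)  # toy depth-damage relation
    damage = value * damage_factor

    print(f"mean damage: {damage.mean():.3g}")
    print("90% interval:", np.percentile(damage, [5, 95]))

    # Crude first-order sensitivity: squared correlation of each input with the
    # output (a reasonable approximation here because the inputs are independent).
    for name, x in [("depth", depth), ("value", value)]:
        r = np.corrcoef(x, damage)[0, 1]
        print(f"{name}: ~{r**2:.0%} of output variance (first-order, approx.)")
    ```

    A full variance-based (Sobol-type) analysis would also capture interaction effects, which this correlation shortcut ignores.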

  17. Final Report - Enhanced LAW Glass Property - Composition Models - Phase 1 VSL-13R2940-1, Rev. 0, dated 9/27/2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Muller, I.; Gilbo, K.

    2013-11-13

    The objectives of this work are aimed at the development of enhanced LAW property-composition models that expand the composition region covered by the models. The models of interest include PCT, VHT, viscosity and electrical conductivity. This is planned as a multi-year effort that will be performed in phases, with the objectives listed below for the current phase: incorporate property-composition data from the new glasses into the database; assess the database and identify composition spaces in the database that need augmentation; develop statistically-designed composition matrices to cover the composition regions identified in the above analysis; prepare crucible melts of glass compositions from the statistically-designed composition matrix and measure the properties of interest; incorporate the above property-composition data into the database; and assess existing models against the complete dataset and, as necessary, start development of new models.

  18. Coupling of computer modeling with in vitro methodologies to reduce animal usage in toxicity testing.

    PubMed

    Clewell, H J

    1993-05-01

    The use of in vitro data to support the development of physiologically based pharmacokinetic (PBPK) models and to reduce the requirement for in vivo testing is demonstrated by three examples. In the first example, for polychlorotrifluoroethylene, in vitro studies comparing metabolism and tissue response in rodents and primates made it possible to obtain definitive data for a human risk assessment without resorting to additional in vivo studies with primates. In the second example, a PBPK model for organophosphate esters was developed in which the parameters defining metabolism, tissue partitioning, and enzyme inhibition were all characterized by in vitro studies, and the rest of the model parameters were established from the literature. The resulting model was able to provide a coherent description of enzyme inhibition following both acute and chronic exposures in mice, rats, and humans. In the final example, the carcinogenic risk assessment for methylene chloride was refined by the incorporation of in vitro data on human metabolism into a PBPK model.
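    As a much-simplified illustration of how measured kinetic parameters feed a pharmacokinetic model, here is a one-compartment sketch with hypothetical dose, volume and clearance values; a real PBPK model resolves many physiological compartments and blood flows:

    ```python
    import numpy as np

    def concentration(t, dose, volume, clearance):
        """C(t) = (dose / V) * exp(-(CL / V) * t) after an IV bolus.

        A clearance measured in vitro (e.g. from hepatocyte assays) could be
        scaled up and supplied here as `clearance`.
        """
        k = clearance / volume          # first-order elimination rate [1/h]
        return (dose / volume) * np.exp(-k * t)

    t = np.linspace(0, 24, 5)           # hours
    # hypothetical values: 100 mg dose, 42 L volume, 5 L/h clearance
    print(concentration(t, 100.0, 42.0, 5.0))
    ```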

  19. Environmental fate and exposure models: advances and challenges in 21st century chemical risk assessment.

    PubMed

    Di Guardo, Antonio; Gouin, Todd; MacLeod, Matthew; Scheringer, Martin

    2018-01-24

    Environmental fate and exposure models are a powerful means to integrate information on chemicals, their partitioning and degradation behaviour, the environmental scenario and the emissions in order to compile a picture of chemical distribution and fluxes in the multimedia environment. A 1995 pioneering book, resulting from a series of workshops among model developers and users, reported the main advantages and identified needs for research in the field of multimedia fate models. Considerable efforts were devoted to their improvement in the past 25 years and many aspects were refined; notably the inclusion of nanomaterials among the modelled substances, the development of models at different spatial and temporal scales, the estimation of chemical properties and emission data, the incorporation of additional environmental media and processes, and the integration of sensitivity and uncertainty analysis in the simulations. However, some challenging issues remain and require research efforts and attention: the need for methods to estimate partition coefficients for polar and ionizable chemicals in the environment, a better description of bioavailability in different environments, and the requirement to inject more ecological realism into exposure predictions to account for the diversity of ecosystem structures and functions in risk assessment. Finally, to transfer new scientific developments into the realm of regulatory risk assessment, we propose the formation of expert groups that compare, discuss and recommend model modifications and updates and help develop practical tools for risk assessment.

  20. Use of the AHP methodology in system dynamics: Modelling and simulation for health technology assessments to determine the correct prosthesis choice for hernia diseases.

    PubMed

    Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela

    2018-05-01

    Health technology assessments (HTAs) are often difficult to conduct because of the decisive procedures of the HTA algorithm, which are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. However, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, the evaluation of two biological prostheses for incisional infected hernias, will be analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
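    The AHP core step the paper builds on can be sketched as follows, using an illustrative 3x3 pairwise-comparison matrix (not data from the study): priority weights come from the principal eigenvector, and a consistency ratio checks the judgments.

    ```python
    import numpy as np

    # Illustrative pairwise-comparison matrix: A[i, j] is how strongly
    # criterion i is preferred over criterion j on Saaty's 1-9 scale.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Principal eigenvector gives the priority weights.
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, i].real)
    w = w / w.sum()

    # Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3.
    n = A.shape[0]
    lam_max = eigvals.real[i]
    ci = (lam_max - n) / (n - 1)
    cr = ci / 0.58
    print("weights:", np.round(w, 3))
    print("consistency ratio:", round(cr, 3))  # < 0.1 is conventionally acceptable
    ```

    In the paper's setting, the alternatives being weighted would be the two candidate prostheses scored against the HTA criteria.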

  1. An information entropy model on clinical assessment of patients based on the holographic field of meridian

    NASA Astrophysics Data System (ADS)

    Wu, Jingjing; Wu, Xinming; Li, Pengfei; Li, Nan; Mao, Xiaomei; Chai, Lihe

    2017-04-01

    The meridian system is not only the basis of traditional Chinese medicine (TCM) methods (e.g. acupuncture, massage), but also the core of TCM's basic theory. This paper introduces a new informational perspective to understand the reality and the holographic field of meridian. Based on the maximum information entropy principle (MIEP), a dynamic equation for the holographic field has been deduced, which reflects the evolutionary characteristics of meridian. By using a self-organizing artificial neural network as the algorithm, the evolutionary dynamic equation of the holographic field can be resolved to assess the properties of meridians and clinically diagnose the health characteristics of patients. Finally, through some cases from clinical patients (e.g. a 30-year-old male patient, an apoplectic patient, an epilepsy patient), we use this model to assess the evolutionary properties of meridians. It is proved that this model not only has significant implications in revealing the essence of meridian in TCM, but also may play a guiding role in clinical assessment of patients based on the holographic field of meridians.

  2. Comparing the appropriate geographic region for assessing built environmental correlates with walking trips using different metrics and model approaches

    PubMed Central

    Tribby, Calvin P.; Miller, Harvey J.; Brown, Barbara B.; Smith, Ken R.; Werner, Carol M.

    2017-01-01

    There is growing international evidence that supportive built environments encourage active travel such as walking. An unsettled question is the role of geographic regions for analyzing the relationship between the built environment and active travel. This paper examines the geographic region question by assessing walking trip models that use two different regions: walking activity spaces and self-defined neighborhoods. We also use two types of built environment metrics, perceived and audit data, and two types of study design, cross-sectional and longitudinal, to assess these regions. We find that the built environment associations with walking are dependent on the type of metric and the type of model. Audit measures summarized within walking activity spaces better explain walking trips compared to audit measures within self-defined neighborhoods. Perceived measures summarized within self-defined neighborhoods have mixed results. Finally, results differ based on study design. This suggests that results may not be comparable among different regions, metrics and designs; researchers need to consider carefully these choices when assessing active travel correlates. PMID:28237743

  3. Long-term occlusal changes assessed by the American Board of Orthodontics' model grading system.

    PubMed

    Aszkler, Robert M; Preston, Charles B; Saltaji, Humam; Tabbaa, Sawsan

    2014-02-01

    The purpose of this study was to assess the long-term posttreatment changes in all criteria of the American Board of Orthodontics' (ABO) model grading system. We used plaster models from patients' posttreatment and postretention records. Thirty patients treated by 1 orthodontist using 1 bracket prescription were selected. An initial discrepancy index for each subject was performed to determine the complexity of each case. The final models were then graded using the ABO's model grading system at posttreatment and again at postretention. Statistical analysis was performed on the 8 criteria of the model grading system, including paired t tests and Pearson correlations. An alpha of 0.05 was considered statistically significant. The average length of time between the posttreatment and postretention records was 12.7 ± 4.4 years. It was shown that alignment and rotations worsened by postretention (P = 0.014), and a weak statistically significant correlation between posttreatment and postretention was found (0.44; P = 0.016). Both marginal ridges and occlusal contacts scored less well at posttreatment. These criteria showed a significant decrease in scores between posttreatment and postretention (P <0.001), but the correlations were not statistically significant. The average total score showed a significant decrease between posttreatment and postretention (P <0.001), partly because of the large decrease in the previous 2 criteria. Higher scores for occlusal contacts and marginal ridges were found at the end of treatment; however, those scores and the overall scores for the 30 subjects improved in the postretention phase. Copyright © 2014. Published by Mosby, Inc.

  4. The flipped classroom allows for more class time devoted to critical thinking.

    PubMed

    DeRuisseau, Lara R

    2016-12-01

    The flipped classroom was utilized in a two-semester, high-content science course that enrolled between 50 and 80 students at a small liberal arts college. With the flipped model, students watched ~20-min lectures 2 days/wk outside of class. These videos were recorded via screen capture and included a detailed note outline, PowerPoint slides, and review questions. The traditional format included the same materials, except that lectures were delivered in class each week and spanned the entire period. During the flipped course, the instructor reviewed common misconceptions and asked questions requiring higher-order thinking, and five graded case studies were performed each semester. To determine whether assessments included additional higher-order thinking skills in the flipped vs. traditional model, questions across course formats were compared via Bloom's taxonomy. Application-level questions that required prediction of an outcome in a new scenario comprised 38 ± 3% vs. 12 ± 1% of summative assessment questions in the flipped vs. traditional format (P < 0.01). Final letter grades in both formats of the course were compared with major GPA. Students in the flipped model performed better than their GPA predicted, as 85.5% earned a higher grade (vs. 42.2% in the traditional classroom) compared with their major GPA. These data demonstrate that assessments transitioned to more application-level compared with factual knowledge-based questions with this particular flipped model, and students performed better in their final letter grade compared with the traditional lecture format. Although the benefits to a flipped classroom are highlighted, student evaluations did suffer. More detailed studies comparing the traditional and flipped formats are warranted. Copyright © 2016 the American Physiological Society.

  5. Development of a Self-Report Physical Function Instrument for Disability Assessment: Item Pool Construction and Factor Analysis

    PubMed Central

    McDonough, Christine M.; Jette, Alan M.; Ni, Pengsheng; Bogusz, Kara; Marfeo, Elizabeth E; Brandt, Diane E; Chan, Leighton; Meterko, Mark; Haley, Stephen M.; Rasch, Elizabeth K.

    2014-01-01

    Objectives To build a comprehensive item pool representing work-relevant physical functioning and to test the factor structure of the item pool. These developmental steps represent initial outcomes of a broader project to develop instruments for the assessment of function within the context of Social Security Administration (SSA) disability programs. Design Comprehensive literature review; gap analysis; item generation with expert panel input; stakeholder interviews; cognitive interviews; cross-sectional survey administration; and exploratory and confirmatory factor analyses to assess item pool structure. Setting In-person and semi-structured interviews; internet and telephone surveys. Participants A sample of 1,017 SSA claimants, and a normative sample of 999 adults from the US general population. Interventions Not Applicable. Main Outcome Measure Model fit statistics. Results The final item pool consisted of 139 items. Within the claimant sample 58.7% were white; 31.8% were black; 46.6% were female; and the mean age was 49.7 years. Initial factor analyses revealed a 4-factor solution which included more items and allowed separate characterization of: 1) Changing and Maintaining Body Position, 2) Whole Body Mobility, 3) Upper Body Function and 4) Upper Extremity Fine Motor. The final 4-factor model included 91 items. Confirmatory factor analyses for the 4-factor models for the claimant and the normative samples demonstrated very good fit. Fit statistics for claimant and normative samples respectively were: Comparative Fit Index = 0.93 and 0.98; Tucker-Lewis Index = 0.92 and 0.98; Root Mean Square Error Approximation = 0.05 and 0.04. Conclusions The factor structure of the Physical Function item pool closely resembled the hypothesized content model. The four scales relevant to work activities offer promise for providing reliable information about claimant physical functioning relevant to work disability. PMID:23542402

  6. Development of a self-report physical function instrument for disability assessment: item pool construction and factor analysis.

    PubMed

    McDonough, Christine M; Jette, Alan M; Ni, Pengsheng; Bogusz, Kara; Marfeo, Elizabeth E; Brandt, Diane E; Chan, Leighton; Meterko, Mark; Haley, Stephen M; Rasch, Elizabeth K

    2013-09-01

    To build a comprehensive item pool representing work-relevant physical functioning and to test the factor structure of the item pool. These developmental steps represent initial outcomes of a broader project to develop instruments for the assessment of function within the context of Social Security Administration (SSA) disability programs. Comprehensive literature review; gap analysis; item generation with expert panel input; stakeholder interviews; cognitive interviews; cross-sectional survey administration; and exploratory and confirmatory factor analyses to assess item pool structure. In-person and semistructured interviews and Internet and telephone surveys. Sample of SSA claimants (n=1017) and a normative sample of adults from the U.S. general population (n=999). Not applicable. Model fit statistics. The final item pool consisted of 139 items. Within the claimant sample, 58.7% were white; 31.8% were black; 46.6% were women; and the mean age was 49.7 years. Initial factor analyses revealed a 4-factor solution, which included more items and allowed separate characterization of: (1) changing and maintaining body position, (2) whole body mobility, (3) upper body function, and (4) upper extremity fine motor. The final 4-factor model included 91 items. Confirmatory factor analyses for the 4-factor models for the claimant and the normative samples demonstrated very good fit. Fit statistics for claimant and normative samples, respectively, were: Comparative Fit Index=.93 and .98; Tucker-Lewis Index=.92 and .98; and root mean square error approximation=.05 and .04. The factor structure of the physical function item pool closely resembled the hypothesized content model. The 4 scales relevant to work activities offer promise for providing reliable information about claimant physical functioning relevant to work disability. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. Phase IV of Early Restoration | NOAA Gulf Spill Restoration

    Science.gov Websites

    Trustees published the Final Phase IV Early Restoration Plan and Environmental Assessments. Useful Links: Final Phase IV Early Restoration Plan and Environmental Assessments (pdf, 4.8 MB); Final Phase IV Early Restoration Plan and Environmental Assessments Executive Summary (pdf, 729 KB

  8. Cost effective management of space venture risks

    NASA Technical Reports Server (NTRS)

    Giuntini, Ronald E.; Storm, Richard E.

    1986-01-01

    The development of a model for the cost-effective management of space venture risks is discussed. The risk assessment and control program of insurance companies is examined. A simplified system development cycle is described, consisting of a conceptual design phase, a preliminary design phase, a final design phase, a construction phase, and a system operations and maintenance phase. The model incorporates insurance safety risk methods as well as the reliability engineering and testing practices used in the development of large aerospace and defense systems.

  9. An Assessment of Aquifer/Well Flow Dynamics: Identification of Parameters Key to Passive Sampling and Application of Downhole Sensor Technologies

    DTIC Science & Technology

    2014-12-01

    Excerpts from the report's list of figures: Simulated Solute Transport in a Numerical Replication of Britt's 2005 Experiment; Figure 44, In-Well Flow Inhibitor; Figure 45, Results of a Preliminary Dye Tracer Experiment Conducted at INL; Figure 46, Results of a Horizontally-Oriented Dye Tracer Experiment Conducted at INL (ER-1704 Final Report, 2014). ...possible sources of well convection and mixing. Specifically, the modeling explored: 2D and 3D physical tank models. Dye tracer testing was conducted

  10. Unit Level WRSK (War Readiness Spares Kit) Assessment and Sortie Generation Simulation Model.

    DTIC Science & Technology

    1987-12-01

    also grateful to Capt. Richard Mabe and Capt. Michael Budde for teaching me the secrets of Dyna-METRIC. To my typist and fiancee, Marcia Rossow, thank... interviews with Capt. Budde, HQ TAC/LGY and Capt. Mabe, AFIT/LSMA (5) (21). Final agreement on the model flow took place at a meeting on 30 September... Gunter AFS, AL. 3. Arthur, Jeffrey L., James O. Frendewey, Parviz Ghandforoush and Loren Paul Rees. "Microcomputer Simulation Systems," Computers and

  11. North Carolina’s Response to the Directorate of Health Care Studies and Clinical Investigation’s Final Report: Assessing Power Analysis Approaches for the Fort Bragg Evaluation Project

    DTIC Science & Technology

    1993-08-31

    American Society of Criminology, Denver, Colorado, November, 1983. (50) "Unemployment and Crime Rates in Post-World War II United States: A Theoretical and... Denver, Colorado, August 20-September 2, 1971. (2) "Social Indicator Models," American Sociological Association Annual Meeting, San Francisco... Association, Denver, Colorado, August, 1971. (10) "Social Indicator Models: An Overview." Paper presented at the annual meetings of the American

  12. Environmental impact of geopressure - geothermal cogeneration facility on wetland resources and socioeconomic characteristics in Louisiana Gulf Coast region. Final report, October 10, 1983-September 31, 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smalley, A.M.; Saleh, F.M.S.; Fontenot, M.

    1984-08-01

    Baseline data relevant to air quality are presented. The following are also included: geology and resource assessment, design well prospects in southwestern Louisiana, water quality monitoring, chemical analysis subsidence, microseismicity, geopressure-geothermal subsidence modeling, models of compaction and subsidence, sampling handling and preparation, brine chemistry, wetland resources, socioeconomic characteristics, impacts on wetlands, salinity, toxic metals, non-metal toxicants, temperature, subsidence, and socioeconomic impacts. (MHR)

  13. Analysis of Crystallization Kinetics

    NASA Technical Reports Server (NTRS)

    Kelton, Kenneth F.

    1997-01-01

    A realistic computer model for polymorphic crystallization (i.e., initial and final phases with identical compositions), which includes time-dependent nucleation and cluster-size-dependent growth rates, is developed and tested by fits to experimental data. Model calculations are used to assess the validity of two of the more common approaches for the analysis of crystallization data. The effects of particle size on transformation kinetics, important for the crystallization of many systems of limited dimension including thin films, fine powders, and nanoparticles, are examined.
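In the simplest polymorphic limit, with steady-state nucleation and interface-limited growth, the transformed fraction follows the classical JMAK (Avrami) form, which the cluster-based model described above generalizes with time-dependent nucleation and size-dependent growth. A sketch with a hypothetical rate constant and exponent:

```python
import math

def jmak_fraction(t, k=0.1, n=3.0):
    # Classical JMAK transformed fraction X(t) = 1 - exp(-(k*t)**n).
    # k (effective rate constant) and n (Avrami exponent) are
    # illustrative placeholders, not fitted values from the paper.
    return 1.0 - math.exp(-(k * t) ** n)
```

Fitting log(-log(1 - X)) against log(t) to extract an apparent Avrami exponent is the standard analysis whose validity the paper's model calculations assess.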

  14. Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual

    DTIC Science & Technology

    1988-12-01

    The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.

  15. Proposal of an environmental performance index to assess solid waste treatment technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goulart Coelho, Hosmanny Mauro, E-mail: hosmanny@hotmail.com; Lange, Lisete Celina; Coelho, Lineker Max Goulart

    2012-07-15

    Highlights: Proposal of a new concept in waste management: Cleaner Treatment. Development of an index to quantitatively assess waste treatment technologies. The Delphi Method was carried out so as to define environmental indicators. Environmental performance evaluation of waste-to-energy plants. - Abstract: Although concern with sustainable development and environmental protection has grown considerably in recent years, the majority of decision-making models and tools are still either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management, beyond waste energy recovery and disposal. So, to address the lack of models and tools aimed at waste treatment and final disposal, a new concept is proposed: Cleaner Treatment, which is based on Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI), which assesses the environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators consisting of operational parameters. The weights of the indicators were established by the Delphi Method and Brazilian environmental laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, index validation was carried out by applying the CTI to data from 10 waste-to-energy plants. The sensitivity analysis and validation results indicate that the summation model is the most suitable aggregation method. For the summation method, CTI results were above 0.5 (on a scale from 0 to 1) for most facilities evaluated. This study thus demonstrates that the CTI is a simple and robust tool to assess and compare the environmental performance of different treatment plants, and an excellent quantitative tool to support Cleaner Treatment implementation.
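The two aggregation schemes compared in the sensitivity analysis can be sketched as weighted-sum and weighted-product operators over normalized indicators. The indicator values and weights below are hypothetical, not the Delphi-derived weights of the study:

```python
def cti_sum(indicators, weights):
    # Weighted-sum aggregation over indicators normalized to [0, 1];
    # the study found this the more suitable method.
    return sum(w * x for w, x in zip(weights, indicators)) / sum(weights)

def cti_product(indicators, weights):
    # Weighted-product (geometric) aggregation, the compared alternative.
    total = sum(weights)
    result = 1.0
    for w, x in zip(weights, indicators):
        result *= x ** (w / total)
    return result
```

Because the weighted geometric mean never exceeds the weighted arithmetic mean, the product form penalizes any single poor indicator more heavily, which is one reason the two schemes rank facilities differently.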

  16. Modeling High-Impact Weather and Climate: Lessons From a Tropical Cyclone Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Done, James; Holland, Greg; Bruyere, Cindy

    2013-10-19

    Although the societal impact of a weather event increases with the rarity of the event, our current ability to assess extreme events and their impacts is limited not only by rarity but also by current model fidelity and a lack of understanding of the underlying physical processes. This challenge is driving fresh approaches to assess high-impact weather and climate. Recent lessons learned in modeling high-impact weather and climate are presented using the case of tropical cyclones as an illustrative example. Through examples using the Nested Regional Climate Model to dynamically downscale large-scale climate data, the need to treat bias in the driving data is illustrated. Domain size, location, and resolution are also shown to be critical and should be guided by the need to include relevant regional climate physical processes, resolve key impact parameters, and accurately simulate the response to changes in external forcing. The notion of sufficient model resolution is introduced, together with the added value of combining dynamical and statistical assessments to fill out the parent distribution of high-impact parameters. Finally, through the example of a tropical cyclone damage index, direct impact assessments are presented as powerful tools that distill complex datasets into concise statements on likely impact, and as highly effective communication devices.
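The simplest treatment of bias in driving data is to remove the climatological model-minus-observation offset before downscaling. A minimal sketch of such a mean-shift correction (more sophisticated schemes correct variance and higher moments too; all values here are hypothetical):

```python
def mean_bias_correct(driving, model_clim, obs_clim):
    """Shift driving data by the model's mean bias over a common
    historical period (a mean-only correction; illustrative values)."""
    bias = sum(model_clim) / len(model_clim) - sum(obs_clim) / len(obs_clim)
    return [x - bias for x in driving]
```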

  17. Internal consistency of the CHAMPS physical activity questionnaire for Spanish speaking older adults.

    PubMed

    Rosario, Martín G; Vázquez, Jenniffer M; Cruz, Wanda I; Ortiz, Alexis

    2008-09-01

    The Community Healthy Activities Model Program for Seniors (CHAMPS) is a physical activity monitoring questionnaire for people between 65 and 90 years old. This questionnaire had previously been translated into Spanish for use in the Latin American population. The aim was to adapt the Spanish version of the CHAMPS questionnaire to Puerto Rico and assess its internal consistency. An external review committee adapted the existing Spanish version of the CHAMPS for use in the Puerto Rican population. Three older adults participated in a second phase with the purpose of training the research team. After the second phase, 35 older adults participated in a third content adaptation phase, in which the preliminary Spanish version for Puerto Rico of the CHAMPS was given to the 35 participants to assess clarity, vocabulary and understandability. Interviews with each participant in the third phase were carried out to obtain feedback and create a final Spanish version of the CHAMPS for Puerto Rico. After analyses of this phase, the external review committee prepared a final Spanish version of the CHAMPS for Puerto Rico. The final version was administered to 15 older adults (76 +/- 6.5 years) to assess the internal consistency using Cronbach's alpha analysis. The questionnaire showed a strong internal consistency of 0.76. The total time to answer the questionnaire was 17.4 minutes. The Spanish version of the CHAMPS questionnaire for Puerto Rico thus appears to be an easy-to-administer and consistent measurement tool for assessing physical activity in older adults.
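The internal-consistency figure reported above is Cronbach's alpha, computed from item variances and the variance of the total score. A minimal sketch with made-up response data, not the study's:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha; responses has one row per respondent,
    one column per item (data below are illustrative)."""
    k = len(responses[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var(col) for col in zip(*responses)]
    total_var = var([sum(row) for row in responses])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)
```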

  18. Construction risk assessment of deep foundation pit in metro station based on G-COWA method

    NASA Astrophysics Data System (ADS)

    You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying

    2018-05-01

    In order to gain an accurate understanding of the construction safety of deep foundation pits in metro stations and to reduce the probability and loss of risk occurrence, a risk assessment method based on G-COWA is proposed. Firstly, drawing on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system based on the five factors of "human, management, technology, material and environment" is established. Secondly, the C-OWA operator is introduced to weight the evaluation indexes and weaken the negative influence of subjective expert preference. Gray cluster analysis and the fuzzy comprehensive evaluation method are combined to construct the construction risk assessment model for deep foundation pits, which can effectively handle the uncertainties. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station; its construction risk rating is determined to be "medium", and the model is shown to be feasible and reasonable. Corresponding control measures are then put forward, providing a useful reference.
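The fuzzy comprehensive evaluation step can be sketched as combining per-index weights (here hypothetical stand-ins for the C-OWA-derived weights) with a membership matrix over risk grades; the largest component of the result gives the rating:

```python
def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation with the weighted-average operator.

    weights: per-index weights summing to 1 (hypothetical values below);
    membership[i][j]: degree to which index i supports risk grade j.
    """
    grades = len(membership[0])
    b = [sum(w * row[j] for w, row in zip(weights, membership))
         for j in range(grades)]
    s = sum(b)
    return [x / s for x in b]  # normalized grade vector
```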

  19. Assessment of health surveys: fitting a multidimensional graded response model.

    PubMed

    Depaoli, Sarah; Tiemensma, Jitske; Felt, John M

    The multidimensional graded response model, an item response theory (IRT) model, can be used to improve the assessment of surveys, even when sample sizes are restricted. Typically, health-based survey development utilizes classical statistical techniques (e.g. reliability and factor analysis). In a review of four prominent journals within the field of Health Psychology, we found that IRT-based models were used in less than 10% of the studies examining scale development or assessment. However, implementing IRT-based methods can provide more details about individual survey items, which is useful when determining the final item content of surveys. An example using a quality of life survey for Cushing's syndrome (CushingQoL) highlights the main components for implementing the multidimensional graded response model. Patients with Cushing's syndrome (n = 397) completed the CushingQoL. Results from the multidimensional graded response model supported a 2-subscale scoring process for the survey. All items were deemed as worthy contributors to the survey. The graded response model can accommodate unidimensional or multidimensional scales, be used with relatively lower sample sizes, and is implemented in free software (example code provided in online Appendix). Use of this model can help to improve the quality of health-based scales being developed within the Health Sciences.
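In the graded response model, each ordered response category gets its probability as the difference of adjacent cumulative two-parameter logistic curves. A minimal unidimensional sketch; the discrimination and thresholds below are illustrative, not CushingQoL estimates:

```python
import math

def grm_category_probs(theta, a, thresholds):
    """P(response = k | theta) for ordered categories 0..len(thresholds).

    a: item discrimination; thresholds: ordered category thresholds
    (illustrative values below, not fitted CushingQoL parameters).
    """
    def p_star(b):  # cumulative probability of responding at or above b
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]
```

The multidimensional version replaces a*(theta - b) with a weighted combination of several latent traits, which is what supports the 2-subscale scoring described above.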

  20. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    USGS Publications Warehouse

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
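A goodness-of-fit comparison between simulated and observed flow measurements is often summarized as a root-mean-square misfit; a minimal sketch (the benchmarking study's exact metrics may differ):

```python
def rmse(simulated, observed):
    # Root-mean-square misfit, e.g. between simulated and measured
    # flow thickness at matched points along a flow.
    n = len(observed)
    return (sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n) ** 0.5
```

Reporting such a misfit alongside wall-clock cost per simulation is what allows the accuracy-versus-computational-cost trade-off described above to guide code selection.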

  1. Non-linear assessment and deficiency of linear relationship for healthcare industry

    NASA Astrophysics Data System (ADS)

    Nordin, N.; Abdullah, M. M. A. B.; Razak, R. C.

    2017-09-01

    This paper presents the development of a non-linear service satisfaction model that assumes patients are not necessarily satisfied or dissatisfied with good or poor service delivery. Accordingly, compliment and complaint assessments are considered simultaneously. Non-linear service satisfaction instruments, called Kano-Q and Kano-SS, are developed based on the Kano model and the Theory of Quality Attributes (TQA) to map unexpected, hidden and unspoken patient satisfaction and dissatisfaction onto service quality attributes. A new Kano-Q and Kano-SS algorithm for quality attribute assessment, developed from satisfaction impact theories, was found to satisfy the reliability and validity tests. The results were also validated against the standard Kano model procedure before the Kano model and Quality Function Deployment (QFD) were integrated for patient attribute and service attribute prioritization. An algorithm for the Kano-QFD matrix operation is developed to compose the prioritized complaint and compliment indexes. Finally, the prioritized service attributes are mapped to service delivery categories to determine the service delivery that the healthcare provider should improve first.

  2. No-Reference Image Quality Assessment by Wide-Perceptual-Domain Scorer Ensemble Method.

    PubMed

    Liu, Tsung-Jung; Liu, Kuan-Hsien

    2018-03-01

    A no-reference (NR) learning-based approach to assessing image quality is presented in this paper. The devised features are extracted from wide perceptual domains, including brightness, contrast, color, distortion, and texture. These features are used to train a model (scorer) which can predict scores. Scorer selection algorithms are utilized to help simplify the proposed system. In the final stage, an ensemble method is used to combine the prediction results from the selected scorers. Two multiple-scale versions of the proposed approach are also presented along with the single-scale one; they turn out to perform better than the original single-scale method. Because the approach draws features from five different domains at multiple image scales and uses the outputs (scores) from selected score prediction models as features for multi-scale or cross-scale fusion (i.e., ensemble), the proposed NR image quality assessment models are robust with respect to more than 24 image distortion types. They can also be used to evaluate images with authentic distortions. Extensive experiments on three well-known and representative databases confirm the performance robustness of our proposed model.
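The final ensemble stage combines the predictions of the selected scorers; simple averaging is one common choice. The scorers below are stand-in functions, not the paper's trained models:

```python
def ensemble_score(scorers, features):
    """Combine quality scores from the selected scorers by averaging.
    Each scorer maps a feature vector to a predicted quality score."""
    predictions = [score(features) for score in scorers]
    return sum(predictions) / len(predictions)
```

Weighted averaging or a learned fusion model over the scorer outputs (as the cross-scale fusion described above suggests) are natural refinements of this baseline.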

  3. Analysis of psychological factors for quality assessment of interactive multimodal service

    NASA Astrophysics Data System (ADS)

    Yamagishi, Kazuhisa; Hayashi, Takanori

    2005-03-01

    We proposed a subjective quality assessment model for interactive multimodal services. First, psychological factors of an audiovisual communication service were extracted by using the semantic differential (SD) technique and factor analysis. Forty subjects participated in subjective tests and performed point-to-point conversational tasks on a PC-based TV phone that exhibits various network qualities. The subjects assessed those qualities on the basis of 25 pairs of adjectives. Two psychological factors, i.e., an aesthetic feeling and a feeling of activity, were extracted from the results. Then, quality impairment factors affecting these two psychological factors were analyzed. We found that the aesthetic feeling is mainly affected by IP packet loss and video coding bit rate, and the feeling of activity depends on delay time and video frame rate. We then proposed an opinion model derived from the relationships among quality impairment factors, psychological factors, and overall quality. The results indicated that the estimation error of the proposed model is almost equivalent to the statistical reliability of the subjective score. Finally, using the proposed model, we discuss guidelines for quality design of interactive audiovisual communication services.

  4. Constrained recycling: a framework to reduce landfilling in developing countries.

    PubMed

    Diaz, Ricardo; Otoma, Suehiro

    2013-01-01

    This article presents a model that integrates three branches of research: (i) economics of solid waste that assesses consumer's willingness to recycle and to pay for disposal; (ii) economics of solid waste that compares private and social costs of final disposal and recycling; and (iii) theories on personal attitudes and social influence. The model identifies two arenas where decisions are made: upstream arena, where residents are decision-makers, and downstream arena, where municipal authorities are decision-makers, and graphically proposes interactions between disposal and recycling, as well as the concept of 'constrained recycling' (an alternative to optimal recycling) to guide policy design. It finally concludes that formative instruments, such as environmental education and benchmarks, should be combined with economic instruments, such as subsidies, to move constraints on source separation and recycling in the context of developing countries.

  5. Mastoidectomy performance assessment of virtual simulation training using final-product analysis.

    PubMed

    Andersen, Steven A W; Cayé-Thomasen, Per; Sørensen, Mads S

    2015-02-01

    The future development of integrated automatic assessment in temporal bone virtual surgical simulators calls for validation against currently established assessment tools. This study aimed to explore the relationship between mastoidectomy final-product performance assessment in virtual simulation and traditional dissection training. Prospective trial with blinding. A total of 34 novice residents performed a mastoidectomy on the Visible Ear Simulator and on a cadaveric temporal bone. Two blinded senior otologists assessed the final-product performance using a modified Welling scale. The simulator gathered basic metrics on time, steps, and volumes in relation to the on-screen tutorial and collisions with vital structures. Substantial inter-rater reliability (kappa = 0.77) for virtual simulation and moderate inter-rater reliability (kappa = 0.59) for dissection final-product assessment was found. The simulation and dissection performance scores had significant correlation (P = .014). None of the basic simulator metrics correlated significantly with the final-product score except for number of steps completed in the simulator. A modified version of a validated final-product performance assessment tool can be used to assess mastoidectomy on virtual temporal bones. Performance assessment of virtual mastoidectomy could potentially save the use of cadaveric temporal bones for more advanced training when a basic level of competency in simulation has been achieved. NA. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
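The inter-rater reliabilities reported above (kappa = 0.77 and 0.59) use Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch with made-up ratings:

```python
def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters on the same items
    (categorical ratings; the data in the tests are illustrative)."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    categories = set(rater1) | set(rater2)
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1.0 - expected)
```

Kappa of 1 means perfect agreement; 0 means no better than chance, which is why 0.77 reads as "substantial" and 0.59 as "moderate" on the conventional Landis-Koch scale.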

  6. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens.

    PubMed

    Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B

    2003-05-25

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30 times by introducing a 2 log reduction of the number of Campylobacter on the chicken carcasses. To obtain a similar reduction of the incidence, the flock prevalence should be reduced approximately 30 times or the kitchen hygiene improved approximately 30 times. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will only have a minor influence on the risk. 
Finally, the simulations showed that people aged 18-29 years had the highest risk of developing campylobacteriosis.
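The risk-characterization step couples the exposure estimate with an approximate Beta-Poisson dose-response curve. A minimal sketch; the alpha and beta defaults below are illustrative placeholders, not the study's fitted values:

```python
def beta_poisson(dose, alpha=0.145, beta=7.59):
    # Approximate Beta-Poisson dose-response: probability of response
    # after ingesting `dose` organisms. Parameter defaults are
    # illustrative placeholders, not values from this assessment.
    return 1.0 - (1.0 + dose / beta) ** (-alpha)
```

Because the curve is steep at low doses and flattens at high doses, a 2 log reduction in carcass counts can cut predicted incidence far more than proportionally, consistent with the roughly 30-fold reduction reported above.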

  7. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding involving Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, so the interest of the scientific community in this process has grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized.
The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of uncertainty of the input parameters into the final shallow landslide hazard estimate was quantified. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) discriminating regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) assessing the reliability of the procedure used for hazard assessment.
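The Monte Carlo step over a physically based stability model can be sketched with the classical infinite-slope factor of safety; all parameter distributions below are hypothetical stand-ins, not the PDFs fitted in this study:

```python
import math
import random

def factor_of_safety(c, phi_deg, gamma, depth, slope_deg, m, gamma_w=9.81):
    """Infinite-slope factor of safety.

    c: cohesion (kPa); phi_deg: friction angle; gamma: soil unit weight
    (kN/m3); depth: deposit thickness (m); m: saturated fraction of depth.
    """
    beta, phi = math.radians(slope_deg), math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * depth * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

def failure_probability(trials=20000, seed=1):
    # Propagate (hypothetical) parameter uncertainty through the model:
    # the fraction of sampled parameter sets with FS < 1.
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        c = max(rng.gauss(5.0, 2.0), 0.0)     # cohesion, kPa
        phi = rng.gauss(32.0, 3.0)            # friction angle, degrees
        m = rng.uniform(0.0, 1.0)             # water-table fraction
        if factor_of_safety(c, phi, 18.0, 1.5, 35.0, m) < 1.0:
            failures += 1
    return failures / trials
```

Repeating this per map cell, with regionalized parameter PDFs, yields the probabilistic hazard map whose confidence the study then assesses.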

  8. Estimating severity of sideways fall using a generic multi linear regression model based on kinematic input variables.

    PubMed

    van der Zijden, A M; Groen, B E; Tanck, E; Nienhuis, B; Verdonschot, N; Weerdesteyn, V

    2017-03-21

    Many research groups have studied fall impact mechanics to understand how fall severity can be reduced to prevent hip fractures. Yet, direct impact force measurements with force plates are restricted to a very limited repertoire of experimental falls. The purpose of this study was to develop a generic model for estimating hip impact forces (i.e. fall severity) in in vivo sideways falls without the use of force plates. Twelve experienced judokas performed sideways Martial Arts (MA) and Block ('natural') falls on a force plate, both with and without a mat on top. Data were analyzed to determine the hip impact force and to derive 11 selected (subject-specific and kinematic) variables. Falls from kneeling height were used to perform a stepwise regression procedure to assess the effects of these input variables and build the model. The final model includes four input variables, involving one subject-specific measure and three kinematic variables: maximum upper body deceleration, body mass, shoulder angle at the instant of 'maximum impact' and maximum hip deceleration. The results showed that estimated and measured hip impact forces were linearly related (explained variances ranging from 46 to 63%). Hip impact forces of MA falls onto the mat from a standing position (3650±916N) estimated by the final model were comparable with measured values (3698±689N), even though these data were not used for training the model. In conclusion, a generic linear regression model was developed that enables the assessment of fall severity through kinematic measures of sideways falls, without using force plates. Copyright © 2017 Elsevier Ltd. All rights reserved.
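The final four-input model above is an ordinary least-squares fit. A generic sketch of multiple linear regression via the normal equations; the data in the test are synthetic, not the judoka measurements:

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations.

    X: rows of predictor values (include a column of 1s for the
    intercept); y: responses. Solves (X^T X) beta = X^T y by Gaussian
    elimination with partial pivoting.
    """
    p = len(X[0])
    A = [[sum(row[j] * row[k] for row in X) for k in range(p)]
         for j in range(p)]
    b = [sum(row[j] * yi for row, yi in zip(X, y)) for j in range(p)]
    for col in range(p):  # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):  # back substitution
        beta[r] = (b[r] - sum(A[r][k] * beta[k]
                              for k in range(r + 1, p))) / A[r][r]
    return beta
```

In the study's setting, the predictor columns would hold the four selected inputs (maximum upper body deceleration, body mass, shoulder angle, maximum hip deceleration) and y the measured hip impact force.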

  9. Final Environmental Assessment: Solar Panel Systems at Joint Base McGuire-Dix-Lakehurst New Jersey

    DTIC Science & Technology

    2012-03-01

    FINAL ENVIRONMENTAL ASSESSMENT: Solar Panel Systems at Joint Base McGuire-Dix-Lakehurst, New Jersey, March 2012... Purpose; Finding of No Significant Impact (FONSI); Environmental Assessment (EA

  10. Final Report: Demographic Tools for Climate Change and Environmental Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Neill, Brian

    2017-01-24

    This report summarizes work over the course of a three-year project (2012-2015, with one year no-cost extension to 2016). The full proposal detailed six tasks: Task 1: Population projection model Task 2: Household model Task 3: Spatial population model Task 4: Integrated model development Task 5: Population projections for Shared Socio-economic Pathways (SSPs) Task 6: Population exposure to climate extremes We report on all six tasks, provide details on papers that have appeared or been submitted as a result of this project, and list selected key presentations that have been made within the university community and at professional meetings.

  11. A business model analysis of telecardiology service.

    PubMed

    Lin, Shu-Hsia; Liu, Jorn-Hon; Wei, Jen; Yin, Wei-Hsian; Chen, Hung-Hsin; Chiu, Wen-Ta

    2010-12-01

    Telecare has become an increasingly common medical service in recent years. However, a new service must be close to the market and market-driven to have a high likelihood of success. This article analyzes the business model of a telecardiology service managed by a general hospital. The methodology of the article is as follows: (1) it first describes the elements of the service based on the ontology of the business model, (2) it then transfers these elements into the choices for the business model's dynamic loops and examines their validity, and (3) it finally provides an empirical financial analysis of the service to assess its profit-making possibilities.

  12. Computer simulation of the probability that endangered whales will interact with oil spills, Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, M.; Jayko, K.; Bowles, A.

    1986-10-01

    A numerical model system was developed to quantitatively assess the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. The system comprises bowhead and gray whale migration and diving-surfacing models together with an oil-spill-trajectory model. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The distribution of animals is represented in space and time by discrete points, each of which may represent one or more whales. The movement of a whale point is governed by a random-walk algorithm that stochastically follows a migratory pathway.
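    The random-walk step for a whale point can be sketched as follows. This is a minimal illustration under assumed parameters (heading scatter, step speed), not the report's calibrated algorithm.

```python
import numpy as np

# Sketch of a stochastic random-walk step biased along a migratory
# pathway: each whale point moves with a mean heading plus random
# angular scatter. Parameter values are illustrative assumptions.
rng = np.random.default_rng(42)

def step_whales(positions, mean_heading_rad, speed_km_per_step,
                heading_sd_rad=0.3):
    """Advance each (x, y) whale point one time step."""
    n = positions.shape[0]
    headings = rng.normal(mean_heading_rad, heading_sd_rad, n)
    dx = speed_km_per_step * np.cos(headings)
    dy = speed_km_per_step * np.sin(headings)
    return positions + np.column_stack([dx, dy])

# 100 whale points migrating with a net eastward drift, 5 km per step
pts = np.zeros((100, 2))
for _ in range(50):
    pts = step_whales(pts, mean_heading_rad=0.0, speed_km_per_step=5.0)
print(pts[:, 0].mean())  # mean eastward displacement in km
```

    In the full system, each point's position would then be compared against the oil-spill-trajectory model's slick location to score encounters.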

  13. Assessment of variation in the Alberta Context Tool: the contribution of unit-level contextual factors and specialty in Canadian pediatric acute care settings

    PubMed Central

    2011-01-01

    Background There are few validated measures of organizational context and none that we located are parsimonious and address modifiable characteristics of context. The Alberta Context Tool (ACT) was developed to meet this need. The instrument assesses 8 dimensions of context, which comprise 10 concepts. The purpose of this paper is to report evidence to further the validity argument for ACT. The specific objectives of this paper are to: (1) examine the extent to which the 10 ACT concepts discriminate between patient care units and (2) identify variables that significantly contribute to between-unit variation for each of the 10 concepts. Methods 859 professional nurses (844 valid responses) working in medical, surgical and critical care units of 8 Canadian pediatric hospitals completed the ACT. A random intercept, fixed effects hierarchical linear modeling (HLM) strategy was used to quantify and explain variance in the 10 ACT concepts to establish the ACT's ability to discriminate between units. We ran 40 models (a series of 4 models for each of the 10 concepts) in which we systematically assessed the unique contribution (i.e., error variance reduction) of different variables to between-unit variation. First, we constructed a null model in which we quantified the variance overall, in each of the concepts. Then we controlled for the contribution of individual level variables (Model 1). In Model 2, we assessed the contribution of practice specialty (medical, surgical, critical care) to variation since it was central to construction of the sampling frame for the study. Finally, we assessed the contribution of additional unit level variables (Model 3). Results The null model (unadjusted baseline HLM model) established that there was significant variation between units in each of the 10 ACT concepts (i.e., discrimination between units). When we controlled for individual characteristics, significant variation in the 10 concepts remained. 
Assessment of the contribution of specialty to between-unit variation enabled us to explain more variance (1.19% to 16.73%) in 6 of the 10 ACT concepts. Finally, when we assessed the unique contribution of the unit-level variables available to us, we were able to explain additional variance (15.91% to 73.25%) in 7 of the 10 ACT concepts. Conclusion The findings reported here represent the third published argument for validity of the ACT and add to the evidence supporting its use to discriminate patient care units by all 10 contextual factors. We found evidence of relationships between a variety of individual and unit-level variables that explained much of the between-unit variation for each of the 10 ACT concepts. Future research will include examination of the relationships between the ACT's contextual factors and research utilization by nurses, and ultimately the relationships between context, research utilization, and outcomes for patients. PMID:21970404
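    The core of the discrimination argument above is variance partitioning: how much of the score variance for a concept lies between units rather than within them. That logic can be illustrated with simulated data and a simple one-way ANOVA estimator of the intraclass correlation (ICC); the unit counts and variance components below are hypothetical, not the study's.

```python
import numpy as np

# Toy illustration: simulate nurses nested in units, then estimate how
# much of the variance in one concept's scores lies between units.
rng = np.random.default_rng(1)
n_units, n_per_unit = 30, 25
unit_effects = rng.normal(0, 0.5, n_units)          # between-unit SD = 0.5
scores = unit_effects[:, None] + rng.normal(0, 1.0, (n_units, n_per_unit))

grand_mean = scores.mean()
unit_means = scores.mean(axis=1)
# mean squares between and within units (one-way ANOVA)
msb = n_per_unit * np.sum((unit_means - grand_mean) ** 2) / (n_units - 1)
msw = np.sum((scores - unit_means[:, None]) ** 2) / (n_units * (n_per_unit - 1))
var_between = max((msb - msw) / n_per_unit, 0.0)
icc = var_between / (var_between + msw)
print(f"ICC (between-unit share of variance): {icc:.2f}")
```

    A nonzero ICC is what an HLM null model detects as significant between-unit variation; the study's Models 1-3 then ask how much of that variation individual, specialty, and unit-level covariates can explain.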

  14. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. 
Finally, we compared the "classical" risk assessment approach with the model-based approach. These comparisons showed that TK and TK-TD models can bring more realism to the risk assessment by making it possible to study realistic exposure scenarios and to simulate relevant mechanisms of effects (including delayed toxicity and recovery). Notably, using TK-TD models is currently the most relevant way to directly connect realistic exposure patterns to effects. We conclude with recommendations on how to properly use TK and TK-TD models in acute risk assessment for vertebrates. © 2015 SETAC.
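    The TK-TD structure in example 2 — external pulsed exposure driving an internal concentration, which in turn drives a survival hazard — can be sketched with a one-compartment model. This is an illustrative simplification, not the case study's calibrated fathead minnow model; all parameter values are hypothetical.

```python
import numpy as np

# TK: first-order uptake/elimination; TD: stochastic-death hazard once
# the internal concentration exceeds a threshold. Forward Euler in time.
dt = 0.1                      # time step (d)
t = np.arange(0, 20, dt)
c_ext = np.where((t >= 2) & (t < 4), 10.0, 0.0)   # pulsed exposure (ug/L)

ku, ke = 0.8, 0.4             # uptake and elimination rates (1/d), assumed
threshold, kk = 5.0, 0.05     # effect threshold and killing rate, assumed

c_int = np.zeros_like(t)
surv = np.ones_like(t)
for i in range(1, len(t)):
    # TK: dC_int/dt = ku*C_ext - ke*C_int
    c_int[i] = c_int[i-1] + dt * (ku * c_ext[i-1] - ke * c_int[i-1])
    # TD: hazard accrues while internal concentration exceeds threshold
    hazard = kk * max(c_int[i] - threshold, 0.0)
    surv[i] = surv[i-1] * np.exp(-hazard * dt)

print(f"peak internal conc.: {c_int.max():.2f}, final survival: {surv[-1]:.2f}")
```

    Because the hazard depends on the internal rather than the external concentration, the sketch reproduces the delayed toxicity and recovery behavior the abstract highlights: mortality continues after the pulse ends until elimination brings the internal concentration back below the threshold.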

  15. Degradation Factor Approach for Impacted Composite Structural Assessment: MSFC Center Director's Discretionary Fund Final Report, Project No. 96-17

    NASA Technical Reports Server (NTRS)

    Ortega, R.; Price, J. M.; Fox, D.

    2000-01-01

    This technical memorandum documents the results of the research to develop a concept for assessing the structural integrity of impacted composite structures using the strength degradation factor in conjunction with available finite element tools. For this purpose, a literature search was conducted, a plan for conducting impact testing on two laminates was developed, and a finite element model of the impact process was created. Specimens for the impact testing were fabricated to support the impact testing plan.

  16. Prediction of risk of recurrence of venous thromboembolism following treatment for a first unprovoked venous thromboembolism: systematic review, prognostic model and clinical decision rule, and economic evaluation.

    PubMed

    Ensor, Joie; Riley, Richard D; Jowett, Sue; Monahan, Mark; Snell, Kym Ie; Bayliss, Susan; Moore, David; Fitzmaurice, David

    2016-02-01

    Unprovoked first venous thromboembolism (VTE) is defined as VTE in the absence of a temporary provoking factor such as surgery, immobility and other temporary factors. Recurrent VTE in unprovoked patients is highly prevalent, but easily preventable with oral anticoagulant (OAC) therapy. The unprovoked population is highly heterogeneous in terms of risk of recurrent VTE. The first aim of the project is to review existing prognostic models which stratify individuals by their recurrence risk, therefore potentially allowing tailored treatment strategies. The second aim is to enhance the existing research in this field, by developing and externally validating a new prognostic model for individual risk prediction, using a pooled database containing individual patient data (IPD) from several studies. The final aim is to assess the economic cost-effectiveness of the proposed prognostic model if it is used as a decision rule for resuming OAC therapy, compared with current standard treatment strategies. Standard systematic review methodology was used to identify relevant prognostic model development, validation and cost-effectiveness studies. Bibliographic databases (including MEDLINE, EMBASE and The Cochrane Library) were searched using terms relating to the clinical area and prognosis. Reviewing was undertaken by two reviewers independently using pre-defined criteria. Included full-text articles were data extracted and quality assessed. Critical appraisal of included full texts was undertaken and comparisons made of model performance. A prognostic model was developed using IPD from the pooled database of seven trials. A novel internal-external cross-validation (IECV) approach was used to develop and validate a prognostic model, with external validation undertaken in each of the trials iteratively. Given good performance in the IECV approach, a final model was developed using all trials data. 
A Markov patient-level simulation was used to consider the economic cost-effectiveness of using a decision rule (based on the prognostic model) to decide on resumption of OAC therapy (or not). Three full-text articles were identified by the systematic review. Critical appraisal identified methodological and applicability issues; in particular, none of the three existing models had been externally validated. To address this, new prognostic models with external validation were sought. Two potential models were considered: one for use at cessation of therapy (pre D-dimer), and one for use after cessation of therapy (post D-dimer). Model performance measured in the external validation trials showed strong calibration for both models. The post D-dimer model performed substantially better in terms of discrimination (c = 0.69), better separating high- and low-risk patients. The economic evaluation identified that a decision rule based on the final post D-dimer model may be cost-effective for patients with predicted risk of recurrence of over 8% annually; this suggests continued therapy for patients with predicted risks ≥ 8% and cessation of therapy otherwise. The post D-dimer model performed strongly and could be useful to predict individuals' risk of recurrence at any time up to 2-3 years, thereby aiding patient counselling and treatment decisions. A decision rule using this model may be cost-effective for informing clinical judgement and patient opinion in treatment decisions. Further research may investigate new predictors to enhance model performance and should undertake further external validation to confirm performance in new, non-trial populations. Finally, it is essential that further research is conducted to develop a model predicting bleeding risk on therapy, to manage the balance between the risks of recurrence and bleeding. This study is registered as PROSPERO CRD42013003494. The National Institute for Health Research Health Technology Assessment programme.
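    The internal-external cross-validation (IECV) procedure — hold out each trial in turn, develop the model on the remaining trials, validate on the held-out trial, and refit on all pooled data if performance is adequate — can be sketched with simulated stand-in data. The trial structure, predictors, and linear model below are hypothetical, not the study's pooled IPD or final prognostic model.

```python
import numpy as np

# IECV sketch: simulate 7 trials of individual patient data, then rotate
# a held-out trial while developing on the rest.
rng = np.random.default_rng(7)
trials = {}
for k in range(7):
    n = 150
    X = rng.normal(size=(n, 3))             # stand-ins for 3 predictors
    y = X @ np.array([0.5, -0.3, 0.8]) + rng.normal(0, 1, n)
    trials[k] = (X, y)

for held_out in trials:
    # develop on all trials except the held-out one
    X_dev = np.vstack([trials[k][0] for k in trials if k != held_out])
    y_dev = np.concatenate([trials[k][1] for k in trials if k != held_out])
    A = np.column_stack([np.ones(len(X_dev)), X_dev])
    beta, *_ = np.linalg.lstsq(A, y_dev, rcond=None)

    # externally validate on the held-out trial
    X_val, y_val = trials[held_out]
    pred = np.column_stack([np.ones(len(X_val)), X_val]) @ beta
    r = np.corrcoef(pred, y_val)[0, 1]      # crude discrimination proxy
    print(f"trial {held_out} held out: r = {r:.2f}")

# Given adequate performance across all held-out trials, the final model
# would be refitted once on the full pooled data.
```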

  17. Use of artificial intelligence in supervisory control

    NASA Technical Reports Server (NTRS)

    Cohen, Aaron; Erickson, Jon D.

    1989-01-01

    Viewgraphs describing the design and testing of an intelligent decision support system called OFMspert are presented. In this expert system, knowledge about the human operator is represented through an operator/system model referred to as the OFM (Operator Function Model). OFMspert uses the blackboard model of problem solving to maintain a dynamic representation of operator goals, plans, tasks, and actions given previous operator actions and current system state. Results of an experiment to assess OFMspert's intent inferencing capability are outlined. Finally, the overall design philosophy for an intelligent tutoring system (OFMTutor) for operators of complex dynamic systems is summarized.

  18. Generation and performance assessment of the global TanDEM-X digital elevation model

    NASA Astrophysics Data System (ADS)

    Rizzoli, Paola; Martone, Michele; Gonzalez, Carolina; Wecklich, Christopher; Borla Tridon, Daniela; Bräutigam, Benjamin; Bachmann, Markus; Schulze, Daniel; Fritz, Thomas; Huber, Martin; Wessel, Birgit; Krieger, Gerhard; Zink, Manfred; Moreira, Alberto

    2017-10-01

    The primary objective of the TanDEM-X mission is the generation of a global, consistent, and high-resolution digital elevation model (DEM) with unprecedented global accuracy. The goal is achieved by exploiting the interferometric capabilities of the two twin SAR satellites TerraSAR-X and TanDEM-X, which fly in a close orbit formation, acting as an X-band single-pass interferometer. Between December 2010 and early 2015 all land surfaces were acquired at least twice, and difficult terrain up to seven or eight times. The acquisition strategy, data processing, and DEM calibration and mosaicking were systematically monitored and optimized throughout the entire mission duration in order to fulfill the specification. The processing of all data was completed in September 2016; this paper reports on the final performance of the TanDEM-X global DEM and presents the acquisition and processing strategy that made the final DEM quality achievable. The results confirm the outstanding global accuracy of the delivered product, which can now be utilized for both scientific and commercial applications.

  19. Hybrid life-cycle assessment of natural gas based fuel chains for transportation.

    PubMed

    Strømman, Anders Hammer; Solli, Christian; Hertwich, Edgar G

    2006-04-15

    This research compares the use of natural gas, methanol, and hydrogen as transportation fuels. These three fuel chains start with the extraction and processing of natural gas in the Norwegian North Sea and end with final use in Central Europe. The end use is passenger transportation with a sub-compact car that has an internal combustion engine for the natural gas case and a fuel cell for the methanol and hydrogen cases. The life cycle assessment is performed by combining a process based life-cycle inventory with economic input-output data. The analysis shows that the potential climate impacts are lowest for the hydrogen fuel scenario with CO2 deposition. The hydrogen fuel chain scenario has no significant environmental disadvantage compared to the other fuel chains. Detailed analysis shows that the construction of the car contributes significantly to most impact categories. Finally, it is shown how the application of a hybrid inventory model ensures a more complete inventory description compared to standard process-based life-cycle assessment. This is particularly significant for car construction which would have been significantly underestimated in this study using standard process life-cycle assessment alone.

  20. Workshop overview: approaches to the assessment of the allergenic potential of food from genetically modified crops.

    PubMed

    Ladics, Gregory S; Holsapple, Michael P; Astwood, James D; Kimber, Ian; Knippels, Leon M J; Helm, Ricki M; Dong, Wumin

    2003-05-01

    There is a need to assess the safety of foods deriving from genetically modified (GM) crops, including the allergenic potential of novel gene products. Presently, there is no single in vitro or in vivo model that has been validated for the identification or characterization of potential food allergens. Instead, the evaluation focuses on risk factors such as source of the gene (i.e., allergenic vs. nonallergenic sources), physicochemical and genetic comparisons to known allergens, and exposure assessments. The purpose of this workshop was to gather together researchers working on various strategies for assessing protein allergenicity: (1) to describe the current state of knowledge and progress that has been made in the development and evaluation of appropriate testing strategies and (2) to identify critical issues that must now be addressed. This overview begins with a consideration of the current issues involved in assessing the allergenicity of GM foods. The second section presents information on in vitro models of digestibility, bioinformatics, and risk assessment in the context of clinical prevention and management of food allergy. Data on rodent models are presented in the next two sections. Finally, nonrodent models for assessing protein allergenicity are discussed. Collectively, these studies indicate that significant progress has been made in developing testing strategies. However, further efforts are needed to evaluate and validate the sensitivity, specificity, and reproducibility of many of these assays for determining the allergenicity potential of GM foods.

  1. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided by whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. 
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics, and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  2. Parameterization models for pesticide exposure via crop consumption.

    PubMed

    Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier

    2012-12-04

    An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties), including their possible correlations, using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parameterizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations, with total deviations between a factor of 4 (potato) and a factor of 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.
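    The dominant role of time-to-harvest and degradation half-life can be illustrated with a first-order dissipation sketch. The functional form and parameter values below are illustrative assumptions for one compartment only, not the paper's fitted multi-compartment crop models.

```python
import numpy as np

# Hedged sketch of the kind of parametric relationship the paper
# describes: residue in harvest dominated by first-order dissipation
# between application and harvest. Values are illustrative assumptions.
def residue_per_kg_applied(t_harvest_d, half_life_crop_d,
                           initial_fraction_on_crop=0.3):
    """Fraction of applied pesticide mass remaining in the crop at harvest."""
    k = np.log(2) / half_life_crop_d          # dissipation rate (1/d)
    return initial_fraction_on_crop * np.exp(-k * t_harvest_d)

# residues fall steeply with time-to-harvest, the dominant parameter
for t_h in (1, 7, 14, 28):
    print(t_h, residue_per_kg_applied(t_h, half_life_crop_d=5.0))
```

    The full parametric models add analogous terms for the crop-surface and soil compartments and are then combined with food-processing reduction factors to estimate dietary exposure.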

  3. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow study of the relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for broadly estimating these parameters from basic mission characteristics, including the target distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model against past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data from completed NASA missions.

  4. Influence of Tension Stiffening on the Flexural Stiffness of Reinforced Concrete Circular Sections

    PubMed Central

    Morelli, Francesco; Amico, Cosimo; Salvatore, Walter; Squeglia, Nunziante; Stacul, Stefano

    2017-01-01

    Within this paper, the assessment of tension stiffening effects on a reinforced concrete element with a circular section subjected to axial and bending loads is presented. To this purpose, an enhancement of an analytical model already present in the current technical literature is proposed. The accuracy of the enhanced method is assessed by comparing the experimental results carried out in past research with the numerical ones obtained by the model. Finally, a parametric study is executed in order to study the influence of axial compressive force on the flexural stiffness of reinforced concrete elements characterized by a circular section, comparing the secant stiffness evaluated at yielding and at maximum resistance, with and without the effects of tension stiffening. PMID:28773028

  5. Influence of Tension Stiffening on the Flexural Stiffness of Reinforced Concrete Circular Sections.

    PubMed

    Morelli, Francesco; Amico, Cosimo; Salvatore, Walter; Squeglia, Nunziante; Stacul, Stefano

    2017-06-18

    Within this paper, the assessment of tension stiffening effects on a reinforced concrete element with a circular section subjected to axial and bending loads is presented. To this purpose, an enhancement of an analytical model already present in the current technical literature is proposed. The accuracy of the enhanced method is assessed by comparing the experimental results carried out in past research with the numerical ones obtained by the model. Finally, a parametric study is executed in order to study the influence of axial compressive force on the flexural stiffness of reinforced concrete elements characterized by a circular section, comparing the secant stiffness evaluated at yielding and at maximum resistance, with and without the effects of tension stiffening.

  6. Hawaii Island Groundwater Flow Model

    DOE Data Explorer

    Nicole Lautze

    2015-01-01

    Groundwater flow model for Hawaii Island. Data is from the following sources: Whittier, R.B., K. Rotzoll, S. Dhal, A.I. El-Kadi, C. Ray, G. Chen, and D. Chang. 2004. Hawaii Source Water Assessment Program Report – Volume II – Island of Hawaii Source Water Assessment Program Report. Prepared for the Hawaii Department of Health, Safe Drinking Water Branch. University of Hawaii, Water Resources Research Center. Updated 2008; and Whittier, R. and A.I. El-Kadi. 2014. Human and Environmental Risk Ranking of Onsite Sewage Disposal Systems For the Hawaiian Islands of Kauai, Molokai, Maui, and Hawaii – Final. Prepared by the University of Hawaii, Dept. of Geology and Geophysics for the State of Hawaii Dept. of Health, Safe Drinking Water Branch. September 2014.

  7. Kauai Groundwater Flow Model

    DOE Data Explorer

    Nicole Lautze

    2015-01-01

    Groundwater flow model for Kauai. Data is from the following sources: Whittier, R. and A.I. El-Kadi. 2014. Human and Environmental Risk Ranking of Onsite Sewage Disposal Systems For the Hawaiian Islands of Kauai, Molokai, Maui, and Hawaii – Final. Prepared by the University of Hawaii, Dept. of Geology and Geophysics for the State of Hawaii Dept. of Health, Safe Drinking Water Branch. September 2014.; and Whittier, R.B., K. Rotzoll, S. Dhal, A.I. El-Kadi, C. Ray, G. Chen, and D. Chang. 2004. Hawaii Source Water Assessment Program Report – Volume IV – Island of Kauai Source Water Assessment Program Report. Prepared for the Hawaii Department of Health, Safe Drinking Water Branch. University of Hawaii, Water Resources Research Center. Updated 2015.

  8. East Maui Groundwater Flow Model

    DOE Data Explorer

    Nicole Lautze

    2015-01-01

    Groundwater flow model for East Maui. Data is from the following sources: Whittier, R. and A.I. El-Kadi. 2014. Human and Environmental Risk Ranking of Onsite Sewage Disposal Systems For the Hawaiian Islands of Kauai, Molokai, Maui, and Hawaii – Final. Prepared by the University of Hawaii, Dept. of Geology and Geophysics for the State of Hawaii Dept. of Health, Safe Drinking Water Branch. September 2014; and Whittier, R.B., K. Rotzoll, S. Dhal, A.I. El-Kadi, C. Ray, G. Chen, and D. Chang. 2004. Hawaii Source Water Assessment Program Report – Volume V – Island of Maui Source Water Assessment Program Report. Prepared for the Hawaii Department of Health, Safe Drinking Water Branch. University of Hawaii, Water Resources Research Center. Updated 2008.

  9. West Maui Groundwater Flow Model

    DOE Data Explorer

    Nicole Lautze

    2015-01-01

    Groundwater flow model for West Maui. Data is from the following sources: Whittier, R. and A.I. El-Kadi. 2014. Human and Environmental Risk Ranking of Onsite Sewage Disposal Systems For the Hawaiian Islands of Kauai, Molokai, Maui, and Hawaii – Final. Prepared by the University of Hawaii, Dept. of Geology and Geophysics for the State of Hawaii Dept. of Health, Safe Drinking Water Branch. September 2014; and Whittier, R.B., K. Rotzoll, S. Dhal, A.I. El-Kadi, C. Ray, G. Chen, and D. Chang. 2004. Hawaii Source Water Assessment Program Report – Volume V – Island of Maui Source Water Assessment Program Report. Prepared for the Hawaii Department of Health, Safe Drinking Water Branch. University of Hawaii, Water Resources Research Center. Updated 2008.

  10. The implications of free 3D scanning in the conservation state assessment of old wood painted icon

    NASA Astrophysics Data System (ADS)

    Munteanu, Marius; Sandu, Ion

    2016-06-01

    The present paper describes the conservation state and the making of a 3D model of an 18th-century Orthodox icon on a wood support, using freely available software and cloud computing. To create the 3D model of the icon's painting layer, 70 pictures were taken with a Nikon DSLR D3300 (24.2 MP) mounted on a Hama Star 75 photo tripod, in 360° loops around the painting at three different angles. The pictures were processed with Autodesk 123D Catch, which automatically finds and matches common features among all of the uploaded photographs to create the 3D scene, using the power and speed of cloud computing. The resulting 3D model was then analyzed and processed to obtain a final version, which can now be used to better identify, map, and prioritize future conservation processes, and can finally be shared online as an animation.

  11. An Analysis of a Comprehensive Evaluation Model for Guided Group Interaction Techniques with Juvenile Delinquents. Final Report.

    ERIC Educational Resources Information Center

    Silverman, Mitchell

    Reported are the first-phase activities of a longitudinal project designed to evaluate the effectiveness of the Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings concern the establishment of reliability for the main components of the Revised Behavior Scores System, developed to assess the…

  12. Design and Field Testing of a Systematic Procedure for Evaluating Vocational Programs. Final Report.

    ERIC Educational Resources Information Center

    Portland Public Schools, OR.

    The purpose of a project was to design and field-test a system for evaluating the adequacy of the vocational curriculum utilized by the Vocational Village, an alternative school for the training of individuals who have experienced failure in other educational settings. Focus was on the development of an evaluation model which will assess the…

  13. A Model for Implementing the Project Physics Course for Independent Study. Final Report.

    ERIC Educational Resources Information Center

    Bolin, Calvin

    Included are results of a study conducted to assess the possibilities and effectiveness of learning physics at high school level via independent study. The sample was drawn from a regular high school physics class. During the experiment, no instruction was carried out by any teacher. An auto-instructional system was developed and provided for use…

  14. A Model Vocational High Technology in Health Care Demonstration Project. Final Performance Report.

    ERIC Educational Resources Information Center

    Valencia Community Coll., Orlando, FL.

    A unique training program in high tech obstetrical, neonatal, and pediatric nursing care areas was designed to be offered on site at Orlando (Florida) Regional Medical/Arnold Palmer Hospital for Children and Women. The training program offered 16 different courses to 355 employees over the 18-month period of the project. A needs assessment was…

  15. Project Evaluation: Validation of a Scale and Analysis of Its Predictive Capacity

    ERIC Educational Resources Information Center

    Fernandes Malaquias, Rodrigo; de Oliveira Malaquias, Fernanda Francielle

    2014-01-01

    The objective of this study was to validate a scale for the assessment of academic projects. As a complement, we examined its predictive ability by comparing the scores of projects advised/corrected on the basis of the model with the final scores awarded to the work by an examining panel (approximately 10 months after the project design). Results of…

  16. Learned Helplessness in Children with Visual Handicaps: A Pilot Study of Expectations, Persistence, and Attributions. Final Report.

    ERIC Educational Resources Information Center

    Head, Dan; And Others

    This report describes the outcomes of a one-year federally funded pilot study of 14 students with low vision or blindness (grades 3-6) and 13 teachers. The study was designed to generate practical classroom assessment procedures for measuring "learned helplessness" and recommendations for a conceptual intervention model for use in the classroom.…

  17. Participatory flood vulnerability assessment: a multi-criteria approach

    NASA Astrophysics Data System (ADS)

    Madruga de Brito, Mariana; Evers, Mariele; Delos Santos Almoradie, Adrian

    2018-01-01

    This paper presents a participatory multi-criteria decision-making (MCDM) approach for flood vulnerability assessment that accounts for the relationships between vulnerability criteria. The applicability of the proposed framework is demonstrated in the municipalities of Lajeado and Estrela, Brazil. The model was co-constructed by 101 experts from governmental organizations, universities, research institutes, NGOs, and private companies. Participatory methods such as Delphi surveys, focus groups, and workshops were applied. A participatory problem-structuring exercise, in which the modellers worked closely with end users, was used to establish the structure of the vulnerability index. Each participant's preferences regarding the importance of the criteria were spatially modelled through the analytical hierarchy process (AHP) and analytical network process (ANP) multi-criteria methods. Experts were also involved at the end of the modelling exercise for validation. The final product is a set of individual and group flood vulnerability maps. Both AHP and ANP proved effective for flood vulnerability assessment; however, ANP is preferred because it considers the dependences among criteria. The participatory approach enabled experts to learn from each other and to acknowledge different perspectives, fostering social learning. The findings highlight that, to enhance the credibility and adoption of model results, multiple viewpoints should be integrated without forcing consensus.
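    The AHP weighting step mentioned above can be sketched numerically: the criterion weights are the principal eigenvector of a pairwise-comparison matrix, checked against Saaty's consistency ratio. The 3×3 matrix below is hypothetical and illustrative only; the study's actual criteria and expert judgements differ.

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix for three vulnerability
    # criteria (Saaty's 1-9 scale; A[i, j] = importance of i over j).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 3.0],
        [1/5, 1/3, 1.0],
    ])

    # AHP weights = principal right eigenvector of A, normalised to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
    # with Saaty's random index RI = 0.58 for n = 3.
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.58

    print(w.round(3), round(cr, 3))  # CR < 0.1 => judgements acceptably consistent
    ```

    With these judgements the first criterion dominates (weight near 0.64); ANP would extend this by adding dependence links between criteria in a supermatrix.
    
    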

  18. Rainfall-induced Landslide Susceptibility assessment at the Longnan county

    NASA Astrophysics Data System (ADS)

    Hong, Haoyuan; Zhang, Ying

    2017-04-01

    Landslides are a serious hazard in Longnan County, China, making landslide susceptibility assessment a useful tool for government decision making. The main objective of this study is to apply and compare frequency ratio, support vector machine, and logistic regression models. Longnan County (Jiangxi Province, China) was selected as the case study. First, a landslide inventory map with 354 landslide locations was constructed; the locations were then randomly split 70/30 into training and validation sets. Second, fourteen landslide conditioning factors were prepared: slope, aspect, altitude, topographic wetness index (TWI), stream power index (SPI), sediment transport index (STI), plan curvature, lithology, distance to faults, distance to rivers, distance to roads, land use, normalized difference vegetation index (NDVI), and rainfall. Using the frequency ratio, support vector machine, and logistic regression methods, three landslide susceptibility models were constructed. Finally, the overall performance of the resulting models was assessed and compared using the receiver operating characteristic (ROC) curve. The results showed that the support vector machine model performed best in the study area, with a success rate of 88.39% and a prediction rate of 84.06%.
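    The success and prediction rates quoted above are areas under the ROC curve on the training and validation subsets, respectively. The evaluation step can be sketched with the rank-statistic identity for AUC; the labels and susceptibility scores below are toy values, not the study's data.

    ```python
    import numpy as np

    def roc_auc(labels, scores):
        """ROC AUC via the Mann-Whitney identity: the probability that a
        randomly chosen landslide cell scores higher than a randomly chosen
        non-landslide cell, counting ties as 1/2."""
        labels = np.asarray(labels, dtype=bool)
        scores = np.asarray(scores, dtype=float)
        pos, neg = scores[labels], scores[~labels]
        # Explicit pairwise comparison; fine for small illustrative samples.
        wins = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (wins + 0.5 * ties) / (len(pos) * len(neg))

    # Toy example: 1 = landslide cell, 0 = stable cell.
    labels = [1, 1, 1, 0, 0, 0]
    scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
    print(round(roc_auc(labels, scores), 3))  # → 0.889
    ```

    In practice the same computation is applied twice: once on the 70% training cells (success rate) and once on the held-out 30% (prediction rate).
    
    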

  19. Linking Aerosol Optical Properties Between Laboratory, Field, and Model Studies

    NASA Astrophysics Data System (ADS)

    Murphy, S. M.; Pokhrel, R. P.; Foster, K. A.; Brown, H.; Liu, X.

    2017-12-01

    The optical properties of aerosol emissions from biomass burning have a significant impact on the Earth's radiative balance. Based on measurements made during the Fourth Fire Lab in Missoula Experiment, our group published a series of parameterizations that related optical properties (single scattering albedo and absorption due to brown carbon at multiple wavelengths) to the elemental to total carbon ratio of aerosols emitted from biomass burning. In this presentation, the ability of these parameterizations to simulate the optical properties of ambient aerosol is assessed using observations collected in 2017 from our mobile laboratory chasing wildfires in the Western United States. The ambient data includes measurements of multi-wavelength absorption, scattering, and extinction, size distribution, chemical composition, and volatility. In addition to testing the laboratory parameterizations, this combination of measurements allows us to assess the ability of core-shell Mie Theory to replicate observations and to assess the impact of brown carbon and mixing state on optical properties. Finally, both laboratory and ambient data are compared to the optical properties generated by a prominent climate model (Community Earth System Model (CESM) coupled with the Community Atmosphere Model (CAM 5)). The discrepancies between lab observations, ambient observations and model output will be discussed.

  20. The NASA Space Radiobiology Risk Assessment Project

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee

    The current first phase (2006-2011) has three major goals: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life table and radiation quality functions; 2) integrating biophysical models of acute radiation syndromes; and 3) developing new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and of relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point on continuing NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a Web-based data resource of NSRL results, and a space radiation Wikipedia are described.

  1. Predicting Soil Organic Carbon and Total Nitrogen in the Russian Chernozem from Depth and Wireless Color Sensor Measurements

    NASA Astrophysics Data System (ADS)

    Mikhailova, E. A.; Stiglitz, R. Y.; Post, C. J.; Schlautman, M. A.; Sharp, J. L.; Gerard, P. D.

    2017-12-01

    Color sensor technologies offer opportunities for affordable and rapid assessment of soil organic carbon (SOC) and total nitrogen (TN) in the field, but the applicability of these technologies may vary by soil type. The objective of this study was to use an inexpensive color sensor to develop SOC and TN prediction models for the Russian Chernozem (Haplic Chernozem) in the Kursk region of Russia. Twenty-one dried soil samples were analyzed using a Nix Pro™ color sensor, controlled through a mobile application over Bluetooth, to collect CIE L*a*b* (darkness to lightness, green to red, and blue to yellow) color data. Eleven samples were randomly selected to construct prediction models and the remaining ten were set aside for cross validation, with the root mean squared error (RMSE) used to quantify each model's prediction error. The data from the eleven soil samples were used to develop prediction models for the natural log of SOC (lnSOC) and TN (lnTN), using depth, L*, a*, and b* for each sample as predictor variables in regression analyses. Residual plots, RMSE, mean squared prediction error (MSPE), and coefficients of determination (R², adjusted R²) were used to assess model fit for each of the lnSOC and lnTN prediction models. Final models were fit using all soil samples, with depth and color variables as predictors, for lnSOC (R² = 0.987, adj. R² = 0.981, RMSE = 0.003, p < 0.001, MSPE = 0.182) and lnTN (R² = 0.980, adj. R² = 0.972, RMSE = 0.004, p < 0.001, MSPE = 0.001). Additionally, final models were fit for all soil samples using only color variables, for lnSOC (R² = 0.959, adj. R² = 0.949, RMSE = 0.007, p < 0.001, MSPE = 0.536) and lnTN (R² = 0.912, adj. R² = 0.890, RMSE = 0.015, p < 0.001, MSPE = 0.001). The results suggest that soil color may be used for rapid assessment of SOC and TN in these agriculturally important soils.
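    The model-fitting step can be sketched as an ordinary least-squares regression of lnSOC on depth and the three colour coordinates, with fit statistics computed as in the abstract (R², RMSE). The data below are synthetic stand-ins with an assumed darker-equals-richer relationship, not the Kursk samples.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: 21 samples, as in the study, with lnSOC
    # decreasing for deeper and lighter (higher L*) soil. Illustrative only.
    n = 21
    depth = rng.uniform(0, 100, n)       # sampling depth (cm)
    L = rng.uniform(20, 60, n)           # CIE L* (lightness)
    a = rng.uniform(2, 10, n)            # CIE a* (green-red)
    b = rng.uniform(5, 25, n)            # CIE b* (blue-yellow)
    ln_soc = 1.5 - 0.01 * depth - 0.02 * L + 0.01 * a + rng.normal(0, 0.05, n)

    # OLS fit: lnSOC ~ intercept + depth + L* + a* + b*
    X = np.column_stack([np.ones(n), depth, L, a, b])
    beta, *_ = np.linalg.lstsq(X, ln_soc, rcond=None)
    pred = X @ beta

    # Fit statistics reported in the abstract.
    rmse = np.sqrt(np.mean((ln_soc - pred) ** 2))
    ss_res = np.sum((ln_soc - pred) ** 2)
    ss_tot = np.sum((ln_soc - ln_soc.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    print(f"R2 = {r2:.3f}, RMSE = {rmse:.3f}")
    ```

    The colour-only variant in the abstract corresponds to dropping the depth column from `X` and refitting.
    
    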

  2. Final Environmental Assessment, Horse Creek Bridge Replacement

    DTIC Science & Technology

    2010-10-01

    Final Environmental Assessment, Horse Creek Bridge Replacement, prepared for the 78th Civil Engineer Group. Includes a Finding of No Significant Impact (FONSI) and a Finding of No Practicable Alternative (FONPA) for the Horse Creek bridge replacement.

  3. Regression models for predicting peak and continuous three-dimensional spinal loads during symmetric and asymmetric lifting tasks.

    PubMed

    Fathallah, F A; Marras, W S; Parnianpour, M

    1999-09-01

    Most biomechanical assessments of spinal loading during industrial work have focused on estimating peak spinal compressive forces under static and sagittally symmetric conditions. The main objective of this study was to explore the potential of feasibly predicting three-dimensional (3D) spinal loading in industry from various combinations of trunk kinematics, kinetics, and subject-load characteristics. The study used spinal loading, predicted by a validated electromyography-assisted model, from 11 male participants who performed a series of symmetric and asymmetric lifts. Three classes of models were developed: (a) models using workplace, subject, and trunk motion parameters as independent variables (kinematic models); (b) models using workplace, subject, and measured moments variables (kinetic models); and (c) models incorporating workplace, subject, trunk motion, and measured moments variables (combined models). The results showed that peak 3D spinal loading during symmetric and asymmetric lifting were predicted equally well using all three types of regression models. Continuous 3D loading was predicted best using the combined models. When the use of such models is infeasible, the kinematic models can provide adequate predictions. Finally, lateral shear forces (peak and continuous) were consistently underestimated using all three types of models. The study demonstrated the feasibility of predicting 3D loads on the spine under specific symmetric and asymmetric lifting tasks without the need for collecting EMG information. However, further validation and development of the models should be conducted to assess and extend their applicability to lifting conditions other than those presented in this study. Actual or potential applications of this research include exposure assessment in epidemiological studies, ergonomic intervention, and laboratory task assessment.

  4. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation, for a target area, of where landslides of a particular type, volume, runout, and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and, when the information is available, temporal failure probability (i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (inventory-based and knowledge-driven methods) and (ii) quantitative approaches (data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of a factor of safety, and from it a probability of failure, under specific environmental conditions; some models can also incorporate land-use and climate change. Conversely, their major drawbacks are the large amount of reliable, detailed data required (especially material types, their thickness, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account, which is why they are often restricted to site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based slope stability model (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of the mechanical parameters is handled by a Monte Carlo approach.
The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event, and the dispersion of the distribution gives the uncertainty of the result. Finally, a map is produced displaying a probability of occurrence for each computational cell of the study area. To account for land-use change, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesia; (ii) at local scale (1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model was adapted. These studies made it possible: (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to pinpoint the locations of probable failure under different hydrological scenarios; and (iii) to test the effects of climate change and land use on slopes in two cases. In this way, future changes in temperature, precipitation, and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that an integrated approach can yield reliable information about future slope failures at different working scales and for different scenarios. The resulting information on landslide susceptibility (i.e. probability of failure) can be integrated into landslide hazard assessment and could be an essential source of information for future land planning.
As performed in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk assessment chain for different climate and economic development scenarios, to evaluate the resilience of mountainous areas.
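    The probabilistic step described above — sampling uncertain mechanical parameters and counting the fraction of realizations with a safety factor below 1 — can be sketched with a short Monte Carlo loop. For brevity this sketch uses the simple infinite-slope factor of safety rather than ALICE's Morgenstern-Price method, and all parameter values are illustrative assumptions, not the model's.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Deterministic slope geometry and hydrology (illustrative values).
    beta = np.radians(30.0)   # slope angle
    z = 2.0                   # depth of potential failure surface (m)
    gamma = 19.0              # soil unit weight (kN/m^3)
    gamma_w = 9.81            # water unit weight (kN/m^3)
    m = 0.8                   # water table ratio for the triggering scenario

    # Monte Carlo sampling of the uncertain mechanical parameters.
    N = 100_000
    c = rng.normal(5.0, 1.5, N).clip(min=0.0)    # effective cohesion (kPa)
    phi = np.radians(rng.normal(30.0, 3.0, N))   # effective friction angle

    # Infinite-slope factor of safety for each realization.
    resisting = c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    fs = resisting / driving

    # Probability of failure = fraction of realizations with FS < 1;
    # this is the per-cell value mapped over the study area.
    p_failure = np.mean(fs < 1.0)
    print(f"P(FS < 1) = {p_failure:.3f}")
    ```

    In the GIS implementation this computation runs independently for each cell, with the hydrological model supplying `m` for each triggering scenario.
    
    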

  5. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight into the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of Ecosystem Demography model v2 (ED2) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis studied only one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combined BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. 
Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.

  6. Incorporating uncertainty in predictive species distribution modelling.

    PubMed

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and sources of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been done, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where the development of new statistical tools will improve predictions from distribution models, notably hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit, and assessing the significance of model covariates.

  7. Impact of aircraft emissions on air quality in the vicinity of airports. Volume II. An updated model assessment of aircraft generated air pollution at LAX, JFK, and ORD. Final report Jan 1978-Jul 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamartino, R.J.; Smith, D.G.; Bremer, S.A.

    1980-07-01

    This report documents the results of the Federal Aviation Administration (FAA)/Environmental Protection Agency (EPA) air quality study conducted to assess the impact of aircraft emissions of carbon monoxide (CO), hydrocarbons (HC), and oxides of nitrogen (NOx) in the vicinity of airports. This assessment includes the results of recent modeling and monitoring efforts at Washington National (DCA), Los Angeles International (LAX), Dulles International (IAD), and Lakeland, Florida airports, and an updated modeling of aircraft-generated pollution at LAX, John F. Kennedy (JFK), and Chicago O'Hare (ORD) airports. The Airport Vicinity Air Pollution (AVAP) model, which was designed for use at civil airports, was used in this assessment. In addition, the results of applying the military version of the AVAP model, the Air Quality Assessment Model (AQAM), are summarized. Both the pollution monitoring analyses in Volume I and the modeling studies in Volume II suggest that: maximum hourly average CO concentrations from aircraft are unlikely to exceed 5 parts per million (ppm) in areas of public exposure and are thus small in comparison to the National Ambient Air Quality Standard of 35 ppm; maximum hourly HC concentrations from aircraft can exceed 0.25 ppm over an area several times the size of the airport; and annual average NO2 concentrations from aircraft are estimated to contribute only 10 to 20 percent of the NAAQS limit level.

  8. Nursing Assessment Tool for People With Liver Cirrhosis

    PubMed Central

    Reis, Renata Karina; da Silva, Patrícia Costa dos Santos; Silva, Ana Elisa Bauer de Camargo; Atila, Elisabeth

    2016-01-01

    The aim of this study was to describe the process of developing a nursing assessment tool for hospitalized adult patients with liver cirrhosis. A descriptive study was carried out in three stages. First, we conducted a literature review to develop a data collection tool on the basis of the Conceptual Model of Wanda Horta. Second, the data collection tool was assessed by an expert panel. Third, we conducted pilot testing in hospitalized patients. Most of the comments offered by the panel members were accepted to improve the tool. The final version took the form of a questionnaire with open- and closed-ended questions. The panel members concluded that the tool was useful for accurate nursing diagnosis. Horta's Conceptual Model guided the development of this data collection tool to help nurses make accurate nursing diagnoses in hospitalized patients with liver cirrhosis. We hope that the tool can be used by all nurses in clinical practice. PMID:26425862

  9. Tracing the source of numerical climate model uncertainties in precipitation simulations using a feature-oriented statistical model

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Jones, A. D.; Rhoades, A.

    2017-12-01

    Precipitation is a key component of the hydrologic cycle, and changing precipitation regimes contribute to more intense and frequent drought and flood events around the world. Numerical climate modeling is a powerful tool for studying climatology and predicting future changes. Despite continuous improvement in numerical models, long-term precipitation prediction remains a challenge, especially at regional scales. To improve numerical simulations of precipitation, it is important to identify where the uncertainty in precipitation simulations comes from. There are two types of uncertainty in numerical model predictions. One is related to uncertainty in the input data, such as the model's boundary and initial conditions; these uncertainties would propagate to the final model outcomes even if the numerical model exactly replicated the true world. But no numerical model replicates the true world exactly, so the other type of uncertainty is related to errors in the model physics, such as the parameterization of sub-grid-scale processes: given precise input conditions, how much error is generated by the imperfect model itself. Here, we build two statistical models based on a neural network algorithm to predict long-term variation of precipitation over California: one uses "true world" information derived from observations, and the other uses "modeled world" information from the model inputs and outputs of the North American Coordinated Regional Downscaling Experiment (NA-CORDEX). We derive multiple climate-feature metrics as predictors for the statistical model, representing the impact of global climate on local hydrology, and include topography as a predictor to represent local controls. We first compare the predictors between the true world and the modeled world to determine the errors contained in the input data. 
By perturbing the predictors in the statistical model, we estimate how much uncertainty in the model's final outcomes is accounted for by each predictor. By comparing the statistical model derived from true world information and modeled world information, we assess the errors lying in the physics of the numerical models. This work provides a unique insight to assess the performance of numerical climate models, and can be used to guide improvement of precipitation prediction.
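    The perturbation step described above can be sketched as a one-at-a-time permutation of each predictor of a fitted surrogate model, attributing the degradation in predictive skill to that input. The three predictors and the linear surrogate below are hypothetical stand-ins for the paper's climate-feature metrics and neural network.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical predictors standing in for climate-feature metrics
    # (e.g. a circulation index, a moisture-flux metric, topographic height).
    n = 500
    X = rng.normal(size=(n, 3))
    # Synthetic "precipitation" target with deliberately unequal influence:
    # predictor 0 matters most, predictor 2 not at all.
    y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, n)

    # Fit a simple surrogate (linear least squares stands in for the
    # paper's neural network).
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(M):
        return np.column_stack([np.ones(len(M)), M]) @ beta

    # One-at-a-time perturbation: shuffle each predictor and measure the
    # rise in RMSE, i.e. that predictor's share of the output uncertainty.
    base_rmse = np.sqrt(np.mean((predict(X) - y) ** 2))
    importance = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        rmse = np.sqrt(np.mean((predict(Xp) - y) ** 2))
        importance.append(rmse - base_rmse)

    print([round(v, 2) for v in importance])
    ```

    Running the same attribution on the "true world" and "modeled world" statistical models, and differencing the results, is what separates input-data error from model-physics error in the approach described above.
    
    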

  10. A Critical Evaluation of Waste Incineration Plants in Wuhan (China) Based on Site Selection, Environmental Influence, Public Health and Public Participation

    PubMed Central

    Hu, Hui; Li, Xiang; Nguyen, Anh Dung; Kavan, Philip

    2015-01-01

    With the rapid development of the waste incineration industry in China, top priority has been given to the problem of pollution caused by waste incineration. This study is the first attempt to assess all the waste incineration plants in Wuhan, the only national key city in central China, in terms of environmental impact, site selection, public health and public participation. By using a multi-criterion assessment model for economic, social, public health and environmental effects, this study indicates these incineration plants are established without much consideration of the local residents’ health and environment. A location analysis is also applied and some influences of waste incineration plants are illustrated. This study further introduces a signaling game model to prove that public participation is a necessary condition for improving the environmental impact assessment and increasing total welfare of different interest groups in China. This study finally offers some corresponding recommendations for improving the environmental impact assessments of waste incineration projects. PMID:26184242

  11. A Critical Evaluation of Waste Incineration Plants in Wuhan (China) Based on Site Selection, Environmental Influence, Public Health and Public Participation.

    PubMed

    Hu, Hui; Li, Xiang; Nguyen, Anh Dung; Kavan, Philip

    2015-07-08

    With the rapid development of the waste incineration industry in China, top priority has been given to the problem of pollution caused by waste incineration. This study is the first attempt to assess all the waste incineration plants in Wuhan, the only national key city in central China, in terms of environmental impact, site selection, public health and public participation. By using a multi-criterion assessment model for economic, social, public health and environmental effects, this study indicates these incineration plants are established without much consideration of the local residents' health and environment. A location analysis is also applied and some influences of waste incineration plants are illustrated. This study further introduces a signaling game model to prove that public participation is a necessary condition for improving the environmental impact assessment and increasing total welfare of different interest groups in China. This study finally offers some corresponding recommendations for improving the environmental impact assessments of waste incineration projects.

  12. Coupling Processes Between Atmospheric Chemistry and Climate

    NASA Technical Reports Server (NTRS)

    Ko, Malcolm; Weisenstein, Debra; Rodriquez, Jose; Danilin, Michael; Scott, Courtney; Shia, Run-Lie; Eluszkiewicz, Janusz; Sze, Nien-Dak; Stewart, Richard W. (Technical Monitor)

    1999-01-01

    This is the final report for NAS5-97039, covering work performed between December 1996 and November 1999. The overall objective of this project is to improve understanding of the coupling processes among atmospheric chemistry, aerosols, and climate, all important for quantitative assessments of global change. Among our priorities are changes in ozone and stratospheric sulfate aerosol, with emphasis on how ozone in the lower stratosphere would respond to natural or anthropogenic changes. The work emphasizes two important aspects: (1) AER's continued participation in preparing, and providing scientific input for, various scientific reports connected with the assessment of stratospheric ozone and climate, including participation in model intercomparison exercises and the preparation of national and international reports; and (2) continued development of the AER three-wave interactive model to address how the transport circulation will change as ozone and the thermal properties of the atmosphere change, and to assess how these new findings affect our confidence in the ozone assessment results.

  13. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology.

    PubMed

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice

    2017-02-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  14. Predicting stress urinary incontinence during pregnancy: combination of pelvic floor ultrasound parameters and clinical factors.

    PubMed

    Chen, Ling; Luo, Dan; Yu, Xiajuan; Jin, Mei; Cai, Wenzhi

    2018-05-12

    The aim of this study was to develop and validate a predictive tool that combines pelvic floor ultrasound parameters and clinical factors for stress urinary incontinence during pregnancy. A total of 535 women in their first or second trimester were recruited from two hospitals for an interview and transperineal ultrasound assessment. Imaging data sets were analyzed offline to assess bladder neck vertical position, urethral angles (α, β, and γ angles), hiatal area and bladder neck funneling. All continuous variables significant at univariable analysis were analyzed by receiver-operating characteristic (ROC) analysis. Three multivariable logistic models were built on clinical factors alone and in combination with ultrasound parameters. The final predictive model, with the best performance and fewest variables, was selected to establish a nomogram. Internal and external validation of the nomogram were performed, with discrimination assessed by the C-index and calibration by the Hosmer-Lemeshow test. A decision curve analysis was conducted to determine the clinical utility of the nomogram. After excluding 14 women with invalid data, 521 women were analyzed. The β angle, γ angle and hiatal area had limited predictive value for stress urinary incontinence during pregnancy, with areas under the curve of 0.558-0.648. The final predictive model included body mass index gain since pregnancy, constipation, previous delivery mode, β angle at rest, and bladder neck funneling. The nomogram based on the final model showed good discrimination, with a C-index of 0.789, and satisfactory calibration (P = 0.828), both of which were supported by external validation. Decision curve analysis showed that the nomogram was clinically useful. The nomogram incorporating both pelvic floor ultrasound parameters and clinical factors has been validated to show good discrimination and calibration, and could be an important tool for stress urinary incontinence risk prediction at an early stage of pregnancy.
This article is protected by copyright. All rights reserved.
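The modelling steps this abstract describes (a multivariable logistic model, then discrimination assessed via the C-index) can be sketched in a few lines. All coefficients and patient values below are hypothetical, chosen for illustration only and not taken from the published model:

```python
import math

def predict_sui_risk(coefs, intercept, features):
    """Logistic model: P = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

def c_index(scores, labels):
    """Concordance (C-index; equals AUC for a binary outcome): the
    fraction of case-control pairs in which the case scores higher."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical predictors mirroring the abstract: BMI gain, constipation
# (0/1), previous vaginal delivery (0/1), beta angle at rest (degrees),
# bladder-neck funneling (0/1); outcome 1 = incontinence.
coefs = [0.10, 0.8, 0.9, 0.03, 1.2]   # illustrative values
patients = [
    ([2.0, 0, 0, 110.0, 0], 0),
    ([5.5, 1, 1, 135.0, 1], 1),
    ([3.0, 0, 1, 120.0, 0], 0),
    ([6.0, 1, 0, 140.0, 1], 1),
]
scores = [predict_sui_risk(x, -6.0, features := None) if False else
          predict_sui_risk(coefs, -6.0, x) for x, _ in patients]
labels = [y for _, y in patients]
print(round(c_index(scores, labels), 3))
```

On this toy sample every case outscores every control, so the C-index is 1.0; a real cohort, like the study's, yields intermediate values such as the reported 0.789.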

  15. Evaluation of the Inhalation Carcinogenicity of Ethylene Oxide (Final Report)

    EPA Science Inventory

    EPA has finalized its Evaluation of the Inhalation Carcinogenicity of Ethylene Oxide. This assessment addresses the potential carcinogenicity from long-term inhalation exposure to ethylene oxide. Now final, this assessment updates the carcinogenicity information in EPA’s 1985 Hea...

  16. Using models in Integrated Ecosystem Assessment of coastal areas

    NASA Astrophysics Data System (ADS)

    Solidoro, Cosimo; Bandelj, Vinko; Cossarini, Gianpiero; Melaku Canu, Donata; Libralato, Simone

    2014-05-01

    Numerical models can greatly contribute to integrated ecological assessment of coastal and marine systems. Indeed, models can: i) assist in the identification of efficient sampling strategies; ii) provide spatial interpolation and temporal extrapolation of experimental data, based on the knowledge of process dynamics and causal relationships coded within the model; iii) provide estimates of indicators that are hard to measure directly. Furthermore, models can indicate the potential effects of implementing alternative management policies. Finally, by providing a synthetic representation of an ideal system, based on its essential dynamics, a model returns a picture of the ideal behaviour of a system in the absence of external perturbation, alteration and noise, which can help in the identification of a reference behaviour. As an important example, model-based reanalyses of biogeochemical and ecological properties are urgently needed for estimating environmental status and assessing the efficacy of conservation and environmental policies, also with reference to the enforcement of the European MSFD. However, the use of numerical models, and particularly of ecological models, in environmental management is still far from being the rule, possibly because the benefits that full integration of modelling and monitoring systems might provide are not appreciated, possibly because of a lack of trust in modelling results, or because many problems still exist in the development, validation and implementation of models. For instance, assessing the validity of model results is a complex process that requires the definition of appropriate indicators, metrics and methodologies, and is hampered by the scarcity of real-time in-situ biogeochemical data. Furthermore, biogeochemical models typically consider dozens of variables that are heavily undersampled. 
Here we show how the integration of mathematical models and monitoring data can support integrated ecosystem assessment of a waterbody by reviewing applications from a complex coastal ecosystem, the Lagoon of Venice, and we explore potential applications to other coastal and open-sea systems, up to the scale of the Mediterranean Sea.

  17. Final assessment of nursing students in clinical practice: Perspectives of nursing teachers, students and mentors.

    PubMed

    Helminen, Kristiina; Johnson, Martin; Isoaho, Hannu; Turunen, Hannele; Tossavainen, Kerttu

    2017-12-01

    To describe the phenomenon of final assessment of the clinical practice of nursing students and to examine whether there were differences in assessments by the students and their teachers and mentors. Final assessment of students in clinical practice during their education is of great importance for ensuring that enough high-quality nursing students are trained, as assessment tasks affect what the nursing student learns during clinical practice. This study used a descriptive, cross-sectional design. The population of this study comprised nursing students (n = 276) and their teachers (n = 108) in five universities of applied sciences in Finland, as well as mentors (n = 225) who came from five partner hospitals. A questionnaire developed for this study contained questions about background variables as well as structured questions scored on a four-point scale, which also allowed the respondents to provide additional comments. When comparing the results related to nursing teachers' presence in the final assessment situation, it was found that teachers and mentors evaluated this as being carried out more often than nursing students suggested. Nursing students noted that fair and consistent assessment is carried out more often than nursing teachers thought. Mentors and teachers said that honest and direct criteria-based final assessment was carried out more often than nursing students evaluated. Nursing students and mentors need support from educational institutions and from nursing teachers in order to ensure the completion of a relevant assessment process. The findings of this study highlight the need for a shared understanding of the final assessment process; it is desirable to agree, for example, on how the assessment should be managed and what the assessment criteria are, as this will ensure a good-quality process. © 2017 John Wiley & Sons Ltd.

  18. Progress and challenges in coupled hydrodynamic-ecological estuarine modeling.

    PubMed

    Ganju, Neil K; Brush, Mark J; Rashleigh, Brenda; Aretxabaleta, Alfredo L; Del Barrio, Pilar; Grear, Jason S; Harris, Lora A; Lake, Samuel J; McCardell, Grant; O'Donnell, James; Ralston, David K; Signell, Richard P; Testa, Jeremy M; Vaudrey, Jamie M P

    2016-03-01

    Numerical modeling has emerged over the last several decades as a widely accepted tool for investigations in environmental sciences. In estuarine research, hydrodynamic and ecological models have moved along parallel tracks with regard to complexity, refinement, computational power, and incorporation of uncertainty. Coupled hydrodynamic-ecological models have been used to assess ecosystem processes and interactions, simulate future scenarios, and evaluate remedial actions in response to eutrophication, habitat loss, and freshwater diversion. The need to couple hydrodynamic and ecological models to address research and management questions is clear, because dynamic feedbacks between biotic and physical processes are critical interactions within ecosystems. In this review we present historical and modern perspectives on estuarine hydrodynamic and ecological modeling, consider model limitations, and address aspects of model linkage, skill assessment, and complexity. We discuss the balance between spatial and temporal resolution and present examples using different spatiotemporal scales. Finally, we recommend future lines of inquiry, approaches to balance complexity and uncertainty, and model transparency and utility. It is idealistic to think we can pursue a "theory of everything" for estuarine models, but recent advances suggest that models for both scientific investigations and management applications will continue to improve in terms of realism, precision, and accuracy.

  19. Progress and challenges in coupled hydrodynamic-ecological estuarine modeling

    USGS Publications Warehouse

    Ganju, Neil K.; Brush, Mark J.; Rashleigh, Brenda; Aretxabaleta, Alfredo L.; del Barrio, Pilar; Grear, Jason S.; Harris, Lora A.; Lake, Samuel J.; McCardell, Grant; O'Donnell, James; Ralston, David K.; Signell, Richard P.; Testa, Jeremy; Vaudrey, Jamie M. P.

    2016-01-01

    Numerical modeling has emerged over the last several decades as a widely accepted tool for investigations in environmental sciences. In estuarine research, hydrodynamic and ecological models have moved along parallel tracks with regard to complexity, refinement, computational power, and incorporation of uncertainty. Coupled hydrodynamic-ecological models have been used to assess ecosystem processes and interactions, simulate future scenarios, and evaluate remedial actions in response to eutrophication, habitat loss, and freshwater diversion. The need to couple hydrodynamic and ecological models to address research and management questions is clear because dynamic feedbacks between biotic and physical processes are critical interactions within ecosystems. In this review, we present historical and modern perspectives on estuarine hydrodynamic and ecological modeling, consider model limitations, and address aspects of model linkage, skill assessment, and complexity. We discuss the balance between spatial and temporal resolution and present examples using different spatiotemporal scales. Finally, we recommend future lines of inquiry, approaches to balance complexity and uncertainty, and model transparency and utility. It is idealistic to think we can pursue a “theory of everything” for estuarine models, but recent advances suggest that models for both scientific investigations and management applications will continue to improve in terms of realism, precision, and accuracy.

  20. Progress and challenges in coupled hydrodynamic-ecological estuarine modeling

    PubMed Central

    Ganju, Neil K.; Brush, Mark J.; Rashleigh, Brenda; Aretxabaleta, Alfredo L.; del Barrio, Pilar; Grear, Jason S.; Harris, Lora A.; Lake, Samuel J.; McCardell, Grant; O’Donnell, James; Ralston, David K.; Signell, Richard P.; Testa, Jeremy M.; Vaudrey, Jamie M.P.

    2016-01-01

    Numerical modeling has emerged over the last several decades as a widely accepted tool for investigations in environmental sciences. In estuarine research, hydrodynamic and ecological models have moved along parallel tracks with regard to complexity, refinement, computational power, and incorporation of uncertainty. Coupled hydrodynamic-ecological models have been used to assess ecosystem processes and interactions, simulate future scenarios, and evaluate remedial actions in response to eutrophication, habitat loss, and freshwater diversion. The need to couple hydrodynamic and ecological models to address research and management questions is clear, because dynamic feedbacks between biotic and physical processes are critical interactions within ecosystems. In this review we present historical and modern perspectives on estuarine hydrodynamic and ecological modeling, consider model limitations, and address aspects of model linkage, skill assessment, and complexity. We discuss the balance between spatial and temporal resolution and present examples using different spatiotemporal scales. Finally, we recommend future lines of inquiry, approaches to balance complexity and uncertainty, and model transparency and utility. It is idealistic to think we can pursue a “theory of everything” for estuarine models, but recent advances suggest that models for both scientific investigations and management applications will continue to improve in terms of realism, precision, and accuracy. PMID:27721675

  1. Cognitive and affective influences on perceived risk of ovarian cancer.

    PubMed

    Peipins, Lucy A; McCarty, Frances; Hawkins, Nikki A; Rodriguez, Juan L; Scholl, Lawrence E; Leadbetter, Steven

    2015-03-01

    Studies suggest that both affective and cognitive processes are involved in the perception of vulnerability to cancer and that affect has an early influence in this assessment of risk. We constructed a path model based on a conceptual framework of heuristic reasoning (affect, resemblance, and availability) coupled with cognitive processes involved in developing personal models of cancer causation. From an eligible cohort of 16 700 women in a managed care organization, we randomly selected 2524 women at high, elevated, and average risk of ovarian cancer and administered a questionnaire to test our model (response rate 76.3%). Path analysis delineated the relationships between personal and cognitive characteristics (number of relatives with cancer, age, ideas about cancer causation, perceived resemblance to an affected friend or relative, and ovarian cancer knowledge) and emotional constructs (closeness to an affected relative or friend, time spent processing the cancer experience, and cancer worry) on perceived risk of ovarian cancer. Our final model fit the data well (root mean square error of approximation (RMSEA) = 0.028, comparative fit index (CFI) = 0.99, normed fit index (NFI) = 0.98). This final model (1) demonstrated the nature and direction of relationships between cognitive characteristics and perceived risk; (2) showed that time spent processing the cancer experience was associated with cancer worry; and (3) showed that cancer worry moderately influenced perceived risk. Our results highlight the important role that family cancer experience has on cancer worry and show how cancer experience translates into personal risk perceptions. This understanding helps explain the discordance between medical or objective risk assessment and personal risk assessment. Published in 2014. This article is a U.S. Government work and is in the public domain in the USA.
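The fit indices reported for the path model (RMSEA, CFI, NFI) are all simple functions of the fitted and baseline chi-square statistics. A minimal sketch of the standard formulas, using hypothetical chi-square values rather than those of the study:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2, df, chi2_base, df_base):
    """Comparative fit index, relative to the baseline (independence) model."""
    d_model = max(chi2 - df, 0.0)
    d_base = max(chi2_base - df_base, d_model)
    return 1.0 - d_model / d_base if d_base > 0 else 1.0

def nfi(chi2, chi2_base):
    """Normed fit index."""
    return (chi2_base - chi2) / chi2_base

# Hypothetical chi-square statistics for a fitted vs. baseline model;
# the sample size is also illustrative.
chi2, df, n = 30.0, 25.0, 1926
chi2_b, df_b = 1500.0, 36.0
print(round(rmsea(chi2, df, n), 3),
      round(cfi(chi2, df, chi2_b, df_b), 3),
      round(nfi(chi2, chi2_b), 3))
```

Values near the study's (RMSEA close to 0, CFI and NFI close to 1) indicate a model that reproduces the observed covariance structure well.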

  2. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. 
The model framework can be applied to various aircraft scenarios, including parallel and in-trail approaches. This research was performed under contract to NASA and in cooperation with the FAA's Safety Division (ASY).
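The layered failsafe structure the abstract describes can be illustrated with a simple event-tree calculation: residual accident risk is the initiating-event rate multiplied by the miss probability of each successive intervention layer. The rate and layer probabilities below are hypothetical, not values from the study:

```python
def residual_risk(initiating_rate, layers):
    """Accident risk after a chain of independent failsafe layers: each
    layer intercepts the event with some probability, so the residual
    risk is the initiating rate times the product of the layers' miss
    probabilities. A simplification of the framework described above."""
    risk = initiating_rate
    for _name, p_intercept in layers:
        risk *= (1.0 - p_intercept)
    return risk

# Hypothetical per-approach blunder rate and intervention layers
layers = [
    ("surveillance alert", 0.95),
    ("controller intervention", 0.90),
    ("pilot evasive maneuver", 0.80),
]
print(f"{residual_risk(1e-4, layers):.1e}")
```

The independence assumption is the strong one here; the actual model links layers through procedures and interventions rather than treating them as independent.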

  3. Assessing tomorrow's learners: in competency-based education only a radically different holistic method of assessment will work. Six things we could forget.

    PubMed

    Schuwirth, Lambert; Ash, Julie

    2013-07-01

    In this paper we are challenging six traditional notions about assessment that are unhelpful when designing 'assessment for learning'-programmes for competency-based education. We are arguing for the following: Reductionism is not the only way to assure rigour in high-stakes assessment; holistic judgements can be equally rigorous. Combining results of assessment parts only because they are of the same format (like different stations in an OSCE) is often not defensible; instead there must be a logically justifiable combination. Numbers describe the quality of the assessment. Therefore, manipulating the numbers is usually not the best way to improve its quality. Not every assessment moment needs to be a decision moment, disconnecting both makes combining summative and formative functions of assessment easier. Standardisation is not the only route to equity. Especially with diverse student groups tailoring is more equitable than standardisation. The most important element to standardise is the quality of the process and not the process itself. Finally, most assessment is too much focussed on detecting deficiencies and not on valuing individual student differences. In competency-based education--especially with a focus on learner orientation--this 'deficiency-model' is not as well aligned as a 'differences-model'.

  4. Use of antimüllerian hormone to predict the menopausal transition in HIV-infected women

    PubMed Central

    Scherzer, Rebecca; Greenblatt, Ruth M.; Merhi, Zaher O.; Kassaye, Seble; Lambert-Messerlian, Geralyn; Maki, Pauline M.; Murphy, Kerry; Karim, Roksana; Bacchetti, Peter

    2016-01-01

    BACKGROUND HIV infection has been associated with early menopausal onset, which may have adverse long-term health consequences. Antimüllerian hormone, a biomarker of ovarian reserve and gonadal aging, is reduced in HIV-infected women. OBJECTIVE We sought to assess the relationship of antimüllerian hormone to age of menopause onset in HIV-infected women. STUDY DESIGN We used antimüllerian hormone levels measured in plasma in 2461 HIV-infected participants from the Women’s Interagency HIV Study to model the age at final menstrual period. Multivariable normal mixture models for censored data were used to identify factors associated with age at final menstrual period. RESULTS Higher antimüllerian hormone at age 40 years was associated with later age at final menstrual period, even after multivariable adjustment for smoking, CD4 cell count, plasma HIV RNA, hepatitis C infection, and history of clinical AIDS. Each doubling of antimüllerian hormone was associated with a 1.5-year increase in the age at final menstrual period. Median age at final menstrual period ranged from 45 years for those in the 10th percentile of antimüllerian hormone to 52 years for those in the 90th percentile. Other factors independently associated with earlier age at final menstrual period included smoking, hepatitis C infection, higher HIV RNA levels, and history of clinical AIDS. CONCLUSION Antimüllerian hormone is highly predictive of age at final menstrual period in HIV-infected women. Measuring antimüllerian hormone in HIV-infected women may enable clinicians to predict risk of early menopause, and potentially implement individualized treatment plans to prevent menopause-related comorbidities and to aid in interpretation of symptoms. PMID:27473002
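The reported dose-response (each doubling of antimüllerian hormone associated with a final menstrual period about 1.5 years later) implies a simple log2 relationship. A sketch, where the reference anchors `amh_ref` and `age_ref` are hypothetical illustration values, not figures from the study:

```python
import math

def predicted_fmp_age(amh, amh_ref=1.0, age_ref=48.0, years_per_doubling=1.5):
    """Predicted age at final menstrual period: each doubling of AMH
    relative to the reference shifts the prediction by ~1.5 years
    (per the abstract). Reference values are illustrative anchors."""
    return age_ref + years_per_doubling * math.log2(amh / amh_ref)

# A woman with AMH four times the reference (two doublings) is predicted
# to reach her final menstrual period three years later than the anchor.
print(round(predicted_fmp_age(4.0), 1))
```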

  5. Methods applied in cost-effectiveness models for treatment strategies in type 2 diabetes mellitus and their use in Health Technology Assessments: a systematic review of the literature from 2008 to 2013.

    PubMed

    Charokopou, M; Sabater, F J; Townsend, R; Roudaut, M; McEwan, P; Verheggen, B G

    2016-01-01

    To identify and compare health-economic models that were developed to evaluate the cost-effectiveness of treatments for type 2 diabetes mellitus (T2DM), and their use within Health Technology Assessments (HTAs). In total, six commonly used databases were searched for articles published between October 2008 and January 2013, using a protocolized search strategy and inclusion criteria. The websites of HTA organizations in nine countries, and proceedings from five relevant conferences, were also reviewed. The identified new health-economic models were qualitatively assessed using six criteria that were developed based on technical components, and characteristics related to the disease or the treatments being assessed. Finally, the number of times the models were applied within HTA reports, published literature, and/or major conferences was determined. Thirteen new models were identified and reviewed in depth. Most of these were based on identical key data sources, and applied a similar model structure, either using Markov modeling or microsimulation techniques. The UKPDS equations and panel regressions were frequently used to estimate the occurrence of diabetes-related complications and the probability of developing risk factors in the long term. The qualitative assessment demonstrated that the CARDIFF, Sheffield T2DM and ECHO T2DM models seem technically equipped to appropriately assess the long-term health-economic consequences of chronic treatments for patients with T2DM. It was observed that the CORE model is the most widely described in literature and conferences, and the most often applied model within HTA submissions, followed by the CARDIFF and UKPDS models. This research provides an overview of T2DM models that were developed between 2008 and January 2013. 
The outcomes of the qualitative assessments, combined with frequent use in local reimbursement decisions, demonstrate the applicability of the CORE, CARDIFF and UKPDS models to address decision problems related to the long-term clinical and economic consequences of new and existing T2DM treatments.

  6. Setting up virgin stress conditions in discrete element models.

    PubMed

    Rojek, J; Karlis, G F; Malinowski, L J; Beer, G

    2013-03-01

    In the present work, a methodology for setting up virgin stress conditions in discrete element models is proposed. The developed algorithm is applicable to discrete or coupled discrete/continuum modeling of underground excavation employing the discrete element method (DEM). Since the DEM works with contact forces rather than stresses there is a need for the conversion of pre-excavation stresses to contact forces for the DEM model. Different possibilities of setting up virgin stress conditions in the DEM model are reviewed and critically assessed. Finally, a new method to obtain a discrete element model with contact forces equivalent to given macroscopic virgin stresses is proposed. The test examples presented show that good results may be obtained regardless of the shape of the DEM domain.
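The conversion the abstract describes, from macroscopic virgin stresses to DEM contact forces, rests on the Cauchy relation t = σ·n: the traction on a contact plane, multiplied by the tributary contact area, gives an equivalent contact force. A minimal sketch of that single step, not the authors' full algorithm:

```python
def contact_force(sigma, normal, area):
    """Equivalent contact force from a macroscopic stress tensor.
    sigma: 3x3 stress tensor (list of rows, Pa); normal: unit normal
    of the contact plane; area: tributary contact area (m^2)."""
    # Cauchy traction t_i = sum_j sigma_ij * n_j
    traction = [sum(sigma[i][j] * normal[j] for j in range(3))
                for i in range(3)]
    return [t * area for t in traction]

# Hydrostatic virgin stress of -10 MPa (compression negative), in Pa,
# acting on a contact with unit normal along z and a 1 cm^2 area.
sigma = [[-10e6, 0.0, 0.0],
         [0.0, -10e6, 0.0],
         [0.0, 0.0, -10e6]]
f = contact_force(sigma, [0.0, 0.0, 1.0], area=1e-4)
print(f)  # a ~1 kN compressive normal force along z
```

In a full DEM preprocessing step this conversion would be applied per contact, with the tributary areas chosen so the assembly of contact forces reproduces the prescribed stress field.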

  7. Setting up virgin stress conditions in discrete element models

    PubMed Central

    Rojek, J.; Karlis, G.F.; Malinowski, L.J.; Beer, G.

    2013-01-01

    In the present work, a methodology for setting up virgin stress conditions in discrete element models is proposed. The developed algorithm is applicable to discrete or coupled discrete/continuum modeling of underground excavation employing the discrete element method (DEM). Since the DEM works with contact forces rather than stresses there is a need for the conversion of pre-excavation stresses to contact forces for the DEM model. Different possibilities of setting up virgin stress conditions in the DEM model are reviewed and critically assessed. Finally, a new method to obtain a discrete element model with contact forces equivalent to given macroscopic virgin stresses is proposed. The test examples presented show that good results may be obtained regardless of the shape of the DEM domain. PMID:27087731

  8. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota

    PubMed Central

    Kail, Jochem; Guse, Björn; Radinger, Johannes; Schröder, Maria; Kiesel, Jens; Kleinhans, Maarten; Schuurman, Filip; Fohrer, Nicola; Hering, Daniel; Wolter, Christian

    2015-01-01

    River biota are affected by global reach-scale pressures, but most approaches for predicting biota of rivers focus on river reach or segment scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and to finally compare habitat suitability and the availability / ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates. 
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact research as well as river restoration and management. PMID:26114430

  9. Development of a microbial contamination susceptibility model for private domestic groundwater sources

    NASA Astrophysics Data System (ADS)

    Hynds, Paul D.; Misstear, Bruce D.; Gill, Laurence W.

    2012-12-01

    Groundwater quality analyses were carried out on samples from 262 private sources in the Republic of Ireland during the period from April 2008 to November 2010, with microbial quality assessed by thermotolerant coliform (TTC) presence. Assessment of potential microbial contamination risk factors was undertaken at all sources, and local meteorological data were also acquired. Overall, 28.9% of wells tested positive for TTC, with risk analysis indicating that source type (i.e., borehole or hand-dug well), local bedrock type, local subsoil type, groundwater vulnerability, septic tank setback distance, and 48 h antecedent precipitation were all significantly associated with TTC presence (p < 0.05). A number of source-specific design parameters were also significantly associated with bacterial presence. Hierarchical logistic regression with stepwise parameter entry was used to develop a private well susceptibility model, with the final model exhibiting a mean predictive accuracy of >80% (TTC present or absent) when compared to an independent validation data set. Model hierarchies of primary significance are source design (20%), septic tank location (11%), hydrogeological setting (10%), and antecedent 120 h precipitation (2%). Sensitivity analysis shows that the probability of contamination is highly sensitive to septic tank setback distance, with probability increasing linearly with decreases in setback distance. Likewise, contamination probability was shown to increase with increasing antecedent precipitation. Results show that while groundwater vulnerability category is a useful indicator of aquifer susceptibility to contamination, its suitability with regard to source contamination is less clear. The final model illustrates that both localized (well-specific) and generalized (aquifer-specific) contamination mechanisms are involved in contamination events, with localized bypass mechanisms dominant. 
The susceptibility model developed here could be employed in the appropriate location, design, construction, and operation of private groundwater wells, thereby decreasing the contamination risk, and hence health risk, associated with these sources.
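The kind of susceptibility model described above can be illustrated with a minimal sketch of binary logistic regression fitted by gradient descent. The predictors, data, and coefficients below are hypothetical stand-ins, not the authors' fitted hierarchical model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    """Predicted probability of contamination for one well."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Hypothetical binary predictors per well:
# [short septic-tank setback, wet 48 h antecedent period]
X = [[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0], [1, 0], [0, 1]]
y = [1, 1, 1, 0, 1, 0, 0, 0]    # 1 = thermotolerant coliforms detected
w = fit_logistic(X, y)
risk_near_wet = predict(w, [1, 1])   # short setback, high antecedent rainfall
risk_far_dry = predict(w, [0, 0])    # long setback, dry antecedent period
print(round(risk_near_wet, 2), round(risk_far_dry, 2))
```

Consistent with the abstract's sensitivity analysis, the fitted sketch assigns a higher contamination probability to the short-setback, wet-antecedent well than to the long-setback, dry one.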

  10. Health-related quality-of-life parameters as independent prognostic factors in advanced or metastatic bladder cancer.

    PubMed

    Roychowdhury, D F; Hayden, A; Liepa, A M

    2003-02-15

    This retrospective analysis examined prognostic significance of health-related quality-of-life (HRQoL) parameters combined with baseline clinical factors on outcomes (overall survival, time to progressive disease, and time to treatment failure) in bladder cancer. Outcome and HRQoL (European Organization for Research and Treatment of Cancer Quality of Life Questionnaire C30) data were collected prospectively in a phase III study assessing gemcitabine and cisplatin versus methotrexate, vinblastine, doxorubicin, and cisplatin in locally advanced or metastatic bladder cancer. Prespecified baseline clinical factors (performance status, tumor-node-metastasis staging, visceral metastases [VM], alkaline phosphatase [AP] level, number of metastatic sites, prior radiotherapy, disease measurability, sex, time from diagnosis, and sites of disease) and selected HRQoL parameters (global QoL; all functional scales; symptoms: pain, fatigue, insomnia, dyspnea, anorexia) were evaluated using Cox's proportional hazards model. Factors with individual prognostic value (P <.05) on outcomes in univariate models were assessed for joint prognostic value in a multivariate model. A final model was developed using a backward selection strategy. Patients with baseline HRQoL were included (364 of 405, 90%). The final model predicted longer survival with low/normal AP levels, no VM, high physical functioning, low role functioning, and no anorexia. Positive prognostic factors for time to progressive disease were good performance status, low/normal AP levels, no VM, and minimal fatigue; for time to treatment failure, they were low/normal AP levels, minimal fatigue, and no anorexia. Global QoL was a significant predictor of outcome in univariate analyses but was not retained in the multivariate model. HRQoL parameters are independent prognostic factors for outcome in advanced bladder cancer; their prognostic importance needs further evaluation.

  11. SEADS 3.0 Sectoral Energy/Employment Analysis and Data System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roop, Joseph M.; Anderson, David A.; Schultz, Robert W.

    2007-12-17

SEADS 3.0, the Sectoral Energy/Employment Analysis and Data System, is a revision and upgrading of SEADS-PC, a software package designed for the analysis of policy that can be described by modifying the final demands of consumers, businesses, or governments (Roop, et al., 1995). If a question can be formulated so that its implications can be translated into changes in final demands for goods and services, then SEADS 3.0 provides a quick and easy tool to assess preliminary impacts. And SEADS 3.0 should be considered just that: a quick and easy way to get preliminary results. Often a thorough answer, even to such a simple question as, “What would be the effect on U.S. energy use and employment if the Federal Government doubled R&D expenditures?” requires a more sophisticated analytical framework than the input-output structure embedded in SEADS 3.0. This tool uses a static input-output model to assess the impacts of changes in final demands, first on industry output and then on employment and energy use. The employment and energy impacts are derived by multiplying the industry outputs (derived from the changed final demands) by industry-specific energy and employment coefficients. The tool also allows for the specification of regional or state employment impacts, though this option is not available for energy impacts.
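The static input-output calculation described above solves x = Ax + f for total industry output x given final demands f, then multiplies by per-unit coefficients. A minimal two-sector sketch, with an illustrative A matrix, demands, and coefficients that are not SEADS data:

```python
def solve_2x2(A, f):
    """Solve (I - A) x = f for a two-sector economy by direct inversion."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    return [(d * f[0] - b * f[1]) / det, (a * f[1] - c * f[0]) / det]

A = [[0.2, 0.3],          # interindustry requirements per unit of output
     [0.1, 0.4]]
energy_coef = [0.5, 1.2]  # TJ per $M of output (illustrative)
jobs_coef = [8.0, 5.0]    # jobs per $M of output (illustrative)

baseline = [100.0, 50.0]  # final demands ($M)
scenario = [120.0, 50.0]  # policy shock: sector-0 final demand raised 20%

for label, f in [("baseline", baseline), ("scenario", scenario)]:
    x = solve_2x2(A, f)                                   # total industry output
    energy = sum(e * xi for e, xi in zip(energy_coef, x))
    jobs = sum(j * xi for j, xi in zip(jobs_coef, x))
    print(label, [round(v, 1) for v in x], round(energy, 1), round(jobs, 1))
```

Because the Leontief inverse propagates the demand shock through interindustry purchases, both sectors' outputs, and hence energy use and employment, rise even though only sector 0's final demand changed.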

  12. A novel multisensor traffic state assessment system based on incomplete data.

    PubMed

    Zeng, Yiliang; Lan, Jinhui; Ran, Bin; Jiang, Yaoliang

    2014-01-01

A novel multisensor system with incomplete data is presented for traffic state assessment. The system comprises probe vehicle detection sensors, fixed detection sensors, and a traffic state assessment algorithm. First, validity checking of the traffic flow data is performed as preprocessing. A new method based on historical data is then proposed to fuse and recover the incomplete data. Exploiting the spatial complementarity of the data from the probe vehicle detectors and fixed detectors, a space-matching fusion model is presented to estimate the mean travel speed of the road. Finally, the traffic flow data (flow, speed, and occupancy rate) detected between the Deshengmen and Drum Tower bridges in Beijing are fused to assess the traffic state of the road using a fusion decision model based on rough sets and the cloud model. The experimental accuracy exceeds 98%, and the results accord with the actual road traffic state. The system is effective for traffic state assessment and is suitable for urban intelligent transportation systems.

  13. A Novel Multisensor Traffic State Assessment System Based on Incomplete Data

    PubMed Central

    Zeng, Yiliang; Lan, Jinhui; Ran, Bin; Jiang, Yaoliang

    2014-01-01

A novel multisensor system with incomplete data is presented for traffic state assessment. The system comprises probe vehicle detection sensors, fixed detection sensors, and a traffic state assessment algorithm. First, validity checking of the traffic flow data is performed as preprocessing. A new method based on historical data is then proposed to fuse and recover the incomplete data. Exploiting the spatial complementarity of the data from the probe vehicle detectors and fixed detectors, a space-matching fusion model is presented to estimate the mean travel speed of the road. Finally, the traffic flow data (flow, speed, and occupancy rate) detected between the Deshengmen and Drum Tower bridges in Beijing are fused to assess the traffic state of the road using a fusion decision model based on rough sets and the cloud model. The experimental accuracy exceeds 98%, and the results accord with the actual road traffic state. The system is effective for traffic state assessment and is suitable for urban intelligent transportation systems. PMID:25162055

  14. Assessing the system value of optimal load shifting

    DOE PAGES

    Merrick, James; Ye, Yinyu; Entriken, Bob

    2017-04-30

We analyze a competitive electricity market, where consumers exhibit optimal load shifting behavior to maximize utility and producers/suppliers maximize their profit under supply capacity constraints. The associated computationally tractable formulation can be used to inform market design or policy analysis in the context of increasing availability of the smart grid technologies that enable optimal load shifting. Through analytic and numeric assessment of the model, we assess the equilibrium value of optimal electricity load shifting, including how the value changes as more electricity consumers adopt associated technologies. For our illustrative numerical case, derived from the Current Trends scenario of the ERCOT Long Term System Assessment, the average energy arbitrage value per ERCOT customer of optimal load shifting technologies is estimated to be $3 for the 2031 scenario year. We assess the sensitivity of this result to the flexibility of load, along with its relationship to the deployment of renewables. Finally, the model presented can also be a starting point for designing system operation infrastructure that communicates with the devices that schedule loads in response to price signals.

  15. Assessing the system value of optimal load shifting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrick, James; Ye, Yinyu; Entriken, Bob

We analyze a competitive electricity market, where consumers exhibit optimal load shifting behavior to maximize utility and producers/suppliers maximize their profit under supply capacity constraints. The associated computationally tractable formulation can be used to inform market design or policy analysis in the context of increasing availability of the smart grid technologies that enable optimal load shifting. Through analytic and numeric assessment of the model, we assess the equilibrium value of optimal electricity load shifting, including how the value changes as more electricity consumers adopt associated technologies. For our illustrative numerical case, derived from the Current Trends scenario of the ERCOT Long Term System Assessment, the average energy arbitrage value per ERCOT customer of optimal load shifting technologies is estimated to be $3 for the 2031 scenario year. We assess the sensitivity of this result to the flexibility of load, along with its relationship to the deployment of renewables. Finally, the model presented can also be a starting point for designing system operation infrastructure that communicates with the devices that schedule loads in response to price signals.

  16. Vulnerability Assessment of Water Supply Systems: Status, Gaps and Opportunities

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2015-12-01

    Conventional frameworks for assessing the impacts of climate change on water resource systems use cascades of climate and hydrological models to provide 'top-down' projections of future water availability, but these are subject to high uncertainty and are model and scenario-specific. Hence there has been recent interest in 'bottom-up' frameworks, which aim to evaluate system vulnerability to change in the context of possible future climate and/or hydrological conditions. Such vulnerability assessments are generic, and can be combined with updated information from top-down assessments as they become available. While some vulnerability methods use hydrological models to estimate water availability, fully bottom-up schemes have recently been proposed that directly map system vulnerability as a function of feasible changes in water supply characteristics. These use stochastic algorithms, based on reconstruction or reshuffling methods, by which multiple water supply realizations can be generated under feasible ranges of change in water supply conditions. The paper reports recent successes, and points to areas of future improvement. Advances in stochastic modeling and optimization can address some technical limitations in flow reconstruction, while various data mining and system identification techniques can provide possibilities to better condition realizations for consistency with top-down scenarios. Finally, we show that probabilistic and Bayesian frameworks together can provide a potential basis to combine information obtained from fully bottom-up analyses with projections available from climate and/or hydrological models in a fully integrated risk assessment framework for deep uncertainty.

  17. Developmental Trajectories of Religiosity, Sexual Conservatism and Sexual Behavior among Female Adolescents

    PubMed Central

    Aalsma, Matthew C.; Woodrome, Stacy E.; Downs, Sarah M.; Hensel, Devon; Zimet, Gregory D.; Orr, Don P.; Fortenberry, J. Dennis

    2013-01-01

    Understanding the role of socio-sexual cognitions and religiosity on adolescent sexual behavior could guide adolescent sexual health efforts. The present study utilized longitudinal data from 328 young women to assess the role of religion and socio-sexual cognitions on sexual behavior accrual (measuring both coital and non-coital sexual behavior). In the final triple conditional trajectory structural equation model, religiosity declined over time and then increased to baseline levels. Additionally, religiosity predicted decreased sexual conservatism and decreased sexual conservatism predicted increased sexual behavior. The final models are indicative of young women's increasing accrual of sexual experience, decreasing sexual conservatism and initial decreasing religiosity. The results of this study suggest that decreased religiosity affects the accrual of sexual experience through decreased sexual conservatism. Effective strategies of sexual health promotion should include an understanding of the complex role of socio-sexual attitudes with religiosity. PMID:24215966

  18. Incorporating User Input in Template-Based Segmentation

    PubMed Central

    Vidal, Camille; Beggs, Dale; Younes, Laurent; Jain, Sanjay K.; Jedynak, Bruno

    2015-01-01

    We present a simple and elegant method to incorporate user input in a template-based segmentation method for diseased organs. The user provides a partial segmentation of the organ of interest, which is used to guide the template towards its target. The user also highlights some elements of the background that should be excluded from the final segmentation. We derive by likelihood maximization a registration algorithm from a simple statistical image model in which the user labels are modeled as Bernoulli random variables. The resulting registration algorithm minimizes the sum of square differences between the binary template and the user labels, while preventing the template from shrinking, and penalizing for the inclusion of background elements into the final segmentation. We assess the performance of the proposed algorithm on synthetic images in which the amount of user annotation is controlled. We demonstrate our algorithm on the segmentation of the lungs of Mycobacterium tuberculosis infected mice from μCT images. PMID:26146532
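As a toy 1-D analogue of the label-guided matching described above, a binary "organ mask" can be aligned to sparse user labels by minimizing the sum of squared differences at the labeled positions, with an extra penalty for covering background labels. The shift-only search and penalty form are simplifications of the paper's full registration, and all names and values are hypothetical:

```python
def cost(template, labels, shift, beta=2.0):
    """SSD between the shifted binary template and sparse user labels,
    plus a penalty (weight beta) for including labeled background."""
    total = 0.0
    n = len(template)
    for pos, lab in labels.items():
        src = pos - shift
        t = template[src] if 0 <= src < n else 0
        total += (t - lab) ** 2       # SSD term against the user label
        if lab == 0 and t == 1:
            total += beta             # background wrongly inside the mask
    return total

template = [0, 1, 1, 1, 0, 0, 0, 0]   # initial organ mask
labels = {4: 1, 5: 1, 1: 0}           # user: organ at 4-5, background at 1
best = min(range(-4, 5), key=lambda s: cost(template, labels, s))
print(best)   # the mask slides right to cover the labeled organ positions
```

The brute-force search over integer shifts stands in for the gradient-based registration derived by likelihood maximization in the paper; the Bernoulli label model is what makes the SSD the natural data term.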

  19. Assessment of sustainable urban transport development based on entropy and unascertained measure.

    PubMed

    Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie

    2017-01-01

To find a more effective method for assessing sustainable urban transport development, a comprehensive assessment model was established based on the unascertained measure. Considering the factors influencing urban transport development, comprehensive assessment indexes were selected, including urban economic development, transport demand, environment quality and energy consumption, and an assessment system for sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. The grade was then obtained using the credible-degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. Application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.
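The entropy weight coefficient method mentioned above assigns larger weights to indexes whose values differ more across the assessed alternatives, since uniform indexes carry no discriminating information. A minimal sketch with illustrative normalized scores (not from the paper):

```python
import math

def entropy_weights(data):
    """Entropy weight method: weight each index (column) by how much its
    values vary across the alternatives (rows)."""
    m, n = len(data), len(data[0])
    k = 1.0 / math.log(m)
    weights = []
    for j in range(n):
        col = [row[j] for row in data]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy of index j
        weights.append(1.0 - e)                                # divergence degree
    s = sum(weights)
    return [wj / s for wj in weights]

# Hypothetical normalized scores for 4 cities on 3 indexes
# (economy, transport demand, environment); values are illustrative only.
scores = [[0.9, 0.2, 0.5],
          [0.8, 0.9, 0.5],
          [0.7, 0.3, 0.5],
          [0.6, 0.8, 0.5]]
w = entropy_weights(scores)
print([round(wi, 3) for wi in w])  # the constant third index gets ~0 weight
```

Because the third index is identical for every city, its entropy is maximal and its weight collapses to zero, while the most dispersed index (transport demand) dominates.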

  20. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the will of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we built an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods on the collected data, the final optimal structure of the novel model was formed. Special attention was paid to the validity and quality of the novel model. The online survey collected a total of 384 cases. The results indicate both the quality of the assessed software and the quality in use of the novel model; the strong ergonomic orientation of the measurement model was particularly emphasised. The resulting model is validated in multiple ways, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.

  1. Psychometrics of the preschooler physical activity parenting practices instrument among a Latino sample

    PubMed Central

    2014-01-01

Background Latino preschoolers (3-5 year old children) have among the highest rates of obesity. Low levels of physical activity (PA) are a risk factor for obesity. Characterizing what Latino parents do to encourage or discourage their preschooler to be physically active can help inform interventions to increase their PA. The objective was therefore to develop and assess the psychometrics of a new instrument, the Preschooler Physical Activity Parenting Practices (PPAPP), among a Latino sample, to assess parenting practices used to encourage or discourage PA among preschool-aged children. Methods Cross-sectional study of 240 Latino parents who reported the frequency of using PA parenting practices. 95% of respondents were mothers; 42% had more than a high school education. Child mean age was 4.5 (±0.9) years (52% male). Test-retest reliability was assessed in 20% of the sample 2 weeks later. We assessed the fit of a priori models using confirmatory factor analysis (CFA). In a separate sub-sample (35%), preschool-aged children wore accelerometers to assess associations between their PA and PPAPP subscales. Results The a priori models showed poor fit to the data. A modified factor structure for encouraging PPAPP had one multiple-item scale, engagement (15 items), and two single items (have outdoor toys; not enroll in sport, reverse coded). The final factor structure for discouraging PPAPP had 4 subscales: promote inactive transport (3 items), promote screen time (3 items), psychological control (4 items) and restricting for safety (4 items). Test-retest reliability (ICC) for the two scales ranged from 0.56 to 0.85. Cronbach’s alphas ranged from 0.5 to 0.9. Several sub-factors correlated in the expected direction with children’s objectively measured PA. Conclusion The final models for encouraging and discouraging PPAPP had moderate to good fit, with moderate to excellent test-retest reliabilities. 
The PPAPP should be further evaluated to better assess its associations with children’s PA and offers a new tool for measuring PPAPP among Latino families with preschool-aged children. PMID:24428935

  2. Development of a risk-based environmental management tool for drilling discharges. Summary of a four-year project.

    PubMed

    Singsaas, Ivar; Rye, Henrik; Frost, Tone Karin; Smit, Mathijs G D; Garpestad, Eimund; Skare, Ingvild; Bakke, Knut; Veiga, Leticia Falcao; Buffagni, Melania; Follum, Odd-Arne; Johnsen, Ståle; Moltu, Ulf-Einar; Reed, Mark

    2008-04-01

    This paper briefly summarizes the ERMS project and presents the developed model by showing results from environmental fates and risk calculations of a discharge from offshore drilling operations. The developed model calculates environmental risks for the water column and sediments resulting from exposure to toxic stressors (e.g., chemicals) and nontoxic stressors (e.g., suspended particles, sediment burial). The approach is based on existing risk assessment techniques described in the European Union technical guidance document on risk assessment and species sensitivity distributions. The model calculates an environmental impact factor, which characterizes the overall potential impact on the marine environment in terms of potentially impacted water volume and sediment area. The ERMS project started in 2003 and was finalized in 2007. In total, 28 scientific reports and 9 scientific papers have been delivered from the ERMS project (http://www.sintef.no/erms).

  3. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact to ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, as well as making operational requirements for ISS orbital orientation, planning Extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will discuss future analysis topics such as life extension, requirements of new commercial vehicles visiting ISS.

  4. Prototype integration of the joint munitions assessment and planning model with the OSD threat methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynn, R.Y.S.; Bolmarcich, J.J.

The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.

  5. Automated antibody structure prediction using Accelrys tools: Results and best practices

    PubMed Central

    Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa

    2014-01-01

    We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions either using a single, chimeric or multiple template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the models submitted show that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models show that the models are of quite high quality, with local geometry assessment scores similar to that of the target X-ray structures. Proteins 2014; 82:1583–1598. © 2014 The Authors. Proteins published by Wiley Periodicals, Inc. PMID:24833271

  6. Clicker Score Trajectories and Concept Inventory Scores as Predictors for Early Warning Systems for Large STEM Classes

    NASA Astrophysics Data System (ADS)

    Lee, Un Jung; Sbeglia, Gena C.; Ha, Minsu; Finch, Stephen J.; Nehm, Ross H.

    2015-12-01

    Increasing the retention of STEM (science, technology, engineering, and mathematics) majors has recently emerged as a national priority in undergraduate education. Since poor performance in large introductory science and math courses is one significant factor in STEM dropout, early detection of struggling students is needed. Technology-supported "early warning systems" (EWSs) are being developed to meet these needs. Our study explores the utility of two commonly collected data sources—pre-course concept inventory scores and longitudinal clicker scores—for use in EWS, specifically, in determining the time points at which robust predictions of student success can first be established. The pre-course diagnostic assessments, administered to 287 students, included two concept inventories and one attitude assessment. Clicker question scores were also obtained for each of the 37 class sessions. Additionally, student characteristics (sex, ethnicity, and English facility) were gathered in a survey. Our analyses revealed that all variables were predictive of final grades. The correlation of the first 3 weeks of clicker scores with final grades was 0.53, suggesting that this set of variables could be used in an EWS starting at the third week. We also used group-based trajectory models to assess whether trajectory patterns were homogeneous in the class. The trajectory analysis identified three distinct clicker performance patterns that were also significant predictors of final grade. Trajectory analyses of clicker scores, student characteristics, and pre-course diagnostic assessment appear to be valuable data sources for EWS, although further studies in a diversity of instructional contexts are warranted.

  7. Instability risk assessment of construction waste pile slope based on fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Ma, Yong; Xing, Huige; Yang, Mao; Nie, Tingting

    2018-05-01

Considering the nature and characteristics of construction waste piles, this paper analyzed the factors affecting the stability of construction waste pile slopes and established a system of assessment indexes for their failure risks. Based on the basic principles and methods of fuzzy mathematics, the factor set and the remark set were established. The membership grades of continuous factor indexes were determined using the "ridge row distribution" function, while those of the discrete factor indexes were determined by the Delphi method. For the factor weights, the subjective weights were determined by the Analytic Hierarchy Process (AHP) and the objective weights by the entropy weight method, and a distance function was introduced to determine the combination coefficient. This paper established a fuzzy comprehensive assessment model of the slope failure risks of construction waste piles and assessed pile slopes in the two dimensions of hazard and vulnerability, taking the root mean square of the hazard and vulnerability assessment results as the final result. The paper then used a construction waste pile slope as an example for analysis, assessed the risks of the four stages of a landfill, verified the assessment model, and analyzed the slope's failure risks and preventive measures against sliding.
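The weighting-and-combination scheme described above can be sketched in a few lines. The combination coefficient alpha, which the paper derives from a distance function, is simply assumed here, and all weights and membership scores are illustrative:

```python
import math

def combine_weights(w_subj, w_obj, alpha=0.6):
    """Combined index weights from subjective (AHP) and objective (entropy)
    weights; alpha is assumed rather than derived from a distance function."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(w_subj, w_obj)]

def dimension_score(weights, memberships):
    """Weighted fuzzy comprehensive score for one dimension (0-1 scale)."""
    return sum(wi * m for wi, m in zip(weights, memberships))

w_ahp = [0.5, 0.3, 0.2]       # subjective weights (illustrative)
w_entropy = [0.3, 0.4, 0.3]   # objective weights (illustrative)
w = combine_weights(w_ahp, w_entropy)

hazard = dimension_score(w, [0.8, 0.6, 0.7])         # illustrative memberships
vulnerability = dimension_score(w, [0.5, 0.4, 0.6])
risk = math.sqrt((hazard ** 2 + vulnerability ** 2) / 2)  # RMS of the two dimensions
print(round(risk, 3))
```

The root-mean-square combination keeps the final risk between the two dimension scores while weighting the larger one more heavily than a plain average would.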

  8. Statistical analysis of earthquakes after the 1999 MW 7.7 Chi-Chi, Taiwan, earthquake based on a modified Reasenberg-Jones model

    NASA Astrophysics Data System (ADS)

    Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq

    2015-12-01

We investigated the temporal-spatial hazard of the earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) and a time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing the earthquake hazard after a large shock. However, we found that the b values in the frequency-magnitude distribution of the earthquakes in the study region dropped sharply from background values after the Chi-Chi shock and then gradually recovered. This observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ (MRJ) model to assess the earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were separately used to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. Receiver Operating Characteristic (ROC) curves (Swets, 1988) finally demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes in the short time after the Chi-Chi shock.
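The Reasenberg-Jones rate the abstract builds on is lambda(t, M) = 10^(a + b*(Mm - M)) / (t + c)^p, the expected rate of aftershocks of magnitude at least M at time t after a mainshock of magnitude Mm. A sketch of the modification, with a hypothetical time-dependent b(t) and illustrative parameter values of the kind tabulated for generic aftershock sequences (not Chi-Chi fits):

```python
import math

def rj_rate(t, M, Mm=7.7, a=-1.67, b=0.91, c=0.05, p=1.08):
    """RJ rate of aftershocks with magnitude >= M at time t (days)."""
    return 10 ** (a + b * (Mm - M)) / (t + c) ** p

def b_of_t(t, b_bg=0.91, b_drop=0.25, tau=30.0):
    """Hypothetical time-dependent b value: depressed just after the
    mainshock, relaxing back to background with timescale tau (days)."""
    return b_bg - b_drop * math.exp(-t / tau)

def mrj_rate(t, M, Mm=7.7, a=-1.67, c=0.05, p=1.08):
    """Modified RJ rate with the time-dependent b substituted in."""
    return 10 ** (a + b_of_t(t) * (Mm - M)) / (t + c) ** p

# Shortly after the mainshock the depressed b value changes the forecast
# rate; a year later b has recovered and the two models nearly agree.
for t in (1.0, 365.0):
    print(t, rj_rate(t, 6.0), mrj_rate(t, 6.0))
```

The exponential relaxation of b(t) is only one plausible shape for the recovery the abstract reports; the point of the sketch is that any b(t) slots directly into the RJ rate formula.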

  9. Student Mental Models of the Greenhouse Effect: Retention Months After Interventions

    NASA Astrophysics Data System (ADS)

    Harris, S. E.; Gold, A. U.

    2013-12-01

    Individual understanding of climate science, and the greenhouse effect in particular, is one factor important for societal decision-making. Ideally, learning opportunities about the greenhouse effect will not only move people toward expert-like ideas but will also have long-lasting effects for those individuals. We assessed university students' mental models of the greenhouse effect before and after specific learning experiences, on a final exam, then again a few months later. Our aim was to measure retention after students had not necessarily been thinking about, nor studying, the greenhouse effect recently. How sticky were the ideas learned? 164 students in an introductory science course participated in a sequence of two learning activities and assessments regarding the greenhouse effect. The first lesson involved the full class, then, for the second lesson, half the students completed a simulation-based activity and the other half completed a data-driven activity. We assessed student thinking through concept sketches, multiple choice and short answer questions. All students generated concept sketches four times, and completed a set of multiple choice (MCQs) and short answer questions twice. Later, 3-4 months after the course ended, 27 students ('retention students') completed an additional concept sketch and answered the questions again, as a retention assessment. These 27 students were nearly evenly split between the two contrasting second lessons in the sequence and included both high and low-achieving students. We then compared student sketches and scores to 'expert' answers. The general pattern over time showed a significant increase in student scores from before the lesson sequence to after, both on concept sketches and MCQs, then an additional increase in concept sketch score on the final exam (MCQs were not asked on the final exam). The scores for the retention students were not significantly different from the full class. 
Within the retention group, there was also no difference in scores based on which contrasting lesson a student did. Students in both of the contrasting lessons scored significantly higher on the retention test than on the initial pre-test. Their concept sketch scores on the retention test were slightly lower than their scores on the final exam (not significantly), but matched their post-lesson-sequence scores. Their MCQ scores were slightly higher on the retention test than on the post-lesson-sequence test (also not significantly). These results imply that students both learned and retained new ideas about the greenhouse effect for at least a few months after the end of the course and did not regress to their pre-lesson ideas. Further analysis should show which particular aspects of student mental models changed over the full temporal sequence.

  10. A comparative assessment of tools for ecosystem services quantification and valuation

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert

    2013-01-01

To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling, and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools' intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time requirements to run seven tools in a first comparative concurrent application of multiple tools to a common location – the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current 'readiness' for widespread application within both public- and private-sector decision-making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.

  11. Measuring sustainable development using a multi-criteria model: a case study.

    PubMed

    Boggia, Antonio; Cortina, Carla

    2010-11-01

This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a certain area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support that they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian region using the MCDA approach. Our results show that MCDA is a suitable approach for sustainability assessment. The results are easy to understand, and the evaluation path is clear and transparent, which is what decision makers need to support their decisions. The multi-criteria evaluation model was developed in keeping with sustainable development economic theory, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. Revisiting thermodynamics and kinetic diffusivities of uranium–niobium with Bayesian uncertainty analysis

    DOE PAGES

    Duong, Thien C.; Hackenberg, Robert E.; Landa, Alex; ...

    2016-09-20

In this paper, thermodynamic and kinetic diffusivities of uranium–niobium (U–Nb) are re-assessed by means of the CALPHAD (CALculation of PHAse Diagram) methodology. In order to improve the consistency and reliability of the assessments, first-principles calculations are coupled with CALPHAD. In particular, heats of formation of γ-U–Nb are estimated and verified using various density-functional theory (DFT) approaches. These thermochemistry data are then used as constraints to guide the thermodynamic optimization process in such a way that the mutual-consistency between first-principles calculations and CALPHAD assessment is satisfactory. In addition, long-term aging experiments are conducted in order to generate new phase equilibria data at the γ2/α+γ2 boundary. These data are meant to verify the thermodynamic model. Assessment results are generally in good agreement with experiments and previous calculations, without showing the artifacts that were observed in previous modeling. The mutual-consistent thermodynamic description is then used to evaluate atomic mobility and diffusivity of γ-U–Nb. Finally, Bayesian analysis is conducted to evaluate the uncertainty of the thermodynamic model and its impact on the system's phase stability.
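The Bayesian uncertainty evaluation described in the final step can be illustrated with a minimal random-walk Metropolis sampler. The synthetic observations, flat prior, and Gaussian likelihood below are assumptions for illustration only, not the CALPHAD model or the U–Nb data:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic "measurements" of a model observable; purely illustrative
x_obs = rng.normal(2.0, 0.5, size=20)

def log_post(theta):
    # flat prior plus Gaussian likelihood with known sigma = 0.5
    return -0.5 * np.sum((x_obs - theta) ** 2 / 0.5 ** 2)

theta, chain = 0.0, []
for _ in range(5000):  # random-walk Metropolis updates
    prop = theta + rng.normal(0.0, 0.2)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
posterior = np.array(chain[1000:])  # discard burn-in
```

The spread of `posterior` is the parameter uncertainty that would then be propagated into phase-stability predictions.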

  13. 75 FR 7029 - Notice of Availability of the Final Environmental Assessment for Solar Roof Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-16

    ... Assessment for Solar Roof Project AGENCY: United States Geological Survey. ACTION: Notice of availability... Final Environmental Assessment for the Solar Roof Project and by this notice is announcing its... Individuals wishing to receive copies of the Environmental Assessment for the Solar Roof Project should...

  14. 77 FR 37432 - Final Springfield Plateau Regional Restoration Plan and Environmental Assessment and Finding of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-21

    ...-FF03E00000] Final Springfield Plateau Regional Restoration Plan and Environmental Assessment and Finding of... Springfield Plateau Regional Restoration Plan (Plan) and Environmental Assessment and Finding of No... Springfield Plateau Regional Restoration Plan and Environmental Assessment (77 FR 1717). The public comment...

  15. 76 FR 77019 - Final Adjusted Assessment of Annual Needs for the List I Chemicals: Ephedrine, Pseudoephedrine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-09

    ... Assessment of Annual Needs for the List I Chemicals: Ephedrine, Pseudoephedrine, and Phenylpropanolamine for...: This notice establishes the Final Adjusted 2011 assessment of annual needs for the List I chemicals... assessment of annual needs represents those quantities of ephedrine, pseudoephedrine, and phenylpropanolamine...

  16. 77 FR 55503 - Final Adjusted Assessment of Annual Needs for the List I Chemicals Ephedrine, Pseudoephedrine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-10

    ... Assessment of Annual Needs for the List I Chemicals Ephedrine, Pseudoephedrine, and Phenylpropanolamine for...: This notice establishes the Final Adjusted 2012 Assessment of Annual Needs for the list I chemicals...: The 2012 Assessment of Annual Needs represents those quantities of ephedrine, pseudoephedrine, and...

  17. An Assessment of Iterative Reconstruction Methods for Sparse Ultrasound Imaging

    PubMed Central

    Valente, Solivan A.; Zibetti, Marcelo V. W.; Pipa, Daniel R.; Maia, Joaquim M.; Schneider, Fabio K.

    2017-01-01

Ultrasonic image reconstruction using inverse problems has recently appeared as an alternative to enhance ultrasound imaging over beamforming methods. This approach depends on the accuracy of the acquisition model used to represent transducers, reflectivity, and medium physics. Iterative methods, well known in general sparse signal reconstruction, are also suited for imaging. In this paper, a discrete acquisition model is assessed by solving a linear system of equations by an ℓ1-regularized least-squares minimization, where the solution sparsity may be adjusted as desired. The paper surveys 11 variants of four well-known algorithms for sparse reconstruction, and assesses their optimization parameters with the goal of finding the best approach for iterative ultrasound imaging. The strategy for the model evaluation consists of using two distinct datasets. We first generate data from a synthetic phantom that mimics real targets inside a professional ultrasound phantom device. This dataset is contaminated with Gaussian noise with an estimated SNR, and all methods are assessed by their resulting images and performances. The model and methods are then assessed with real data collected by a research ultrasound platform when scanning the same phantom device, and results are compared with beamforming. A distinct real dataset is finally used to further validate the proposed modeling. Although high computational effort is required by iterative methods, results show that the discrete model may lead to images closer to ground-truth than traditional beamforming. However, computing capabilities of current platforms need to evolve before frame rates currently delivered by ultrasound equipment are achievable. PMID:28282862
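One member of the algorithm families surveyed here, ISTA (iterative shrinkage-thresholding), solves exactly this ℓ1-regularized least-squares problem and can be sketched in a few lines. The random operator, sparse target, and parameter values below are illustrative assumptions, not the paper's acquisition model:

```python
import numpy as np

def soft_threshold(x, t):
    # proximal operator of the l1 norm
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    # minimize 0.5*||A x - y||^2 + lam*||x||_1
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))          # toy acquisition model
x_true = np.zeros(40)
x_true[[3, 10]] = [1.0, -2.0]              # sparse "reflectivity"
x_hat = ista(A, A @ x_true, lam=0.1)
```

Raising `lam` increases the sparsity of the solution, which is the adjustment the abstract refers to.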

  18. The Value of GRACE Data in Improving, Assessing and Evaluating Land Surface and Climate Models

    NASA Astrophysics Data System (ADS)

    Yang, Z.

    2011-12-01

I will review how the Gravity Recovery and Climate Experiment (GRACE) satellite measurements have improved land surface models that are developed for weather, climate, and hydrological studies. GRACE-derived terrestrial water storage (TWS) changes have been successfully used to assess and evaluate the improved representations of land-surface hydrological processes such as groundwater-soil moisture interaction, frozen soil and infiltration, and the topographic control on runoff production, as evident in the simulations from the latest Noah-MP, the Community Land Model, and the Community Climate System Model. GRACE data sets have made it possible to estimate key terrestrial water storage components (snow mass, surface water, groundwater or water table depth), biomass, and surface water fluxes (evapotranspiration, solid precipitation, melt of snow/ice). Many of the examples will draw from my Land, Environment and Atmosphere Dynamics group's work on land surface model developments, snow mass retrieval, and multi-sensor snow data assimilation using the ensemble Kalman filter and the ensemble Kalman smoother. Finally, I will briefly outline some future directions in using GRACE in land surface modeling.
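A single analysis step of the stochastic ensemble Kalman filter used in such data assimilation can be sketched as follows; the toy scalar state, observation operator, and error statistics are assumptions for illustration, not the snow-assimilation system described above:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One stochastic EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble; y: observation vector."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                        # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    # perturbed observations keep the analysis ensemble spread consistent
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
X = rng.standard_normal((1, 500))                    # prior: 500 members near 0
Xa = enkf_update(X, np.array([2.0]), np.array([[1.0]]), np.array([[0.01]]), rng)
```

With an accurate observation (variance 0.01) of a state near 2, the analysis ensemble mean moves close to 2 and its spread shrinks, which is the mechanism by which GRACE TWS observations correct a model's water-storage states.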

  19. Introduction to “Global tsunami science: Past and future, Volume I”

    USGS Publications Warehouse

    Geist, Eric L.; Fritz, Hermann; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-01-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  20. Research on the influence of parking charging strategy based on multi-level extension theory of group decision making

    NASA Astrophysics Data System (ADS)

    Cheng, Fen; Hu, Wanxin

    2017-05-01

Drawing on an analysis of domestic and international experience with parking policy, an impact-analysis process for parking strategy is designed. First, group decision theory is used to create a parking strategy index system and to calculate its weights; the index system covers government, parking operators, and travelers. Then, multi-level extension theory is used to analyze the CBD parking strategy, assessing it by calculating the correlation degree of each indicator. Finally, a parking-charge strategy is assessed through a case study, providing a scientific and reasonable basis for evaluating parking strategy. The results show that the model can effectively evaluate multi-target, multi-attribute parking policies.

  1. Introduction to "Global Tsunami Science: Past and Future, Volume I"

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Fritz, Hermann M.; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-12-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  2. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes that are constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both alternative submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater).
Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in north‐central Texas near Dallas–Fort Worth. The chance of having levels of ground motions corresponding to modified Mercalli intensity (MMI) VI or greater earthquake shaking is 2%–12% per year in north‐central Oklahoma, southern Kansas, and the New Madrid region, similar to the chance of damage at sites in high‐hazard portions of California caused by natural earthquakes. Hazard is also significant in the Raton basin of Colorado/New Mexico; north‐central Arkansas; Dallas–Fort Worth, Texas; and in a few other areas. Hazard probabilities are much lower (by about half or more) for exceeding MMI VII or VIII. Hazard is 3‐ to 10‐fold higher near some areas of active induced earthquakes than in the 2014 USGS National Seismic Hazard Model (NSHM), which did not consider induced earthquakes. This study, in conjunction with the LandScan™ database (2013), indicates that about 8 million people live in areas of active injection wells that have a greater than 1% chance of experiencing damaging ground shaking (MMI≥VI) in 2016. The final model has high uncertainty, and engineers, regulators, and industry should use these assessments cautiously to make informed decisions on mitigating the potential effects of induced and natural earthquakes.
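Hazard levels of the kind quoted above (e.g. "1% probability of exceedance in 1 year") interconvert with Poisson occurrence rates in the standard way. A small sketch, where the 10%-in-50-years example is the conventional building-code level rather than a figure from this study:

```python
import math

def exceedance_prob(rate, t_years):
    # Poisson model: P(at least one exceedance in t years)
    return 1.0 - math.exp(-rate * t_years)

def rate_from_prob(p, t_years):
    # inverse relation; 1/rate is the mean return period in years
    return -math.log(1.0 - p) / t_years

rate_1pct = rate_from_prob(0.01, 1.0)              # ~0.01005 events/year
return_period = 1.0 / rate_from_prob(0.10, 50.0)   # ~475 years
```

For small probabilities the annual rate and the annual probability nearly coincide, which is why one-year hazard maps can quote either interchangeably.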

  3. Using the Entrustable Professional Activities Framework in the Assessment of Procedural Skills.

    PubMed

    Pugh, Debra; Cavalcanti, Rodrigo B; Halman, Samantha; Ma, Irene W Y; Mylopoulos, Maria; Shanks, David; Stroud, Lynfa

    2017-04-01

    The entrustable professional activity (EPA) framework has been identified as a useful approach to assessment in competency-based education. To apply an EPA framework for assessment, essential skills necessary for entrustment to occur must first be identified. Using an EPA framework, our study sought to (1) define the essential skills required for entrustment for 7 bedside procedures expected of graduates of Canadian internal medicine (IM) residency programs, and (2) develop rubrics for the assessment of these procedural skills. An initial list of essential skills was defined for each procedural EPA by focus groups of experts at 4 academic centers using the nominal group technique. These lists were subsequently vetted by representatives from all Canadian IM training programs through a web-based survey. Consensus (more than 80% agreement) about inclusion of each item was sought using a modified Delphi exercise. Qualitative survey data were analyzed using a framework approach to inform final assessment rubrics for each procedure. Initial lists of essential skills for procedural EPAs ranged from 10 to 24 items. A total of 111 experts completed the national survey. After 2 iterations, consensus was reached on all items. Following qualitative analysis, final rubrics were created, which included 6 to 10 items per procedure. These EPA-based assessment rubrics represent a national consensus by Canadian IM clinician educators. They provide a practical guide for the assessment of procedural skills in a competency-based education model, and a robust foundation for future research on their implementation and evaluation.

  4. Deriving user-informed climate information from climate model ensemble results

    NASA Astrophysics Data System (ADS)

    Huebener, Heike; Hoffmann, Peter; Keuler, Klaus; Pfeifer, Susanne; Ramthun, Hans; Spekat, Arne; Steger, Christian; Warrach-Sagi, Kirsten

    2017-07-01

    Communication between providers and users of climate model simulation results still needs to be improved. In the German regional climate modeling project ReKliEs-De a midterm user workshop was conducted to allow the intended users of the project results to assess the preliminary results and to streamline the final project results to their needs. The user feedback highlighted, in particular, the still considerable gap between climate research output and user-tailored input for climate impact research. Two major requests from the user community addressed the selection of sub-ensembles and some condensed, easy to understand information on the strengths and weaknesses of the climate models involved in the project.

  5. Minimum resolvable power contrast model

    NASA Astrophysics Data System (ADS)

    Qian, Shuai; Wang, Xia; Zhou, Jingjing

    2018-01-01

The signal-to-noise ratio and the MTF are important indices for evaluating the performance of optical systems. However, neither used alone nor assessed jointly can they intuitively describe the overall performance of the system. Therefore, an index is proposed to reflect comprehensive system performance: the Minimum Resolvable Radiation Performance Contrast (MRP) model. MRP is an evaluation model that does not rely on the human eye. It starts from the radiance of the target and the background, transforms the target and background into equivalent strips, and accounts for attenuation by the atmosphere, the optical imaging system, and the detector. Combining the signal-to-noise ratio with the MTF yields the Minimum Resolvable Radiation Performance Contrast. Finally, the detection probability model of MRP is given.

  6. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
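Ingredients (1) and (3) of the methodology can be sketched together: Latin hypercube samples of two hypothetical model parameters drive a synthetic object feature, and multiple linear regression recovers the sensitivities. All numbers are illustrative assumptions, not values from the weather-model study:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng):
    # one sample per equal-probability stratum for each parameter,
    # with the strata randomly paired across parameters
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

rng = np.random.default_rng(0)
theta = latin_hypercube(200, 2, rng)          # two model parameters in [0, 1]
# synthetic object feature with known sensitivities (3 and -1) plus noise
feature = 3.0 * theta[:, 0] - theta[:, 1] + 0.1 * rng.standard_normal(200)
X = np.column_stack([np.ones(200), theta])
coef, *_ = np.linalg.lstsq(X, feature, rcond=None)  # multiple linear regression
```

The fitted slopes estimate how strongly each parameter moves the feature; repeating the regression per feature and per forecast object, with multiple-testing control, gives the box plots and confidence intervals the methodology produces.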

  7. Stochastic Technology Choice Model for Consequential Life Cycle Assessment.

    PubMed

    Kätelhön, Arne; Bardow, André; Suh, Sangwon

    2016-12-06

    Discussions on Consequential Life Cycle Assessment (CLCA) have relied largely on partial or general equilibrium models. Such models are useful for integrating market effects into CLCA, but also have well-recognized limitations such as the poor granularity of the sectoral definition and the assumption of perfect oversight by all economic agents. Building on the Rectangular-Choice-of-Technology (RCOT) model, this study proposes a new modeling approach for CLCA, the Technology Choice Model (TCM). In this approach, the RCOT model is adapted for its use in CLCA and extended to incorporate parameter uncertainties and suboptimal decisions due to market imperfections and information asymmetry in a stochastic setting. In a case study on rice production, we demonstrate that the proposed approach allows modeling of complex production technology mixes and their expected environmental outcomes under uncertainty, at a high level of detail. Incorporating the effect of production constraints, uncertainty, and suboptimal decisions by economic agents significantly affects technology mixes and associated greenhouse gas (GHG) emissions of the system under study. The case study also shows the model's ability to determine both the average and marginal environmental impacts of a product in response to changes in the quantity of final demand.
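The flavor of such a technology-choice model can be sketched with a deliberately tiny example: two hypothetical rice-production technologies, a capacity constraint, uncertain costs, and the resulting distribution of GHG outcomes. Every number is an assumption, and the greedy cost ordering stands in for the RCOT linear program, to which this single-product case reduces:

```python
import numpy as np

rng = np.random.default_rng(1)
demand = 100.0                          # units of final demand
caps = np.array([60.0, np.inf])         # capacity limit on technology 0
ghg = np.array([1.5, 0.9])              # assumed kg CO2e per unit
n_draws = 2000
ghg_total = np.empty(n_draws)
for k in range(n_draws):
    # uncertain unit costs; the cheaper technology is used first, up to capacity
    cost = np.array([rng.normal(10.0, 1.0), rng.normal(12.0, 1.0)])
    remaining, total = demand, 0.0
    for i in np.argsort(cost):
        q = min(remaining, caps[i])
        total += q * ghg[i]
        remaining -= q
        if remaining <= 0.0:
            break
    ghg_total[k] = total
```

The emission outcome is bimodal: usually technology 0 runs at capacity with technology 1 covering the marginal demand, but when the cost draw flips the ranking the whole demand shifts, illustrating how parameter uncertainty propagates into the technology mix and its GHG total.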

  8. Bristol Bay Assessment - Final Report (2014)

    EPA Pesticide Factsheets

    This is the final Bristol Bay assessment developed and peer reviewed by the Office of Research and Development in EPA. The purpose of this assessment is to provide a characterization of the biological and mineral resources of the Bristol Bay watershed.

  9. Pulse Jet Mixing Tests With Noncohesive Solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Perry A.; Bamberger, Judith A.; Enderlin, Carl W.

    2012-02-17

This report summarizes results from pulse jet mixing (PJM) tests with noncohesive solids in Newtonian liquid. The tests were conducted during FY 2007 and 2008 to support the design of mixing systems for the Hanford Waste Treatment and Immobilization Plant (WTP). Tests were conducted at three geometric scales using noncohesive simulants, and the test data were used to develop models predicting two measures of mixing performance for full-scale WTP vessels. The models predict the cloud height (the height to which solids will be lifted by the PJM action) and the critical suspension velocity (the minimum velocity needed to ensure all solids are suspended off the floor, though not fully mixed). From the cloud height, the concentration of solids at the pump inlet can be estimated. The predicted critical suspension velocity for lifting all solids is not precisely the same as the mixing requirement for 'disturbing' a sufficient volume of solids, but the values will be similar and closely related. These predictive models were successfully benchmarked against larger scale tests and compared well with results from computational fluid dynamics simulations. The application of the models to assess mixing in WTP vessels is illustrated in examples for 13 distinct designs and selected operational conditions. The values selected for these examples are not final; thus, the estimates of performance should not be interpreted as final conclusions of design adequacy or inadequacy. However, this work does reveal that several vessels may require adjustments to design, operating features, or waste feed properties to ensure confidence in operation. The models described in this report will prove to be valuable engineering tools to evaluate options as designs are finalized for the WTP. Revision 1 refines data sets used for model development and summarizes models developed since the completion of Revision 0.

  10. The Risk Assessment Study for Electric Power Marketing Competitiveness Based on Cloud Model and TOPSIS

    NASA Astrophysics Data System (ADS)

    Li, Cunbin; Wang, Yi; Lin, Shuaishuai

    2017-09-01

With the rapid development of the energy internet and the deepening of electric power reform, the traditional marketing mode no longer suits most electric power enterprises, which must therefore seek a breakthrough. However, in the face of increasingly complex marketing information, making a quick and reasonable transformation while keeping the assessment of electric power marketing competitiveness accurate and objective has become a major problem. In this paper, a method combining the cloud model and TOPSIS is proposed. First, an evaluation index system for electric power marketing competitiveness is built. Then, the cloud model is used to transform qualitative evaluations of the marketing data into quantitative values, and the entropy weight method is used to weaken the influence of subjective factors on the index weights. Finally, the closeness degrees of the alternatives are obtained by the TOPSIS method. This method provides a novel solution for evaluating electric power marketing competitiveness. A case analysis verifies the effectiveness and feasibility of the model.
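The entropy-weighting and TOPSIS steps can be sketched as follows. The decision matrix is an illustrative assumption (alternatives in rows, benefit-type criteria in columns), and the cloud-model transformation of qualitative ratings into these crisp scores is omitted:

```python
import numpy as np

def entropy_weights(M):
    # objective weights: criteria with more dispersion get more weight
    # (M must be strictly positive)
    p = M / M.sum(axis=0)
    e = -(p * np.log(p)).sum(axis=0) / np.log(M.shape[0])
    d = 1.0 - e                             # degree of diversification
    return d / d.sum()

def topsis(M, w):
    v = w * M / np.linalg.norm(M, axis=0)   # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)          # closeness to the ideal solution

M = np.array([[1.0, 2.0],     # alternative 1
              [3.0, 4.0],     # alternative 2 dominates on both criteria
              [2.0, 3.0]])    # alternative 3
c = topsis(M, entropy_weights(M))
```

A dominating alternative coincides with the ideal solution and so receives closeness 1; ranking the alternatives by `c` is the final evaluation step.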

  11. An Integrated MCDM Model for Conveyor Equipment Evaluation and Selection in an FMC Based on a Fuzzy AHP and Fuzzy ARAS in the Presence of Vagueness.

    PubMed

    Nguyen, Huu-Tho; Dawal, Siti Zawiah Md; Nukman, Yusoff; Rifai, Achmad P; Aoyama, Hideki

    2016-01-01

    The conveyor system plays a vital role in improving the performance of flexible manufacturing cells (FMCs). The conveyor selection problem involves the evaluation of a set of potential alternatives based on qualitative and quantitative criteria. This paper presents an integrated multi-criteria decision making (MCDM) model of a fuzzy AHP (analytic hierarchy process) and fuzzy ARAS (additive ratio assessment) for conveyor evaluation and selection. In this model, linguistic terms represented as triangular fuzzy numbers are used to quantify experts' uncertain assessments of alternatives with respect to the criteria. The fuzzy set is then integrated into the AHP to determine the weights of the criteria. Finally, a fuzzy ARAS is used to calculate the weights of the alternatives. To demonstrate the effectiveness of the proposed model, a case study is performed of a practical example, and the results obtained demonstrate practical potential for the implementation of FMCs.
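A minimal sketch of the fuzzy ARAS portion, assuming a hypothetical linguistic scale of triangular fuzzy numbers, centroid defuzzification, and criteria weights as if they came from a fuzzy AHP; none of these values are from the paper:

```python
import numpy as np

# linguistic terms as triangular fuzzy numbers (l, m, u); scale is an assumption
SCALE = {"poor": (1, 1, 3), "fair": (1, 3, 5), "good": (3, 5, 7), "very good": (5, 7, 9)}

def defuzzify(t):
    # centroid of a triangular fuzzy number
    return sum(t) / 3.0

ratings = [["good", "very good"],     # conveyor alternative A
           ["fair", "good"],          # conveyor alternative B
           ["very good", "fair"]]     # conveyor alternative C
w = np.array([0.6, 0.4])              # criteria weights (as if from a fuzzy AHP)

M = np.array([[defuzzify(SCALE[r]) for r in row] for row in ratings])
M = np.vstack([M.max(axis=0), M])     # row 0: the "optimal" alternative
P = M / M.sum(axis=0)                 # ARAS sum normalization (benefit criteria)
S = (P * w).sum(axis=1)               # overall performance ratings
K = S[1:] / S[0]                      # utility degree relative to the optimum
```

The alternative with the highest utility degree `K` is selected; here A ranks first because it scores well on the more heavily weighted criterion without being weak on the other.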

  12. An Integrated MCDM Model for Conveyor Equipment Evaluation and Selection in an FMC Based on a Fuzzy AHP and Fuzzy ARAS in the Presence of Vagueness

    PubMed Central

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; P. Rifai, Achmad; Aoyama, Hideki

    2016-01-01

    The conveyor system plays a vital role in improving the performance of flexible manufacturing cells (FMCs). The conveyor selection problem involves the evaluation of a set of potential alternatives based on qualitative and quantitative criteria. This paper presents an integrated multi-criteria decision making (MCDM) model of a fuzzy AHP (analytic hierarchy process) and fuzzy ARAS (additive ratio assessment) for conveyor evaluation and selection. In this model, linguistic terms represented as triangular fuzzy numbers are used to quantify experts’ uncertain assessments of alternatives with respect to the criteria. The fuzzy set is then integrated into the AHP to determine the weights of the criteria. Finally, a fuzzy ARAS is used to calculate the weights of the alternatives. To demonstrate the effectiveness of the proposed model, a case study is performed of a practical example, and the results obtained demonstrate practical potential for the implementation of FMCs. PMID:27070543

  13. Universal model of slow pyrolysis technology producing biochar and heat from standard biomass needed for the techno-economic assessment.

    PubMed

    Klinar, Dušan

    2016-04-01

Biochar, as a soil amendment and carbon sink, has recently become one of the most interesting products of slow pyrolysis. The simplest and most widely used industrial process arrangement produces biochar and heat at the same time. The proposed mass and heat balance model consists of heat consumers (the heat demand side) and a heat generation-supply side. Heat is provided by direct burning of all uncondensed volatiles generated from the biomass. Calculating the mass and heat balance of both sides reveals the internal distribution of mass and energy among the process streams and units. Thermodynamic calculations verified not only the concept but also the numerical range of the results. Comparisons with recently published scientific and vendor data prove its general applicability and reliability. The model opens the possibility for process efficiency innovations. Finally, the model was adapted to produce results more favorable to investors and to fully support techno-economic assessments. Copyright © 2016 Elsevier Ltd. All rights reserved.
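The core balance idea, heat supplied by burning the uncondensed volatiles set against the reactor's heat demand, can be sketched in a few lines. Every number below is an illustrative assumption, not data from the model:

```python
# all values are assumed for illustration (per hour of operation)
feed = 1000.0           # kg dry biomass
char_yield = 0.30       # kg biochar per kg biomass
hhv_biomass = 18.0      # MJ/kg, higher heating value of the feed
hhv_char = 30.0         # MJ/kg, higher heating value of the biochar
reactor_demand = 1.0    # MJ per kg biomass, heat needed to drive pyrolysis

# energy leaving in volatiles = energy in feed minus energy retained in char
volatiles_energy = feed * hhv_biomass - feed * char_yield * hhv_char
net_heat = volatiles_energy - feed * reactor_demand   # exportable heat, MJ/h
```

A positive `net_heat` is the saleable heat co-product that, together with the biochar, drives the techno-economic assessment.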

  14. Large-scale functional models of visual cortex for remote sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brumby, Steven P; Kenyon, Garrett; Rasmussen, Craig E

Neuroscience has revealed many properties of neurons and of the functional organization of visual cortex that are believed to be essential to human vision, but are missing in standard artificial neural networks. Equally important may be the sheer scale of visual cortex requiring ~1 petaflop of computation. In a year, the retina delivers ~1 petapixel to the brain, leading to massively large opportunities for learning at many levels of the cortical system. We describe work at Los Alamos National Laboratory (LANL) to develop large-scale functional models of visual cortex on LANL's Roadrunner petaflop supercomputer. An initial run of a simple region V1 code achieved 1.144 petaflops during trials at the IBM facility in Poughkeepsie, NY (June 2008). Here, we present criteria for assessing when a set of learned local representations is 'complete' along with general criteria for assessing computer vision models based on their projected scaling behavior. Finally, we extend one class of biologically-inspired learning models to problems of remote sensing imagery.

  15. Increasing URM Undergraduate Student Success through Assessment-Driven Interventions: A Multiyear Study Using Freshman-Level General Biology as a Model System.

    PubMed

    Carmichael, Mary C; St Clair, Candace; Edwards, Andrea M; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale

    2016-01-01

    Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom's taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. © 2016 M. C. Carmichael et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. Perceived threat and corroboration: key factors that improve a predictive model of trust in internet-based health information and advice.

    PubMed

    Harris, Peter R; Sillence, Elizabeth; Briggs, Pam

    2011-07-27

    How do people decide which sites to use when seeking health advice online? We can assume, from related work in e-commerce, that general design factors known to affect trust in the site are important, but in this paper we also address the impact of factors specific to the health domain. The current study aimed to (1) assess the factorial structure of a general measure of Web trust, (2) model how the resultant factors predicted trust in, and readiness to act on, the advice found on health-related websites, and (3) test whether adding variables from social cognition models to capture elements of the response to threatening, online health-risk information enhanced the prediction of these outcomes. Participants were asked to recall a site they had used to search for health-related information and to think of that site when answering an online questionnaire. The questionnaire consisted of a general Web trust questionnaire plus items assessing appraisals of the site, including threat appraisals, information checking, and corroboration. It was promoted on the hungersite.com website. The URL was distributed via Yahoo and local print media. We assessed the factorial structure of the measures using principal components analysis and modeled how well they predicted the outcome measures using structural equation modeling (SEM) with EQS software. We report an analysis of the responses of participants who searched for health advice for themselves (N = 561). Analysis of the general Web trust questionnaire revealed 4 factors: information quality, personalization, impartiality, and credible design. In the final SEM model, information quality and impartiality were direct predictors of trust. However, variables specific to eHealth (perceived threat, coping, and corroboration) added substantially to the ability of the model to predict variance in trust and readiness to act on advice on the site. 
The final model achieved a satisfactory fit: χ²(5) = 10.8 (P = .21), comparative fit index = .99, root mean square error of approximation = .052. The model accounted for 66% of the variance in trust and 49% of the variance in readiness to act on the advice. Adding variables specific to eHealth enhanced the ability of a model of trust to predict trust and readiness to act on advice.
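
    As a rough illustration of the fit statistics reported above, the RMSEA can be approximated from the model chi-square, its degrees of freedom, and the sample size. SEM packages use slightly different variants of the formula, so the value below need not match the published .052 exactly:

```python
import math

def rmsea(chi2, df, n):
    """One common population-based approximation of the root mean square
    error of approximation, truncated at zero for well-fitting models."""
    return math.sqrt(max(chi2 / df - 1.0, 0.0) / (n - 1))

# Using the figures from the abstract (chi-square 10.8 on 5 df, N = 561);
# a different denominator convention (N vs. N-1) shifts the result slightly.
fit = rmsea(10.8, 5, 561)
```

    Values below roughly .05-.06 are conventionally read as close fit, consistent with the satisfactory fit reported in the abstract.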

  17. Perceived Threat and Corroboration: Key Factors That Improve a Predictive Model of Trust in Internet-based Health Information and Advice

    PubMed Central

    Harris, Peter R; Briggs, Pam

    2011-01-01

    Background How do people decide which sites to use when seeking health advice online? We can assume, from related work in e-commerce, that general design factors known to affect trust in the site are important, but in this paper we also address the impact of factors specific to the health domain. Objective The current study aimed to (1) assess the factorial structure of a general measure of Web trust, (2) model how the resultant factors predicted trust in, and readiness to act on, the advice found on health-related websites, and (3) test whether adding variables from social cognition models to capture elements of the response to threatening, online health-risk information enhanced the prediction of these outcomes. Methods Participants were asked to recall a site they had used to search for health-related information and to think of that site when answering an online questionnaire. The questionnaire consisted of a general Web trust questionnaire plus items assessing appraisals of the site, including threat appraisals, information checking, and corroboration. It was promoted on the hungersite.com website. The URL was distributed via Yahoo and local print media. We assessed the factorial structure of the measures using principal components analysis and modeled how well they predicted the outcome measures using structural equation modeling (SEM) with EQS software. Results We report an analysis of the responses of participants who searched for health advice for themselves (N = 561). Analysis of the general Web trust questionnaire revealed 4 factors: information quality, personalization, impartiality, and credible design. In the final SEM model, information quality and impartiality were direct predictors of trust. However, variables specific to eHealth (perceived threat, coping, and corroboration) added substantially to the ability of the model to predict variance in trust and readiness to act on advice on the site. 
The final model achieved a satisfactory fit: χ²(5) = 10.8 (P = .21), comparative fit index = .99, root mean square error of approximation = .052. The model accounted for 66% of the variance in trust and 49% of the variance in readiness to act on the advice. Conclusions Adding variables specific to eHealth enhanced the ability of a model of trust to predict trust and readiness to act on advice. PMID:21795237

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shum, D.K.M.

    This paper examines various issues that would impact the incorporation of warm prestress (WPS) effects in the fracture-margin assessment of reactor pressure vessels (RPVs). By way of an example problem, possible beneficial effects of including type-I WPS in the assessment of an RPV subjected to a small-break loss-of-coolant accident are described. In addition, the need to consider possible loss-of-constraint effects when interpreting available small-specimen WPS-enhanced fracture toughness data is demonstrated through two- and three-dimensional local crack-tip field analyses of a compact tension specimen. Finally, a hybrid correlative-predictive model of WPS based on J-Q theory and the Ritchie-Knott-Rice model is applied to a small-scale yielding boundary layer formulation to investigate near crack-tip fields under varying degrees of loading and unloading.

  19. Bayesian inference in an item response theory model with a generalized student t link function

    NASA Astrophysics Data System (ADS)

    Azevedo, Caio L. N.; Migon, Helio S.

    2012-10-01

    In this paper we introduce a new item response theory (IRT) model with a generalized Student t-link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom play a role similar to the discrimination parameter. However, the behavior of the GtL curves differs from that of the two-parameter models and the usual Student t link, since in GtL the curves obtained for different df's can cross the probit curves at more than one latent trait level. The GtL model has properties similar to generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment, and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We consider a prior sensitivity choice concerning the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
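
    A minimal sketch of the GtL-style item response function, assuming (as the abstract states) a difficulty-only model with P(correct) = T_df(θ − b). The Student t CDF is evaluated here by simple numerical integration rather than any particular library routine:

```python
import math

def t_pdf(x, df):
    """Density of the Student t distribution with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def t_cdf(x, df, lo=-30.0, n=30000):
    """Student t CDF via trapezoidal integration of the density;
    the mass below `lo` is negligible for moderate df."""
    h = (x - lo) / n
    s = 0.5 * (t_pdf(lo, df) + t_pdf(x, df))
    for i in range(1, n):
        s += t_pdf(lo + i * h, df)
    return s * h

def irt_prob(theta, b, df):
    """GtL-style item response function: probability of a correct
    response given latent trait theta, item difficulty b, and df."""
    return t_cdf(theta - b, df)
```

    Smaller df values produce heavier tails and flatter curves, which is how the df plays a discrimination-like role in the GtL model.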

  20. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to create a practical tool providing model users with a structured view into the validation status of HE decision models, to address this trade-off. A Delphi panel was organized, and was completed by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation in the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  1. An improved probit method for assessment of domino effect to chemical process equipment caused by overpressure.

    PubMed

    Mingguang, Zhang; Juncheng, Jiang

    2008-10-30

    Overpressure is an important cause of domino effects in accidents involving chemical process equipment. Damage probability and the relative threshold value are two necessary parameters in quantitative risk assessment (QRA) of this phenomenon. Some simple models had been proposed based on scarce data or oversimplified assumptions. Hence, more data about damage to chemical process equipment were gathered and analyzed, a quantitative relationship between damage probability and the damage degree of equipment was built, and reliable probit models were developed for specific categories of chemical process equipment. Finally, the improvements of the present models were evidenced through comparison with other models in the literature, taking into account parameters such as consistency between models and data and depth of quantitativeness in QRA.
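
    In QRA practice, a probit model of the kind described above maps a dose (here peak overpressure) to a damage probability through the standard normal CDF. The coefficients below are hypothetical placeholders; the paper fits equipment-specific values against collected damage data:

```python
import math
from statistics import NormalDist

# Hypothetical probit coefficients, for illustration only; real QRA
# models fit A and B per equipment category (e.g. atmospheric tanks).
A, B = -16.0, 2.0

def damage_probability(overpressure_pa, a=A, b=B):
    """Classic QRA probit: Y = a + b*ln(dose), P = Phi(Y - 5).
    Returns the probability of equipment damage at a given overpressure."""
    y = a + b * math.log(overpressure_pa)
    return NormalDist().cdf(y - 5.0)
```

    The threshold value discussed in the abstract corresponds to the dose at which this probability becomes non-negligible, e.g. solving P = 0.01 for the overpressure.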

  2. High-fidelity meshes from tissue samples for diffusion MRI simulations.

    PubMed

    Panagiotaki, Eleftheria; Hall, Matt G; Zhang, Hui; Siow, Bernard; Lythgoe, Mark F; Alexander, Daniel C

    2010-01-01

    This paper presents a method for constructing detailed geometric models of tissue microstructure for synthesizing realistic diffusion MRI data. We construct three-dimensional mesh models from confocal microscopy image stacks using the marching cubes algorithm. Random-walk simulations within the resulting meshes provide synthetic diffusion MRI measurements. Experiments optimise simulation parameters and complexity of the meshes to achieve accuracy and reproducibility while minimizing computation time. Finally we assess the quality of the synthesized data from the mesh models by comparison with scanner data as well as synthetic data from simple geometric models and simplified meshes that vary only in two dimensions. The results support the extra complexity of the three-dimensional mesh compared to simpler models although sensitivity to the mesh resolution is quite robust.
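
    The random-walk principle behind the synthetic measurements can be illustrated for unrestricted diffusion, where the Monte Carlo signal should recover E(b) = exp(-bD). This free-diffusion sketch omits the mesh geometry entirely and is not the paper's simulator:

```python
import math, random

def simulate_signal(bvalue, D=2.0e-9, n_walkers=5000, n_steps=100, dt=1e-4):
    """Monte Carlo random walk for free 1-D diffusion; returns the
    diffusion-weighted signal E(b) = <cos(q * net displacement)>,
    with q chosen so that q^2 * (total time) = b."""
    random.seed(0)
    sigma = math.sqrt(2 * D * dt)       # per-step displacement std
    total_time = n_steps * dt
    q = math.sqrt(bvalue / total_time)
    acc = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += random.gauss(0.0, sigma)
        acc += math.cos(q * x)
    return acc / n_walkers

# For free diffusion the simulated signal should follow exp(-b * D);
# restricting the walk inside a mesh is what changes this curve.
```

    Replacing the unrestricted step with a step that is rejected or reflected at mesh faces is the essential extra ingredient of mesh-based substrates.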

  3. Antenna gain of actively compensated free-space optical communication systems under strong turbulence conditions.

    PubMed

    Juarez, Juan C; Brown, David M; Young, David W

    2014-05-19

    Current Strehl ratio models for actively compensated free-space optical communications terminals do not accurately predict system performance under strong turbulence conditions as they are based on weak turbulence theory. For evaluation of compensated systems, we present an approach for simulating the Strehl ratio with both low-order (tip/tilt) and higher-order (adaptive optics) correction. Our simulation results are then compared to the published models and their range of turbulence validity is assessed. Finally, we propose a new Strehl ratio model and antenna gain equation that are valid for general turbulence conditions independent of the degree of compensation.
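
    For context, a widely used long-term-exposure approximation relates the uncompensated Strehl ratio to the ratio of aperture diameter D to the Fried parameter r0, and the effective antenna gain is the diffraction-limited gain scaled by that Strehl ratio. This sketch is generic background, not the new model proposed in the paper:

```python
import math

def strehl_uncompensated(d_over_r0):
    """Common long-term-exposure approximation for the uncompensated
    Strehl ratio as a function of D/r0 (aperture over Fried parameter)."""
    return (1.0 + d_over_r0 ** (5.0 / 3.0)) ** (-6.0 / 5.0)

def antenna_gain_db(diameter_m, wavelength_m, d_over_r0):
    """Diffraction-limited gain (pi*D/lambda)^2 reduced by the Strehl
    ratio, expressed in dB."""
    g0 = (math.pi * diameter_m / wavelength_m) ** 2
    return 10.0 * math.log10(g0 * strehl_uncompensated(d_over_r0))
```

    Tip/tilt or adaptive-optics correction raises the Strehl ratio above this uncompensated baseline, which is the regime the paper's new model targets under strong turbulence.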

  4. Artificial neural networks modelling the prednisolone nanoprecipitation in microfluidic reactors.

    PubMed

    Ali, Hany S M; Blagden, Nicholas; York, Peter; Amani, Amir; Brook, Toni

    2009-06-28

    This study employs artificial neural networks (ANNs) to create a model to identify relationships between variables affecting drug nanoprecipitation using microfluidic reactors. The input variables examined were saturation levels of prednisolone, solvent and antisolvent flow rates, microreactor inlet angles and internal diameters, while particle size was the single output. ANNs software was used to analyse a set of data obtained by random selection of the variables. The developed model was then assessed using a separate set of validation data and provided good agreement with the observed results. The antisolvent flow rate was found to have the dominant role on determining final particle size.
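
    A single-hidden-layer network of the kind used for such input-to-particle-size mapping can be sketched in plain Python. The training data below (normalized antisolvent flow rate versus normalized particle size) are hypothetical and for illustration only; the study used dedicated ANN software with five inputs:

```python
import math, random

random.seed(1)

def train(data, hidden=4, lr=0.1, epochs=3000):
    """Tiny single-hidden-layer regression network (tanh units),
    trained by per-sample gradient descent on squared error."""
    w1 = [random.uniform(-1, 1) for _ in range(hidden)]  # input -> hidden
    b1 = [0.0] * hidden
    w2 = [random.uniform(-1, 1) for _ in range(hidden)]  # hidden -> output
    b2 = 0.0
    for _ in range(epochs):
        for x, y in data:
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = out - y
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)   # backprop to hidden
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
    def predict(x):
        return sum(w2[j] * math.tanh(w1[j] * x + b1[j])
                   for j in range(hidden)) + b2
    return predict

# Hypothetical normalized data: antisolvent flow rate -> particle size,
# mimicking the dominant inverse relationship the study reports.
data = [(0.0, 1.0), (0.25, 0.7), (0.5, 0.45), (0.75, 0.3), (1.0, 0.25)]
model = train(data)
```

    Held-out validation data, as in the study, would be needed to confirm that such a fit generalizes rather than memorizes the training points.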

  5. 78 FR 52909 - Consumer Financial Protection Bureau Notice of Availability of Final Environmental Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ...: [email protected] . Mail/Hand Delivery/Courier: Michael Davis, Project Manager, Consumer Financial... application of best management practices (BMP) to minimize short term air quality and noise impact during... of Final Environmental Assessment (FINAL EA) and a Finding of No Significant Impact (FONSI) for...

  6. Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.

    PubMed

    Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S

    2018-05-05

    Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data along a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.

  7. Vodcasts and Active-Learning Exercises in a “Flipped Classroom” Model of a Renal Pharmacotherapy Module

    PubMed Central

    Fox, Jeremy

    2012-01-01

    Objective. To implement a “flipped classroom” model for a renal pharmacotherapy topic module and assess the impact on pharmacy students’ performance and attitudes. Design. Students viewed vodcasts (video podcasts) of lectures prior to the scheduled class and then discussed interactive cases of patients with end-stage renal disease in class. A process-oriented guided inquiry learning (POGIL) activity was developed and implemented that complemented, summarized, and allowed for application of the material contained in the previously viewed lectures. Assessment. Students’ performance on the final examination significantly improved compared to performance of students the previous year who completed the same module in a traditional classroom setting. Students’ opinions of the POGIL activity and the flipped classroom instructional model were mostly positive. Conclusion. Implementing a flipped classroom model to teach a renal pharmacotherapy module resulted in improved student performance and favorable student perceptions about the instructional approach. Some of the factors that may have contributed to students’ improved scores included: student mediated contact with the course material prior to classes, benchmark and formative assessments administered during the module, and the interactive class activities. PMID:23275661

  8. The factorial structure of job-related affective well-being: Polish adaptation of the Warr's measure.

    PubMed

    Mielniczuk, Emilia; Łaguna, Mariola

    2018-02-16

    The first aim of the study reported in this article was to test the factorial structure of job-related affect in a Polish sample. The second aim was to develop the Polish adaptation of Warr's job-related affective well-being measure, published in 1990, which is designed to assess 4 types of affect at work: anxiety, comfort, depression, and enthusiasm. A longitudinal study design with 2 measurement times was used to verify the psychometric properties of the Polish version of the measure. The final sample consisted of 254 Polish employees from different professions. Participants were asked to fill in a set of questionnaires consisting of measures capturing job-related affective well-being, mood, and turnover intention. The first step of the analysis was to test the theoretically based structure of the job-related affective well-being measure in a Polish sample. The confirmatory factor analysis revealed that a 4-factor model best describes the structure of the measure in comparison to 5 alternative models. Next, the reliability of this measure was assessed. All scales achieved good internal consistency and acceptable test-retest reliability after 2 weeks. Finally, the convergent and discriminant validity as well as the criterion and predictive validity of all job-related affective well-being scales were confirmed, based on correlations between job-related affect and mood as well as turnover intention. The results suggest that the Polish adaptation of Warr's job-related affective well-being measure can be used by scientists as well as by practitioners who aim to assess 4 types of affective well-being in a work context. This work is available under the Open Access model and licensed under a CC BY-NC 3.0 PL license.

  9. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.

  10. The delta cooperative model: a dynamic and innovative team-work activity to develop research skills in microbiology.

    PubMed

    Rios-Velazquez, Carlos; Robles-Suarez, Reynaldo; Gonzalez-Negron, Alberto J; Baez-Santos, Ivan

    2006-05-01

    The Delta Cooperative Model (DCM) is a dynamic and innovative teamwork design created to develop fundamentals in research skills. High school students in the DCM belong to the Upward Bound Science and Math (UBSM) program at the Inter American University, Ponce Campus. After workshops on using the scientific method, students were organized into groups of three students with similar research interests. Each student had to take on a role within the group as either a researcher, data analyst, or research editor. Initially, each research team developed hypothesis-driven ideas on their proposed project. In intrateam research meetings, they emphasized team-specific tasks. Next, interteam meetings were held to present ideas and receive critical input. Finally, oral and poster research presentations were conducted at the UBSM science fair. Several team research projects covered topics in medical, environmental, and general microbiology. The three major assessment areas for the workshop and DCM included: (i) students' perception of the workshops' effectiveness in developing skills, content, and values; (ii) research team self- and group participation evaluation; and (iii) oral and poster presentation during the science fair. More than 91% of the students considered the workshops effective in the presentation of scientific method fundamentals. The combination of the workshop and the DCM increased students' knowledge by 55% from pre- to posttests. Two rubrics were designed to assess the oral presentation and poster set-up. The poster and oral presentation scores averaged 83% and 75%, respectively. Finally, we present a team assessment instrument that allows the self- and group evaluation of each research team. While the DCM has educational plasticity and versatility, here we document how this model has been successfully incorporated in training and engaging students in scientific research in microbiology.

  11. An Exposure Assessment of Polybrominated Diphenyl Ethers (Pbde) (Final)

    EPA Science Inventory

    EPA announced the availability of the final report, An Exposure Assessment of Polybrominated Diphenyl Ethers. This report provides a comprehensive assessment of the exposure of Americans to this class of persistent organic pollutants. Individual chapters in this document ...

  12. ORAM-SENTINEL{trademark} demonstration at Fitzpatrick. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, L.K.; Anderson, V.M.; Mohammadi, K.

    1998-06-01

    New York Power Authority, in cooperation with EPRI, installed the ORAM-SENTINEL{trademark} software at James A. Fitzpatrick (JAF) Nuclear Power Plant. This software incorporates models of safety systems and support systems that are used for defense-in-depth in the plant during outage and on-line periods. A secondary goal was to include some pre-analyzed risk results to validate the methodology for quantitative assessment of the plant risks during proposed on-line maintenance. During the past year, New York Power Authority personnel have become familiar with the formal computerized Safety Assessment process associated with on-line and outage maintenance. The report describes techniques and lessons learned during development of the ORAM-SENTINEL model at JAF. It overviews the systems important to the Safety Function Assessment Process and provides details on development of the Plant Transient Assessment process using the station emergency operating procedures. The assessment results are displayed by color (green, yellow, orange, red) to show decreasing safety conditions. The report describes use of the JAF Probabilistic Safety Assessment within the ORAM-SENTINEL code to calculate an instantaneous core damage frequency and the criteria by which this frequency is translated to a color indicator.

  13. Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana

    2014-05-01

    In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The identification of relevant vulnerability indicators allowed for the identification of the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility to critical infrastructures was assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining hazard and vulnerability analysis.

  14. Hysteresis in the Central African Rainforest

    NASA Astrophysics Data System (ADS)

    Pietsch, Stephan Alexander; Elias Bednar, Johannes; Gautam, Sishir; Petritsch, Richard; Schier, Franziska; Stanzl, Patrick

    2014-05-01

    Past climate change caused severe disturbances of the Central African rainforest belt, with forest fragmentation and re-expansion due to drier and wetter climate conditions. Besides climate, human-induced forest degradation affected biodiversity, structure, and carbon storage of Congo basin rainforests. Information on climatically stable, mature rainforest, unaffected by human-induced disturbances, provides a means of assessing the impact of forest degradation and may serve as a benchmark of carbon carrying capacity over regions with similar site and climate conditions. BioGeoChemical (BGC) ecosystem models explicitly consider the impacts of site and climate conditions and may assess benchmark levels over regions devoid of undisturbed conditions. We will present a BGC-model validation for the Western Congolian Lowland Rainforest (WCLRF) using field data from a recently confirmed forest refuge, show model-data comparisons for disturbed and undisturbed forests under different site and climate conditions as well as for sites with repeated assessment of biodiversity and standing biomass during recovery from intensive exploitation. We will present climatic thresholds for WCLRF stability, analyse the relationship between resilience, standing C-stocks and change in climate, and finally provide evidence of hysteresis.

  15. Cost-Utility Analysis: Current Methodological Issues and Future Perspectives

    PubMed Central

    Nuijten, Mark J. C.; Dubois, Dominique J.

    2011-01-01

    The use of cost–effectiveness as a final criterion in the reimbursement process for listing new pharmaceuticals can be questioned from both a scientific and a policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127

  16. Assessing uncertainty in SRTM elevations for global flood modelling

    NASA Astrophysics Data System (ADS)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. An assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computing power but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
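    The perturbation step described in this abstract can be sketched as follows. This is an illustrative reconstruction of the general idea, not the authors' code: the exponential covariance form, the standard deviation and correlation length, and the flat placeholder "tile" are all assumptions, not the fitted values or data from the study.

```python
import numpy as np

# Illustrative sketch: draw spatially correlated vertical errors from an
# assumed exponential covariance function and add them to a DEM to build a
# catalogue of plausible terrains. All parameter values are assumptions.

rng = np.random.default_rng(0)
n = 12                       # small n-by-n grid for illustration
cell = 90.0                  # SRTM cell size (m)
sigma, length = 2.0, 300.0   # error std dev (m) and correlation length (m)

# Pairwise distances between grid-cell centres
ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
coords = np.column_stack([ii.ravel(), jj.ravel()]) * cell
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# Exponential covariance; the jitter keeps the Cholesky factorisation stable
cov = sigma**2 * np.exp(-d / length) + 1e-8 * np.eye(n * n)
L = np.linalg.cholesky(cov)

dem = np.zeros((n, n))                        # stand-in for an SRTM tile
error = (L @ rng.standard_normal(n * n)).reshape(n, n)
plausible_dem = dem + error                   # one member of the catalogue
```

    Repeating the last two lines with fresh random draws yields the kind of DEM catalogue the abstract describes; for full tiles, direct Cholesky factorisation becomes expensive and spectral or nearest-neighbour approximations are typically used instead.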

  17. Bridging the gap between life cycle inventory and impact assessment for toxicological assessments of pesticides used in crop production.

    PubMed

    van Zelm, Rosalie; Larrey-Lassalle, Pyrène; Roux, Philippe

    2014-04-01

    In Life Cycle Assessment (LCA), the Life Cycle Inventory (LCI) provides emission data for the various environmental compartments, and Life Cycle Impact Assessment (LCIA) determines the final distribution, fate and effects. Due to the overlap between the Technosphere (anthropogenic system) and Ecosphere (environment) in agricultural case studies, it is, however, complicated to establish what LCI needs to capture and where LCIA takes over. This paper aims to provide guidance on and improvements of LCI/LCIA boundary definitions in the dimensions of space and time. For this, a literature review was conducted to provide a clear overview of available methods and models for both LCI and LCIA regarding toxicological assessments of pesticides used in crop production. Guidelines are provided to overcome the gaps between LCI and LCIA modeling and prevent overlaps in their respective operational spheres. The proposed framework provides a starting point for LCA practitioners to gather the right data and use the proper models to include all relevant emission and exposure routes where possible. It also enables a clear distinction between efficient and inefficient management practices (e.g. using different application rates, washing and rinsing management, etc.). By applying this framework for toxicological assessments of pesticides, LCI and LCIA can be directly linked, removing any overlaps or gaps between the two distinct LCA steps. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Factors underlying the psychological and behavioral characteristics of Office of Strategic Services candidates: the assessment of men data revisited.

    PubMed

    Lenzenweger, Mark F

    2015-01-01

    During World War II, the Office of Strategic Services (OSS), the forerunner of the Central Intelligence Agency, sought the assistance of clinical psychologists and psychiatrists to establish an assessment program for evaluating candidates for the OSS. The assessment team developed a novel and rigorous program to evaluate OSS candidates. It is described in Assessment of Men: Selection of Personnel for the Office of Strategic Services (OSS Assessment Staff, 1948). This study examines the sole remaining multivariate data matrix that includes all final ratings for a group of candidates (n = 133) assessed near the end of the assessment program. It applies the modern statistical methods of both exploratory and confirmatory factor analysis to this rich and unique data set. An exploratory factor analysis solution suggested that 3 factors underlie the OSS assessment staff ratings. Confirmatory factor analyses of multiple plausible substantive models reveal that a 3-factor model provides the best fit to these data. The 3 factors are emotional/interpersonal factors (social relations, emotional stability, security), intelligence processing (effective IQ, propaganda skills, observing and reporting), and agency/surgency (motivation, energy and initiative, leadership, physical ability). These factors are discussed in terms of their potential utility for personnel selection within the intelligence community.
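    The exploratory step described in this abstract can be illustrated with simulated data. This sketch is not the study's analysis: the loadings and noise level are invented, and only the sample size and the three-factor structure echo the record; it simply shows how a common retention heuristic recovers a known factor count.

```python
import numpy as np

# Illustrative sketch: simulate ratings driven by three latent factors and
# inspect the eigenvalues of the correlation matrix (the common
# "eigenvalue > 1" retention heuristic). Loadings and noise are invented.

rng = np.random.default_rng(42)
n = 133                                    # candidates, as in the data set
latent = rng.standard_normal((n, 3))       # emotional, intelligence, agency

loadings = np.zeros((9, 3))                # 9 observed ratings, 3 factors
loadings[0:3, 0] = 0.8                     # e.g. social relations, stability
loadings[3:6, 1] = 0.8                     # e.g. effective IQ, reporting
loadings[6:9, 2] = 0.8                     # e.g. motivation, leadership

ratings = latent @ loadings.T + 0.5 * rng.standard_normal((n, 9))
corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int((eigvals > 1.0).sum())     # recovers the 3-factor structure
```

    A confirmatory analysis of the kind the study reports would then fix this loading pattern and test its fit against competing models, which requires a structural equation modeling package rather than raw eigenvalues.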

  19. Therapeutic risk management of the suicidal patient: augmenting clinical suicide risk assessment with structured instruments.

    PubMed

    Homaifar, Beeta; Matarazzo, Bridget; Wortzel, Hal S

    2013-09-01

    This column is the second in a series presenting a model for therapeutic risk management of the suicidal patient. As discussed in the first part of the series, the model involves several elements including augmenting clinical risk assessment with structured instruments, stratifying risk in terms of both severity and temporality, and developing and documenting a safety plan. This column explores in more detail how to augment clinical risk assessment with structured instruments. Unstructured clinical interviews have the potential to miss important aspects of suicide risk assessment. By augmenting the free-form clinical interview with structured instruments that demonstrate reliability and validity, a more nuanced and multifaceted approach to suicide risk assessment is achieved. Incorporating structured instruments into practice also serves a medicolegal function, since these instruments may become a living part of the medical record, establishing baseline levels of suicidal thoughts and behaviors and facilitating future clinical determinations regarding safety needs. We describe several instruments used in a multidisciplinary suicide consultation service, each of which has demonstrated relevance to suicide risk assessment and screening, ease of administration, and strong psychometric properties. In addition, we emphasize the importance of viewing suicide risk assessment as an ongoing process rather than as a singular event. Finally, we discuss special considerations in the evolving practice of risk assessment.

  20. 75 FR 79412 - Final Revised Assessment of Annual Needs for the List I Chemicals Ephedrine, Pseudoephedrine, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... Assessment of Annual Needs for the List I Chemicals Ephedrine, Pseudoephedrine, and Phenylpropanolamine for... Annual Needs for 2010. SUMMARY: This notice establishes the Final Revised 2010 Assessment of Annual Needs...). The 2010 Assessment of Annual Needs represents those quantities of ephedrine, pseudoephedrine, and...

  1. The assessment of personality disorders: implications for cognitive and behavior therapy.

    PubMed

    Van Velzen, C J; Emmelkamp, P M

    1996-08-01

    This article reviews the comorbidity of personality disorders (PDs) and Axis I disorders and discusses implications for assessment and treatment. Pros and cons of various assessment methods are discussed. The co-occurrence of PDs with Axis I disorders is considerable; roughly half of patients with anxiety disorders, depressive disorders or eating disorders received a PD diagnosis. Comorbidity models are discussed and implications for assessment and treatment are provided. Regarding the impact of PDs on cognitive-behavioral treatment outcome for Axis I disorders, conflicting results are found due to differences in assessment methods, treatment strategies, and patient samples. It is argued that additional Axis I pathology should be taken into account when studying the impact of PDs on treatment outcome for the target Axis I disorders. Finally, it is argued that the interpersonal behavior of the PD patient and the therapeutic relationship deserve more attention in the assessment and treatment of patients with PDs.

  2. IRIS Toxicological Review of Benzo[a]pyrene (Final Report) ...

    EPA Pesticide Factsheets

    EPA has finalized the Integrated Risk Information System (IRIS) assessment of benzo[a]pyrene. This assessment addresses the potential cancer and noncancer human health effects from long-term exposure to benzo[a]pyrene. Now final, this assessment will update the toxicological information on benzo[a]pyrene posted in 1987. EPA’s program and regional offices may use this assessment to inform decisions to protect human health. EPA is undertaking an update of the Integrated Risk Information System (IRIS) health assessment for benzo[a]pyrene (BaP). The outcome of this project is an updated Toxicological Review and IRIS Summary for BaP that will be entered into the IRIS database.

  3. Contribution of Submarine Groundwater on the Water-Food Nexus in Coastal Ecosystems: Effects on Biodiversity and Fishery Production

    NASA Astrophysics Data System (ADS)

    Shoji, J.; Sugimoto, R.; Honda, H.; Tominaga, O.; Taniguchi, M.

    2014-12-01

    In the past decade, machine-learning methods for empirical rainfall-runoff modeling have seen extensive development. However, the majority of research has focused on a small number of methods, such as artificial neural networks (ANNs), while not considering other approaches for non-parametric regression that have been developed in recent years. These methods may be able to achieve comparable predictive accuracy to ANNs and more easily provide physical insights into the system of interest through evaluation of covariate influence. Additionally, these methods could provide a straightforward, computationally efficient way of evaluating climate change impacts in basins where data to support physical hydrologic models are limited. In this paper, we use multiple regression and machine-learning approaches to predict monthly streamflow in five highly seasonal rivers in the highlands of Ethiopia. We find that generalized additive models, random forests, and cubist models achieve better predictive accuracy than ANNs in many of the basins assessed and are also able to outperform physical models developed for the same region. We discuss some challenges that could hinder the use of such models for climate impact assessment, such as biases resulting from model formulation and prediction under extreme climate conditions, and suggest methods for preventing and addressing these challenges. Finally, we demonstrate how predictor variable influence can be assessed to provide insights into the physical functioning of data-sparse watersheds.
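    The covariate-influence idea the abstract highlights can be sketched with one of the named methods. This is an invented toy, not the paper's data or models: the data-generating rule, coefficients, and variable set are assumptions chosen so that the expected importance ranking is known in advance.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative sketch: fit a random forest to synthetic monthly
# rainfall-runoff data and read off covariate influence from feature
# importances. The data-generating rule below is invented for demonstration.

rng = np.random.default_rng(1)
n = 240                                    # 20 years of monthly records
rain = rng.gamma(2.0, 60.0, n)             # monthly rainfall (mm)
month = np.tile(np.arange(12), 20)         # seasonality index
temp = 20 + 5 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 1, n)

# Streamflow depends strongly on rainfall and only weakly on temperature here
flow = 0.6 * rain + 2.0 * np.maximum(0.0, 25.0 - temp) + rng.normal(0, 5, n)

X = np.column_stack([rain, temp, month])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, flow)
importance = dict(zip(["rain", "temp", "month"], model.feature_importances_))
# Under this generating rule, rainfall dominates the importance ranking
```

    Generalized additive and cubist models would need additional packages, so only the random forest is sketched; partial-dependence plots offer a complementary view of the same covariate influence.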

  4. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  5. Evaluation of a training manual for the acquisition of behavioral assessment interviewing skills.

    PubMed Central

    Miltenberger, R G; Fuqua, R W

    1985-01-01

    Two procedures were used to teach behavioral assessment interviewing skills: a training manual and one-to-one instruction that included modeling, rehearsal, and feedback. Two graduate students and two advanced undergraduates were trained with each procedure. Interviewing skills were recorded in simulated assessment interviews conducted by each student across baseline and treatment conditions. Each training procedure was evaluated in a multiple baseline across students design. The results showed that both procedures were effective for training behavioral interviewing skills, with all students reaching a level of 90%-100% correct responding. Finally, a group of experts in behavior analysis rated each interviewing skill as relevant to the conduct of an assessment interview and a group of behavioral clinicians socially validated the outcomes of the two procedures. PMID:4086413

  6. Multi-Functional Sandwich Composites for Spacecraft Applications: An Initial Assessment

    NASA Technical Reports Server (NTRS)

    Adams, Daniel O.; Webb, Nicholas Jason; Yarger, Cody B.; Hunter, Abigail; Oborn, Kelli D.

    2007-01-01

    Current spacecraft implement relatively uncoupled material and structural systems to address a variety of design requirements, including structural integrity, damage tolerance, radiation protection, debris shielding and thermal insulation. This investigation provided an initial assessment of multi-functional sandwich composites to integrate these diverse requirements. The need for radiation shielding was addressed through the selection of polymeric constituents with high hydrogen content. To provide increased damage tolerance and debris shielding, manufacturing techniques were developed to incorporate transverse stitching reinforcement, internal layers, and a self-healing ionomer membrane. To assess the effects of a space environment, thermal expansion behavior of the candidate foam materials was investigated under a vacuum and increasing temperature. Finally, a thermal expansion model was developed for foam under vacuum conditions and its predictive capability assessed.

  7. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than to the grid resolution. Thus, uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties in tsunami hazard level and fatality, in terms of depth, velocity and momentum flux, can be captured and visualized through the complex source modelling approach. From a tsunami risk management perspective, this creates a rich body of data that is useful for making effective and robust decisions.

  8. Quality assessment of a new surgical simulator for neuroendoscopic training.

    PubMed

    Filho, Francisco Vaz Guimarães; Coelho, Giselle; Cavalheiro, Sergio; Lyra, Marcos; Zymberg, Samuel T

    2011-04-01

    Ideal surgical training models should be entirely reliable, non-toxic, easy to handle, and, if possible, low cost. All available models have their advantages and disadvantages. The choice of one or another will depend on the type of surgery to be performed. The authors created an anatomical model called the S.I.M.O.N.T. (Sinus Model Oto-Rhino Neuro Trainer) Neurosurgical Endotrainer, which can provide reliable neuroendoscopic training. The aim of the present study was to assess both the quality of the model and the development of surgical skills by trainees. The S.I.M.O.N.T. is built of a synthetic thermoretractable, thermosensitive rubber called Neoderma, which, combined with different polymers, produces more than 30 different formulas. Quality assessment of the model was based on qualitative and quantitative data obtained from training sessions with 9 experienced and 13 inexperienced neurosurgeons. The techniques used for evaluation were face validation, retest and interrater reliability, and construct validation. The experts considered the S.I.M.O.N.T. capable of reproducing surgical situations as if they were real and of presenting great similarity to the human brain. Surgical results of serial training showed that the model could be considered precise. Finally, development and improvement of surgical skills by the trainees were observed and considered relevant to further training. It was also observed that the probability of any single error was dramatically decreased after each training session, with a mean reduction of 41.65% (range 38.7%-45.6%). Neuroendoscopic training has some specific requirements. A unique set of instruments is required, as is a model that can resemble real-life situations. The S.I.M.O.N.T. is a new alternative model specially designed for this purpose. Validation techniques followed by precision assessments attested to the model's feasibility.

  9. An AgMIP framework for improved agricultural representation in integrated assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruane, Alex C.; Rosenzweig, Cynthia; Asseng, Senthold

    Integrated assessment models (IAMs) hold great potential to assess how future agricultural systems will be shaped by socioeconomic development, technological innovation, and changing climate conditions. By coupling with climate and crop model emulators, IAMs have the potential to resolve important agricultural feedback loops and identify unintended consequences of socioeconomic development for agricultural systems. Here we propose a framework to develop robust representation of agricultural system responses within IAMs, linking downstream applications with model development and the coordinated evaluation of key climate responses from local to global scales. We survey the strengths and weaknesses of protocol-based assessments linked to the Agricultural Model Intercomparison and Improvement Project (AgMIP), each utilizing multiple sites and models to evaluate crop response to core climate changes including shifts in carbon dioxide concentration, temperature, and water availability, with some studies further exploring how climate responses are affected by nitrogen levels and adaptation in farm systems. Site-based studies with carefully calibrated models encompass the largest number of activities; however they are limited in their ability to capture the full range of global agricultural system diversity. Representative site networks provide more targeted response information than broadly-sampled networks, with limitations stemming from difficulties in covering the diversity of farming systems. Global gridded crop models provide comprehensive coverage, although with large challenges for calibration and quality control of inputs. Diversity in climate responses underscores that crop model emulators must distinguish between regions and farming systems while recognizing model uncertainty. Finally, to bridge the gap between bottom-up and top-down approaches we recommend the deployment of a hybrid climate response system employing a representative network of sites to bias-correct comprehensive gridded simulations, opening the door to accelerated development and a broad range of applications.

  10. Combined approaches to flexible fitting and assessment in virus capsids undergoing conformational change☆

    PubMed Central

    Pandurangan, Arun Prasad; Shakeel, Shabih; Butcher, Sarah Jane; Topf, Maya

    2014-01-01

    Fitting of atomic components into electron cryo-microscopy (cryoEM) density maps is routinely used to understand the structure and function of macromolecular machines. Many fitting methods have been developed, but a standard protocol for successful fitting and assessment of fitted models has yet to be agreed upon among the experts in the field. Here, we created and tested a protocol that highlights important issues related to homology modelling, density map segmentation, rigid and flexible fitting, as well as the assessment of fits. As part of it, we use two different flexible fitting methods (Flex-EM and iMODfit) and demonstrate how combining the analysis of multiple fits and model assessment could result in an improved model. The protocol is applied to the case of the mature and empty capsids of Coxsackievirus A7 (CAV7) by flexibly fitting homology models into the corresponding cryoEM density maps at 8.2 and 6.1 Å resolution. As a result, and due to the improved homology models (derived from recently solved crystal structures of a close homolog – EV71 capsid – in mature and empty forms), the final models present an improvement over previously published models. In close agreement with the capsid expansion observed in the EV71 structures, the new CAV7 models reveal that the expansion is accompanied by ∼5° counterclockwise rotation of the asymmetric unit, predominantly contributed by the capsid protein VP1. The protocol could be applied not only to viral capsids but also to many other complexes characterised by a combination of atomic structure modelling and cryoEM density fitting. PMID:24333899

  11. An AgMIP framework for improved agricultural representation in integrated assessment models

    NASA Astrophysics Data System (ADS)

    Ruane, Alex C.; Rosenzweig, Cynthia; Asseng, Senthold; Boote, Kenneth J.; Elliott, Joshua; Ewert, Frank; Jones, James W.; Martre, Pierre; McDermid, Sonali P.; Müller, Christoph; Snyder, Abigail; Thorburn, Peter J.

    2017-12-01

    Integrated assessment models (IAMs) hold great potential to assess how future agricultural systems will be shaped by socioeconomic development, technological innovation, and changing climate conditions. By coupling with climate and crop model emulators, IAMs have the potential to resolve important agricultural feedback loops and identify unintended consequences of socioeconomic development for agricultural systems. Here we propose a framework to develop robust representation of agricultural system responses within IAMs, linking downstream applications with model development and the coordinated evaluation of key climate responses from local to global scales. We survey the strengths and weaknesses of protocol-based assessments linked to the Agricultural Model Intercomparison and Improvement Project (AgMIP), each utilizing multiple sites and models to evaluate crop response to core climate changes including shifts in carbon dioxide concentration, temperature, and water availability, with some studies further exploring how climate responses are affected by nitrogen levels and adaptation in farm systems. Site-based studies with carefully calibrated models encompass the largest number of activities; however they are limited in their ability to capture the full range of global agricultural system diversity. Representative site networks provide more targeted response information than broadly-sampled networks, with limitations stemming from difficulties in covering the diversity of farming systems. Global gridded crop models provide comprehensive coverage, although with large challenges for calibration and quality control of inputs. Diversity in climate responses underscores that crop model emulators must distinguish between regions and farming systems while recognizing model uncertainty. Finally, to bridge the gap between bottom-up and top-down approaches we recommend the deployment of a hybrid climate response system employing a representative network of sites to bias-correct comprehensive gridded simulations, opening the door to accelerated development and a broad range of applications.

  12. Phthalates and Cumulative Risk Assessment (NAS Final Report)

    EPA Science Inventory

    On December 18, 2008, the National Academy of Sciences' National Research Council released a final report, requested and sponsored by the EPA, entitled Phthalates and Cumulative Risk Assessment: The Task Ahead.

    Risk assessment has become a dominant public policy ...

  13. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg

    2009-06-01

    This report serves as the final technical report and user's manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool with which users can technically screen, assess the storage capacity of, and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.
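    The capacity-screening step such a tool automates for saline reservoirs is often a first-order volumetric estimate, M = A * h * phi * rho_CO2 * E. The sketch below uses that standard formula; the CO2 density and storage-efficiency factor are assumed round numbers, not the Screening Tool's actual defaults or methods.

```python
# Illustrative first-order volumetric screen for saline reservoirs:
# M = A * h * phi * rho_CO2 * E. The density and efficiency defaults are
# assumed round numbers, not the Screening Tool's actual parameters.

def saline_storage_capacity_mt(area_km2, thickness_m, porosity,
                               co2_density_kg_m3=700.0, efficiency=0.02):
    """Storable CO2 mass in megatonnes from a simple volumetric estimate."""
    pore_volume_m3 = area_km2 * 1e6 * thickness_m * porosity
    return pore_volume_m3 * co2_density_kg_m3 * efficiency / 1e9  # kg -> Mt

# Example: a 100 km2 reservoir, 50 m thick, 20% porosity
capacity_mt = saline_storage_capacity_mt(100.0, 50.0, 0.20)
```

    A screen like this only bounds capacity; the simulation-based assessment the report describes is what resolves injection timing and enhanced-recovery potential.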

  14. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  15. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  16. Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses

    NASA Astrophysics Data System (ADS)

    Whelan, G.

    2002-05-01

    Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases so that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform with an open-architecture, object-oriented design that interacts with environmental databases; helps the user construct a real-world-based Conceptual Site Model; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system that contains models developed by others. This paper presents the use of FRAMES to evaluate potential human health exposures, using real site data and realistic assumptions, from sources through the vadose and saturated zones to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis.
    A predictive analysis is one in which models are calibrated to monitored site data prior to the assessment; a comparative analysis is one in which models are not calibrated but are based solely on literature values or judgement, and is usually used to compare alternatives. In many cases, a combination is employed in which the model is calibrated to a portion of the data (e.g., to determine hydrodynamics) and then used to compare alternatives. Three subsurface-based multimedia examples are presented, increasing in complexity. The first presents a predictive, deterministic assessment; the second a predictive and comparative Monte Carlo analysis; and the third a comparative, multi-dimensional Monte Carlo analysis. Endpoints are typically presented in terms of concentration, hazard, risk, and dose, and because the vadose zone model typically represents a connection between a source and the aquifer, it does not generally represent the final medium in a multimedia risk assessment.
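The Monte Carlo treatment described in this record can be sketched in miniature: three linked models (source release, vadose-zone transport, exposure/risk) are chained, and uncertain parameters are sampled on each realization. All functions and parameter ranges below are hypothetical stand-ins, not values from FRAMES or MEPAS:

```python
import random

def release(rate):
    """Hypothetical source-term model: contaminant release (mass per year)."""
    return rate

def transport(mass, loss_frac):
    """Hypothetical vadose-zone transport: a fraction of the mass is attenuated."""
    return mass * (1.0 - loss_frac)

def exposure_risk(conc, slope):
    """Hypothetical linear exposure/risk model."""
    return conc * slope

def monte_carlo(n, seed=0):
    """Sample uncertain parameters and push each realization through the chain."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        rate = rng.uniform(1.0, 2.0)      # release rate (assumed range)
        loss = rng.uniform(0.2, 0.8)      # transport attenuation (assumed range)
        slope = rng.uniform(1e-4, 5e-4)   # risk slope factor (assumed range)
        risks.append(exposure_risk(transport(release(rate), loss), slope))
    risks.sort()
    # Report the median and 95th-percentile risk across realizations
    return risks[len(risks) // 2], risks[int(0.95 * len(risks))]

median_risk, p95_risk = monte_carlo(10000)
```

Reporting percentiles of the output distribution, rather than a single deterministic value, is what distinguishes the Monte Carlo assessments from the deterministic example in this record.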

  17. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very active seismic region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas State. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure to consider different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from International Seismological Center and Servicio Sismológico Nacional de México sources. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps. The SSZ were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised. For each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone, with intermediate hazard values in the Chiapas Batholith Zone and the Strike-Slip Faults Province.
    The hazard decreases towards the northeast across the Reverse Faults Province and up to the Yucatan Platform, where the lowest values are reached. We also produced uniform hazard spectra (UHS) for the three main cities of Chiapas. Tapachula presents the highest spectral accelerations, while Tuxtla Gutierrez and San Cristobal de las Casas show similar values. We conclude that seismic hazard in Chiapas is chiefly controlled by the subduction of the Cocos plate beneath the North America and Caribbean plates, which makes the coastal areas the most hazardous. Additionally, the Motagua and Polochic Fault Zones are also important, increasing the hazard particularly in southeastern Chiapas.
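The final averaging step of a logic-tree PSHA of the kind described here reduces, for a given site and return period, to a weighted mean over branches (two SSZ models × four GMPE combinations = eight branches). The branch weights and PGA values below are invented for illustration; a tool like CRISIS2007 computes full hazard curves rather than single PGA numbers:

```python
# Hypothetical logic-tree branches: (weight, PGA in g) for
# 2 seismogenic source models x 4 GMPE combinations = 8 equally weighted branches.
branches = [
    (0.125, 0.31), (0.125, 0.28), (0.125, 0.35), (0.125, 0.30),  # SSZ model 1
    (0.125, 0.27), (0.125, 0.33), (0.125, 0.29), (0.125, 0.32),  # SSZ model 2
]

def mean_hazard(branches):
    """Weighted mean ground motion over all logic-tree branches."""
    total_weight = sum(w for w, _ in branches)
    assert abs(total_weight - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(w * pga for w, pga in branches)

pga_mean = mean_hazard(branches)  # weighted mean PGA over the eight branches
```

The mean over branches is exactly what the record's "final hazard maps represent the mean values" sentence refers to; unequal weights would encode differing confidence in the source models or GMPEs.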

  18. Occlusal factors are not related to self-reported bruxism.

    PubMed

    Manfredini, Daniele; Visscher, Corine M; Guarda-Nardini, Luca; Lobbezoo, Frank

    2012-01-01

    To estimate the contribution of various occlusal features of the natural dentition that may identify self-reported bruxers compared to nonbruxers. Two age- and sex-matched groups of self-reported bruxers (n = 67) and self-reported nonbruxers (n = 75) took part in the study. For each patient, the following occlusal features were clinically assessed: retruded contact position (RCP) to intercuspal contact position (ICP) slide length (< 2 mm was considered normal), vertical overlap (< 0 mm was considered an anterior open bite; > 4 mm, a deep bite), horizontal overlap (> 4 mm was considered a large horizontal overlap), incisor dental midline discrepancy (< 2 mm was considered normal), and the presence of a unilateral posterior crossbite, mediotrusive interferences, and laterotrusive interferences. A multiple logistic regression model was used to identify the significant associations between the assessed occlusal features (independent variables) and self-reported bruxism (dependent variable). Accuracy values to predict self-reported bruxism were unacceptable for all occlusal variables. The only variable remaining in the final regression model was laterotrusive interferences (P = .030). The percentage of explained variance for bruxism by the final multiple regression model was 4.6%. This model including only one occlusal factor showed low positive (58.1%) and negative predictive values (59.7%), thus showing a poor accuracy to predict the presence of self-reported bruxism (59.2%). This investigation suggested that the contribution of occlusion to the differentiation between bruxers and nonbruxers is negligible. This finding supports theories that advocate a much diminished role for peripheral anatomical-structural factors in the pathogenesis of bruxism.
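The accuracy figures quoted above (PPV 58.1%, NPV 59.7%, overall accuracy 59.2%) are standard functions of the 2x2 classification table. A minimal sketch, using hypothetical counts rather than the study's actual data:

```python
def predictive_values(tp, fp, tn, fn):
    """PPV, NPV and accuracy from a 2x2 table of true/false positives/negatives."""
    ppv = tp / (tp + fp)                       # P(bruxer | model says bruxer)
    npv = tn / (tn + fn)                       # P(nonbruxer | model says nonbruxer)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return ppv, npv, accuracy

# Hypothetical counts for a weak single-predictor classifier (not the study's data)
ppv, npv, accuracy = predictive_values(tp=40, fp=27, tn=45, fn=30)
```

Values near 0.5-0.6, as reported in the abstract, indicate a classifier barely better than chance, which is the basis for the authors' conclusion that occlusal factors contribute little.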

  19. The rate of equilibration of viscous aerosol particles

    NASA Astrophysics Data System (ADS)

    O'Meara, Simon; Topping, David O.; McFiggans, Gordon

    2016-04-01

    The proximity of atmospheric aerosol particles to equilibrium with their surrounding condensable vapours can substantially impact their transformations, fate and impacts and is the subject of vibrant research activity. In this study we first compare equilibration timescales estimated by three different models for diffusion through aerosol particles to assess any sensitivity to choice of model framework. Equilibration times for diffusion coefficients with varying dependencies on composition are compared for the first time. We show that even under large changes in the saturation ratio of a semi-volatile component (es) of 1-90 % predicted equilibration timescales are in agreement, including when diffusion coefficients vary with composition. For condensing water and a diffusion coefficient dependent on composition, a plasticising effect is observed, leading to a decreased estimated equilibration time with increasing final es. Above 60 % final es maximum equilibration times of around 1 s are estimated for comparatively large particles (10 µm) containing a relatively low diffusivity component (1 × 10-25 m2 s-1 in pure form). This, as well as other results here, questions whether particle-phase diffusion through water-soluble particles can limit hygroscopic growth in the ambient atmosphere. In the second part of this study, we explore sensitivities associated with the use of particle radius measurements to infer diffusion coefficient dependencies on composition using a diffusion model. Given quantified similarities between models used in this study, our results confirm considerations that must be taken into account when designing such experiments. Although quantitative agreement of equilibration timescales between models is found, further work is necessary to determine their suitability for assessing atmospheric impacts, such as their inclusion in polydisperse aerosol simulations.
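Equilibration timescales of the kind compared above are often approximated by the characteristic diffusion time for a sphere, tau = r^2 / (pi^2 D). The sketch below takes the abstract's 10 µm particle size as a radius and its 1 × 10⁻²⁵ m² s⁻¹ pure-component diffusivity, plus an assumed water-plasticised diffusivity, to illustrate how strongly plasticisation shortens the timescale:

```python
import math

def equilibration_timescale(radius_m, diffusivity_m2s):
    """Characteristic diffusion time for a sphere: tau = r^2 / (pi^2 * D)."""
    return radius_m ** 2 / (math.pi ** 2 * diffusivity_m2s)

r = 10e-6                                     # 10 micrometres, taken as the radius
tau_dry = equilibration_timescale(r, 1e-25)   # low-diffusivity pure component
tau_wet = equilibration_timescale(r, 1e-10)   # assumed water-plasticised diffusivity
```

With the pure-component diffusivity the timescale is astronomically long, while the assumed plasticised value brings it down to a fraction of a second, consistent with the ~1 s equilibration times the abstract reports at high final es.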

  20. Using Rasch model to analyze the ability of pre-university students in vector

    NASA Astrophysics Data System (ADS)

    Ibrahim, Faridah Mohamed; Shariff, Asma Ahmad; Tahir, Rohayatimah Muhammad

    2015-10-01

    Evaluating students' performance only from overall examination marks does not give accurate evidence of their achievement on a particular subject. For a more detailed analysis, an instrument called the Rasch Measurement Model (Rasch Model), widely used in education research, may be applied. Using the analysis map, the level of each student's ability and the level of the questions' difficulty can be measured. This paper describes how the Rasch Model is used to evaluate students' achievement and performance in Vector, a subject taken by students enrolled in the Physical Science Program at the Centre for Foundation Studies in Science, University of Malaya. Usually, students' understanding of the subject and performance are assessed and examined at the end of the semester in the final examination, apart from continuous assessment done throughout the course. In order to evaluate individual achievement and obtain better and more accurate evidence of performance, the marks of 28 male and 28 female students were taken randomly from the final examination results and analysed using the Rasch Model. Observation of the map showed that more than half of the questions were categorized as difficult, while the two most difficult questions could be answered correctly by 33.9% of the students. Results showed that the students performed very well and their achievement was above expectation. About 27% of the students could be considered as having very high ability in answering all the questions, with one student obtaining a perfect score. However, two students were found to be misfits since they were able to answer difficult questions but gave poor responses to easy ones.
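The Rasch Model underlying this kind of analysis expresses the probability of a correct response as a logistic function of the difference between person ability theta and item difficulty b. A minimal sketch with made-up ability and difficulty values:

```python
import math

def rasch_probability(theta, b):
    """Rasch model: P(correct) for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

p_equal = rasch_probability(1.0, 1.0)   # ability equals difficulty -> exactly 0.5
p_able = rasch_probability(2.0, 0.0)    # high ability, easy item
p_weak = rasch_probability(-1.0, 2.0)   # low ability, hard item
```

Placing persons and items on this common logit scale is what lets the analysis map compare student ability and question difficulty directly; "misfit" students are those whose response patterns deviate from these model probabilities.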

  1. Facilitating Deployment of Community Solar PV systems on Rooftops and Vacant Land in Northeast IL - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, Deborah; Oakleaf, Laura

    The Cook County Community Solar project set out to unlock the potential of community solar in the Chicago region with lessons that could be applied nationally. One of the first steps was to prove out the potential market. This was done through an opportunity assessment which showed there is over 9,000 megawatts worth of site capacity available for community solar projects in Cook County – nearly enough to offset all of Cook County’s residential electricity use. The assessment also showed that almost 75% of Cook County households are not able to invest directly in solar photovoltaic systems due to a variety of issues, from physical barriers such as shading or condition of the roof, to financial barriers such as lack of roof ownership or the up-front costs of installation. Because of these barriers, community solar is an essential part of making the benefits of renewable energy available to all of the residents of Cook County. In addition to the opportunity assessment, the project team also worked with the over 200 individuals who participated in the stakeholder advisory group to develop a number of other products, including: 1) an Economic & Policy Barriers Resolutions and Work Plan document, which laid out best practices to address the policy barriers that existed at the time (May 2016); 2) Value Proposition Report I and Report II, which summarize the value of community solar to potential developers and subscribers; 3) the Community Solar Business Case Tool, which provides a flexible financial model that projects the costs and benefits to the system developer and subscriber for a project; 4) a Bill Crediting Analysis; and 5) the Final Report. The Final Report contains 15 case studies which demonstrate that community solar projects are economically feasible in Cook County with a variety of sites, solar designs, ownership and subscriber models.

  2. Reciprocal benefit to senior and junior peers: An outcome of a pilot research workshop at medical university.

    PubMed

    Ahsin, Sadia; Abbas, Seyyeda; Zaidi, Noshin; Azad, Nadia; Kaleem, Fatima

    2015-08-01

    A study was planned to explore and evaluate the role of senior peers in the learning process of their juniors during a Research Methodology workshop, and to assess educational advantages for seniors in leading roles. Twenty medical students participated with 15 juniors (1st to 3rd year) and 5 seniors (final/fourth year) divided into 5 groups with one senior student each at Foundation University Medical College, Islamabad, Pakistan. The seniors supervised and engaged the groups to develop research questions, formulate objectives, review literature, outline study designs, develop study tools/questionnaire and finally shape their projects in synopsis. Overall advantages to both juniors and seniors through this peer-assisted learning model were assessed by feedback proformas with open and closed-ended questions. Senior peers' facilitation was effective in the learning process of junior peers. Senior peers also gained academic benefit by exercising their leadership qualities through teaching and maintaining group dynamics.

  3. A computational framework to characterize and compare the geometry of coronary networks.

    PubMed

    Bulant, C A; Blanco, P J; Lima, T P; Assunção, A N; Liberato, G; Parga, J R; Ávila, L F R; Pereira, A C; Feijóo, R A; Lemos, P A

    2017-03-01

    This work presents a computational framework to perform a systematic and comprehensive assessment of the morphometry of coronary arteries from in vivo medical images. The methodology embraces image segmentation, arterial vessel representation, characterization and comparison, data storage, and finally analysis. Validation is performed using a sample of 48 patients. Data mining of morphometric information of several coronary arteries is presented. Results agree with medical reports in terms of basic geometric and anatomical variables. Concerning geometric descriptors, inter-artery and intra-artery correlations are studied. Data reported here can be useful for the construction and setup of blood flow models of the coronary circulation. Finally, as an application example, a similarity criterion to assess vasculature likelihood based on geometric features is presented and used to test geometric similarity among sibling patients. Results indicate that likelihood, measured through geometric descriptors, is stronger between siblings than between non-relative patients. Copyright © 2016 John Wiley & Sons, Ltd.

  4. What Do Final Year Medical Students Understand by the Concept of Recovery? A Descriptive Qualitative Study.

    PubMed

    Newton-Howes, Giles; Beverley, Georgia; Ellis, Pete M; Gordon, Sarah; Levack, William

    2018-06-01

    Traditional teaching in psychiatry does little to address recovery concepts. The aim of this study was to evaluate the incorporation of a recovery-focused teaching program for medical students in psychiatry. Recovery, as understood by medical students who had participated in a recovery-focused teaching program, was assessed by thematic analysis of recovery-focused assessment reflections. Six major themes emerged from the recovery reflections of final year medical students: (1) recovery as a person-centered approach, (2) the need for social integration, (3) non-diagnostic framing of mental illness, (4) tensions between the medical model and personal recovery, (5) a patient's willingness to engage with mental health services, and (6) the development of a positive sense of self. A recovery teaching program was associated with students expressing knowledge of recovery principles and positive attitudes towards people with experience of mental illness. Psychiatric placements for medical students may benefit from a recovery focus.

  5. 78 FR 11632 - Notice of Availability for the Draft Finding of No Significant Impact and Final Programmatic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Significant Impact and Final Programmatic Environmental Assessment for Army 2020 Force Structure Realignment... Significant Impact (FNSI) and final Programmatic Environmental Assessment (PEA) for Army 2020 force structure realignments that may occur from Fiscal Years (FYs) 2013-2020. The Army published the Notice of Availability of...

  6. Developing Land Use Land Cover Maps for the Lower Mekong Basin to Aid SWAT Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Spruce, J.; Bolten, J. D.; Srinivasan, R.

    2017-12-01

    This presentation discusses research to develop Land Use Land Cover (LULC) maps for the Lower Mekong Basin (LMB). Funded by a NASA ROSES Disasters grant, the main objective was to produce updated LULC maps to aid the Mekong River Commission's (MRC's) Soil and Water Assessment Tool (SWAT) hydrologic model. In producing the needed LULC maps, temporally processed MODIS monthly NDVI data for 2010 were used as the primary data source for classifying regionally prominent forest and agricultural types. The MODIS NDVI data were derived by processing MOD09 and MYD09 8-day reflectance data with the Time Series Product Tool, a custom software package. Circa 2010 Landsat multispectral data from the dry season were processed into top-of-atmosphere reflectance mosaics and then classified to derive certain locally common LULC types, such as urban areas and industrial forest plantations. Unsupervised ISODATA clustering was used to derive most LULC classifications. GIS techniques were used to merge MODIS and Landsat classifications into final LULC maps for Sub-Basins (SBs) 1-8 of the LMB. The final LULC maps were produced at 250-meter resolution and delivered to the MRC for use in SWAT modeling for the LMB. A map accuracy assessment was performed for the SB 7 LULC map with 14 classes, comparing random locations for sampled LULC types to geospatial reference data such as Landsat RGBs, MODIS NDVI phenologic profiles, high resolution satellite data from Google Maps/Earth, and other reference data from the MRC (e.g., crop calendars). LULC accuracy assessment results for SB 7 indicated an overall agreement with the reference data of 81% at full scheme specificity; by grouping 3 deciduous forest classes into 1 class, the overall agreement improved to 87%. The project produced updated LULC maps and classified more specific rice types than the previous LULC maps. The LULC maps from this project should improve the use of SWAT for modeling hydrology in the LMB, and should improve water and disaster management in a region vulnerable to flooding, droughts, and anthropogenic change (e.g., from dam building and other LULC change).

  7. Technology Assessment Software Package: Final Report.

    ERIC Educational Resources Information Center

    Hutinger, Patricia L.

    This final report describes the Technology Assessment Software Package (TASP) Project, which produced developmentally appropriate technology assessment software for children from 18 months through 8 years of age who have moderate to severe disabilities that interfere with their interaction with people, objects, tasks, and events in their…

  8. Determination of the Biologically Relevant Sampling Depth for Terrestrial and Aquatic Ecological Risk Assessments (Final Report)

    EPA Science Inventory

    The Ecological Risk Assessment Support Center (ERASC) announced the release of the final report, Determination of the Biologically Relevant Sampling Depth for Terrestrial and Aquatic Ecological Risk Assessments. This technical paper provides defensible approximations fo...

  9. Statistical models for fever forecasting based on advanced body temperature monitoring.

    PubMed

    Jordan, Jorge; Miro-Martinez, Pau; Vargas, Borja; Varela-Entrecanales, Manuel; Cuesta-Frau, David

    2017-02-01

    Body temperature monitoring provides health carers with key clinical information about the physiological status of patients. Temperature readings are taken periodically to detect febrile episodes and consequently implement the appropriate medical countermeasures. However, fever is often difficult to assess at early stages, or remains undetected until the next reading, probably a few hours later. The objective of this article is to develop a statistical model to forecast fever before a temperature threshold is exceeded to improve the therapeutic approach to the subjects involved. To this end, temperature series of 9 patients admitted to a general internal medicine ward were obtained with a continuous monitoring Holter device, collecting measurements of peripheral and core temperature once per minute. These series were used to develop different statistical models that could quantify the probability of having a fever spike in the following 60 minutes. A validation series was collected to assess the accuracy of the models. Finally, the results were compared with the analysis of some series by experienced clinicians. Two different models were developed: a logistic regression model and a linear discrimination analysis model. Both of them exhibited a fever peak forecasting accuracy greater than 84%. When compared with experts' assessment, both models identified 35 (97.2%) of 36 fever spikes. The models proposed are highly accurate in forecasting the appearance of fever spikes within a short period in patients with suspected or confirmed febrile-related illnesses. Copyright © 2016 Elsevier Inc. All rights reserved.
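A logistic fever-forecasting model of the kind described can be sketched as follows; the two predictors (current temperature and its recent slope) and all coefficients are hypothetical illustrations, not the fitted values from the study:

```python
import math

def spike_probability(temp_c, slope_c_per_h, b0=-60.0, b_temp=1.5, b_slope=4.0):
    """Hypothetical logistic model: P(fever spike within the next 60 minutes).

    b0, b_temp and b_slope are invented coefficients; a real model would be
    fitted to minute-by-minute Holter temperature series.
    """
    z = b0 + b_temp * temp_c + b_slope * slope_c_per_h
    return 1.0 / (1.0 + math.exp(-z))

p_rising = spike_probability(37.8, 0.9)   # warm and rising quickly
p_stable = spike_probability(36.6, 0.0)   # afebrile and flat
```

Thresholding the probability (e.g., alerting when it exceeds 0.5) turns the continuous score into the spike/no-spike forecasts whose accuracy the study reports.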

  10. A Study to Design, Develop, Implement, Evaluate, and Revise Specific, Measurable Performance Objectives to Serve as a Model to Individualize Instruction for Secondary Schools. Final Report.

    ERIC Educational Resources Information Center

    Wendt, Marilynn S.; Jacobson, Marjory E.

    Specific objectives of this two-phase study included: (1) assessment of design of the behavioral objectives in all secondary level courses; (2) construction of criteria for evaluation of the design; (3) testing of objectives against the evaluative criteria and/or the specifications of the curriculum design; and (4) determination of the validity of…

  11. The Assessment of a Model for Determining Community-Based Needs of American Indians with Disabilities through Consumer Involvement in Community Planning and Change. Final Report, 1990.

    ERIC Educational Resources Information Center

    Marshall, Catherine A.; And Others

    This research project sought to identify, at a local community level, the needs and concerns of American Indians with disabilities. The selected target community was Denver, Colorado, and its surrounding metropolitan area. Data were collected through 100 face-to-face interviews with a volunteer population of American Indians with disabilities. The…

  12. Evaluation Tools in the European Higher Education Area (EHEA): An Assessment for Evaluating the Competences of the Final Year Project in the Social Sciences

    ERIC Educational Resources Information Center

    Mateo, Joan; Escofet, Anna; Martinez-Olmo, Francesc; Ventura, Javier; Vlachopoulos, Dimitrios

    2012-01-01

    The guidelines of the European Higher Education Area (EHEA) imply the rethinking of many of the current evaluation systems, since the new pedagogical models now focus on the learning acquired through the students' personal work and on the establishment of the ideal conditions for them to achieve the learning outcomes of the proposed educational…

  13. A Survey and Analysis of Military Computer-Based Systems: A Two Part Study. Volume II. A Descriptive and Predictive Model for Evaluating Instructional Systems. Final Report.

    ERIC Educational Resources Information Center

    McDonnell Douglas Astronautics Co. - East, St. Louis, MO.

    This is the second volume of a two volume study. The first volume examined the literature to identify authoring aids for developing instructional materials, and to identify information clearing houses for existing materials. The purpose of this volume was to develop a means for assessing the cost versus expected benefits of innovations in…

  14. The Early Childhood Cluster Initiative of Palm Beach County, Florida. Early Implementation Study And Evaluability Assessment. Final Report. Chapin Hall Working Paper

    ERIC Educational Resources Information Center

    Spielberger, Julie; Goyette, Paul

    2006-01-01

    This publication reports findings from the first year of an implementation study of the Early Childhood Cluster Initiative (ECCI). ECCI is a prekindergarten program in ten elementary schools and a community child care center in Palm Beach County, based on the design of the High/Scope Perry Preschool model. The initiative is characterized by low…

  15. Analysis of the Research and Studies Program at the United States Military Academy

    DTIC Science & Technology

    2004-09-01

    operational assessment methodology, efficiency analysis, recruiting analysis especially marketing effects and capability analysis and modeling. Lieutenant...Finally, and arguably the most compelling rationale is the market force of increased funding. Figure 3 below shows the increase in funding received by...to integrate in a team of analysts from other departments to assist in the effort. First, bringing in analysts from other departments gave those

  16. Evaluation of Regression Models of Balance Calibration Data Using an Empirical Criterion

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; Volden, Thomas R.

    2012-01-01

    An empirical criterion for assessing the significance of individual terms of regression models of wind tunnel strain gage balance outputs is evaluated. The criterion is based on the percent contribution of a regression model term. It considers a term to be significant if its percent contribution exceeds the empirical threshold of 0.05%. The criterion has the advantage that it can easily be computed using the regression coefficients of the gage outputs and the load capacities of the balance. First, a definition of the empirical criterion is provided. Then, it is compared with an alternate statistical criterion that is widely used in regression analysis. Finally, calibration data sets from a variety of balances are used to illustrate the connection between the empirical and the statistical criterion. A review of these results indicated that the empirical criterion seems to be suitable for a crude assessment of the significance of a regression model term as the boundary between a significant and an insignificant term cannot be defined very well. Therefore, regression model term reduction should only be performed by using the more universally applicable statistical criterion.
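One plausible reading of the empirical criterion described above can be sketched as a percent-contribution test: each regression term is evaluated at the load capacity, scaled by the full-scale gage output, and kept if its contribution exceeds the 0.05% threshold. The balance numbers below are invented for illustration and the scaling is a simplified assumption, not the paper's exact formulation:

```python
def percent_contribution(coefficient, term_at_capacity, output_full_scale):
    """Percent contribution of one regression term, evaluated at load capacity."""
    return 100.0 * abs(coefficient * term_at_capacity) / output_full_scale

def is_significant(pc, threshold=0.05):
    """Empirical criterion: keep a term whose contribution exceeds 0.05 %."""
    return pc > threshold

# Hypothetical balance: full-scale output 2.0 mV/V; one linear and one
# cross-product term evaluated at a load capacity of 1500 units.
pc_linear = percent_contribution(1.0e-3, 1500.0, 2.0)
pc_cross = percent_contribution(2.0e-11, 1500.0 ** 2, 2.0)
```

Under these assumed numbers the linear term dominates while the cross-product term falls below the threshold, illustrating why the abstract recommends the statistical criterion for actual term reduction near the boundary.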

  17. What are the effects of Agro-Ecological Zones and land use region boundaries on land resource projection using the Global Change Assessment Model?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Vittorio, Alan V.; Kyle, Page; Collins, William D.

    Understanding potential impacts of climate change is complicated by spatially mismatched land representations between gridded datasets and models, and land use models with larger regions defined by geopolitical and/or biophysical criteria. In this study, we quantify the sensitivity of Global Change Assessment Model (GCAM) outputs to the delineation of Agro-Ecological Zones (AEZs), which are normally based on historical (1961–1990) climate. We reconstruct GCAM's land regions using projected (2071–2100) climate, and find large differences in estimated future land use that correspond with differences in agricultural commodity prices and production volumes. Importantly, historically delineated AEZs experience spatially heterogeneous climate impacts over time, and do not necessarily provide more homogeneous initial land productivity than projected AEZs. Finally, we conclude that non-climatic criteria for land use region delineation are likely preferable for modeling land use change in the context of climate change, and that uncertainty associated with land delineation needs to be quantified.

  18. Assimilation of GRACE Terrestrial Water Storage Observations into a Land Surface Model for the Assessment of Regional Flood Potential

    NASA Technical Reports Server (NTRS)

    Reager, John T.; Thomas, Alys C.; Sproles, Eric A.; Rodell, Matthew; Beaudoing, Hiroko K.; Li, Bailing; Famiglietti, James S.

    2015-01-01

    We evaluate performance of the Catchment Land Surface Model (CLSM) under flood conditions after the assimilation of observations of the terrestrial water storage anomaly (TWSA) from NASA's Gravity Recovery and Climate Experiment (GRACE). Assimilation offers three key benefits for the viability of GRACE observations to operational applications: (1) near-real time analysis; (2) a downscaling of GRACE's coarse spatial resolution; and (3) state disaggregation of the vertically-integrated TWSA. We select the 2011 flood event in the Missouri river basin as a case study, and find that assimilation generally made the model wetter in the months preceding flood. We compare model outputs with observations from 14 USGS groundwater wells to assess improvements after assimilation. Finally, we examine disaggregated water storage information to improve the mechanistic understanding of event generation. Validation establishes that assimilation improved the model skill substantially, increasing regional groundwater anomaly correlation from 0.58 to 0.86. For the 2011 flood event in the Missouri river basin, results show that groundwater and snow water equivalent were contributors to pre-event flood potential, providing spatially-distributed early warning information.
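The validation step reported above (groundwater anomaly correlation rising from 0.58 to 0.86 after assimilation) is a Pearson correlation between modelled and observed series. A self-contained sketch using hypothetical anomaly series, not the study's USGS well data:

```python
def pearson_r(x, y):
    """Pearson correlation between observed and modelled anomaly series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

obs = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical observed groundwater anomalies
mod = [1.1, 1.9, 3.2, 3.8, 5.1]   # hypothetical modelled anomalies
r = pearson_r(obs, mod)
```

Computing r before and after assimilation, against the same observation wells, is how the skill improvement quoted in the abstract would be quantified.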

  19. Structural empowerment and burnout among Portuguese nursing staff: An explicative model.

    PubMed

    Orgambídez-Ramos, Alejandro; Borrego-Alés, Yolanda; Vázquez-Aguado, Octavio; March-Amegual, Jaume

    2017-11-01

    Kanter's structural empowerment model was used to assess the influence of access to opportunities, resources, information and support on core burnout, through global empowerment, in a nursing sample in Portugal. The empowerment experience increases nursing professionals' satisfaction and performance, preventing the emergence of burnout. However, the relationship between structural empowerment and burnout has been scarcely studied in Portugal. We conducted a cross-sectional correlational study assessing a final sample of 297 participants (62.13% response rate, 63.64% women). Model fit and the mediation test were examined using structural equation modelling (path analysis). Access to opportunities and access to support had a direct impact, through global empowerment, on core burnout, whereas access to resources had both a direct and an indirect impact on core burnout. The results validated the structural empowerment model and its application to nursing staff in Portugal. Professional training plans, the development of formal and informal support networks, and the availability of resources increase the levels of empowerment and decrease the likelihood of experiencing burnout in nursing professionals. © 2017 John Wiley & Sons Ltd.

  20. What are the effects of Agro-Ecological Zones and land use region boundaries on land resource projection using the Global Change Assessment Model?

    DOE PAGES

    Di Vittorio, Alan V.; Kyle, Page; Collins, William D.

    2016-09-03

    Understanding potential impacts of climate change is complicated by spatially mismatched land representations between gridded datasets and models, and land use models with larger regions defined by geopolitical and/or biophysical criteria. In this study, we quantify the sensitivity of Global Change Assessment Model (GCAM) outputs to the delineation of Agro-Ecological Zones (AEZs), which are normally based on historical (1961–1990) climate. We reconstruct GCAM's land regions using projected (2071–2100) climate, and find large differences in estimated future land use that correspond with differences in agricultural commodity prices and production volumes. Importantly, historically delineated AEZs experience spatially heterogeneous climate impacts over time, and do not necessarily provide more homogeneous initial land productivity than projected AEZs. Finally, we conclude that non-climatic criteria for land use region delineation are likely preferable for modeling land use change in the context of climate change, and that uncertainty associated with land delineation needs to be quantified.

  1. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009), III. Re-computed MS and mb, proxy MW, final magnitude composition and completeness assessment

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Bondár, István; Storchak, Dmitry A.; Engdahl, E. Robert; Bormann, Peter; Harris, James

    2015-02-01

    This paper outlines the re-computation and compilation of the magnitudes now contained in the final ISC-GEM Reference Global Instrumental Earthquake Catalogue (1900-2009). The catalogue is available via the ISC website (http://www.isc.ac.uk/iscgem/). The available re-computed MS and mb provided an ideal basis for deriving new conversion relationships to moment magnitude MW. Therefore, rather than using previously published regression models, we derived new empirical relationships using both generalized orthogonal linear and exponential non-linear models to obtain MW proxies from MS and mb. The new models were tested against true values of MW, and the newly derived exponential models were then preferred to the linear ones in computing MW proxies. For the final magnitude composition of the ISC-GEM catalogue, we preferred directly measured MW values as published by the Global CMT project for the period 1976-2009 (plus intermediate-depth earthquakes between 1962 and 1975). In addition, over 1000 publications have been examined to obtain direct seismic moment M0 and, therefore, also MW estimates for 967 large earthquakes during 1900-1978 (Lee and Engdahl, 2015) by various alternative methods to the current GCMT procedure. In all other instances we computed MW proxy values by converting our re-computed MS and mb values into MW, using the newly derived non-linear regression models. The final magnitude composition is an improvement in terms of magnitude homogeneity compared to previous catalogues. The magnitude completeness is not homogeneous over the 110 years covered by the ISC-GEM catalogue. Therefore, seismicity rate estimates may be strongly affected without a careful time window selection. In particular, the ISC-GEM catalogue appears to be complete down to MW 5.6 starting from 1964, whereas for the early instrumental period the completeness varies from ∼7.5 to 6.2. Further time and resources would be necessary to homogenize the magnitude of completeness over the entire catalogue length.
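
    The exponential magnitude conversion described above can be illustrated with a toy fit. The sketch below fits one simple exponential form, MW = exp(a + b·MS), by log-linear least squares; the (MS, MW) pairs are hypothetical, and the actual ISC-GEM relationships were derived from thousands of re-computed magnitudes using generalized orthogonal and non-linear regression.

```python
import math

# Hypothetical (MS, MW) calibration pairs; the real ISC-GEM regressions
# use thousands of re-computed magnitudes.
pairs = [(5.0, 5.2), (5.5, 5.6), (6.0, 6.1), (6.5, 6.6), (7.0, 7.2), (7.5, 7.8)]

# Fit MW = exp(a + b*MS) by ordinary least squares on ln(MW) vs. MS.
n = len(pairs)
sx = sum(ms for ms, _ in pairs)
sy = sum(math.log(mw) for _, mw in pairs)
sxx = sum(ms * ms for ms, _ in pairs)
sxy = sum(ms * math.log(mw) for ms, mw in pairs)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

def mw_proxy(ms):
    """Proxy moment magnitude predicted from a surface-wave magnitude."""
    return math.exp(a + b * ms)
```

A generalized orthogonal fit would additionally account for measurement error in MS itself, which ordinary least squares ignores.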

  2. The drivers of wildfire enlargement do not exhibit scale thresholds in southeastern Australian forests.

    PubMed

    Price, Owen F; Penman, Trent; Bradstock, Ross; Borah, Rittick

    2016-10-01

    Wildfires are complex adaptive systems, and have been hypothesized to exhibit scale-dependent transitions in the drivers of fire spread. Among other things, this makes the prediction of final fire size from conditions at the ignition difficult. We test this hypothesis by conducting multi-scale statistical modelling of the factors determining whether fires reached 10 ha, then 100 ha, then 1000 ha, and the final size of fires >1000 ha. At each stage, the predictors were measures of weather, fuels, topography and fire suppression. The objectives were to identify differences among the models indicative of scale transitions, to assess the accuracy of the multi-step method for predicting fire size (compared to predicting final size from initial conditions) and to quantify the importance of the predictors. The data were 1116 fires that occurred in the eucalypt forests of New South Wales between 1985 and 2010. The models were similar at the different scales, though there were subtle differences. For example, the presence of roads affected whether fires reached 10 ha but not larger scales. Weather was the most important predictor overall, though fuel load, topography and ease of suppression all showed effects. Overall, there was no evidence that fires have scale-dependent transitions in behaviour. The models had predictive accuracies of 73%, 66%, 72% and 53% at the 10 ha, 100 ha, 1000 ha and final-size scales, respectively. When these steps were combined, the overall accuracy for predicting the size of fires was 62%, while the accuracy of the one-step model was only 20%. Thus, the multi-scale approach was an improvement on the single-scale approach, even though the predictive accuracy was probably insufficient for use as an operational tool. The analysis has also provided further evidence of the important role of weather, compared to fuel, suppression and topography, in driving fire behaviour. Copyright © 2016. Published by Elsevier Ltd.

  3. Impact of the proposed energy tax on nuclear electric generating technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edmunds, T.A.; Lamont, A.D.; Pasternak, A.D.

    1993-05-01

    The President's new economic initiatives include an energy tax that will affect the costs of power from most electric generating technologies. The tax on nuclear power could be applied in a number of different ways at several different points in the fuel cycle. These different approaches could have different effects on the generation costs and benefits of advanced reactors. The Office of Nuclear Energy has developed models for assessing the costs and benefits of advanced reactor cycles, which must be updated to take into account the impacts of the proposed tax. This report has been prepared to assess the spectrum of impacts of the energy tax on nuclear power and can be used in updating the Office's economic models. This study was conducted in the following steps. First, the most authoritative statement of the proposed tax available at this time was obtained. Then the impacts of the proposed tax on the costs of nuclear and fossil fueled generation were compared. Finally, several other possible approaches to taxing nuclear energy were evaluated. The cost impacts on several advanced nuclear technologies and a current light water technology were computed, and the rationale for the energy tax as applied to various electric generating methods was examined.

  4. Land Survey from Unmanned Aerial Vehicle

    NASA Astrophysics Data System (ADS)

    Peterman, V.; Mesarič, M.

    2012-07-01

    In this paper we present how we use a quadrocopter unmanned aerial vehicle with a camera attached to it to do low altitude photogrammetric land survey. We use the quadrocopter to take highly overlapping photos of the area of interest. A "structure from motion" algorithm is implemented to obtain the parameters of the camera orientations and to generate a sparse point cloud representation of the objects in the photos. Then a patch-based multi-view stereo algorithm is applied to generate a dense point cloud. Ground control points are used to georeference the data. Further processing is applied to generate digital orthophoto maps, digital surface models and digital terrain models, and to assess volumes of various types of material. Practical examples of land survey from a UAV are presented in the paper. We explain how we used our system to monitor the reconstruction of a commercial building, and how our UAV was used to assess the volume of the coal supply for the Ljubljana heating plant. A further example shows the usefulness of low altitude photogrammetry for the documentation of archaeological excavations. In the final example we present how we used our UAV to prepare an underlay map for natural gas pipeline route planning. We conclude that low altitude photogrammetry can help bridge the gap between laser scanning and classic tachymetric survey, since it offers advantages of both techniques.

  5. Decreased microvascular cerebral blood flow assessed by diffuse correlation spectroscopy after repetitive concussions in mice.

    PubMed

    Buckley, Erin M; Miller, Benjamin F; Golinski, Julianne M; Sadeghian, Homa; McAllister, Lauren M; Vangel, Mark; Ayata, Cenk; Meehan, William P; Franceschini, Maria Angela; Whalen, Michael J

    2015-12-01

    Repetitive concussions are associated with long-term cognitive dysfunction that can be attenuated by increasing the time intervals between concussions; however, biomarkers of the safest rest interval between injuries remain undefined. We hypothesize that deranged cerebral blood flow (CBF) is a candidate biomarker for vulnerability to repetitive concussions. Using a mouse model of human concussion, we examined the effect of single and repetitive concussions on cognition and on an index of CBF (CBFi) measured with diffuse correlation spectroscopy. After a single mild concussion, CBFi was reduced by 35±4% at 4 hours (P<0.01 versus baseline) and returned to preinjury levels by 24 hours. After five concussions spaced 1 day apart, CBFi was also reduced from preinjury levels 4 hours after each concussion but had returned to preinjury levels by 72 hours after the final concussion. Interestingly, in this repetitive concussion model, lower CBFi values measured both preinjury and 4 hours after the third concussion were associated with worse performance on the Morris water maze assessed 72 hours after the final concussion. We conclude that low CBFi measured either before or early on in the evolution of injury caused by repetitive concussions could be a useful predictor of cognitive outcome.

  6. A probabilistic asteroid impact risk model: assessment of sub-300 m impacts

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2017-06-01

    A comprehensive asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain input parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions for objects up to 300 m in diameter. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data have little effect on the metrics of interest.
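
    The Monte Carlo structure of such a risk assessment can be sketched in a few lines. Everything below is illustrative: the parameter ranges, the cube-root damage-radius scaling and the uniform population densities are crude stand-ins for the PAIR model's calibrated distributions, atmospheric-entry physics and gridded population data.

```python
import math
import random

random.seed(0)  # reproducible illustration

N = 10_000
results = []
for _ in range(N):
    # Sample uncertain impactor properties (hypothetical ranges).
    diameter = random.uniform(20.0, 300.0)        # m
    density = random.uniform(1500.0, 3500.0)      # kg/m^3
    velocity = random.uniform(12e3, 25e3)         # m/s
    mass = density * (math.pi / 6.0) * diameter ** 3
    energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15  # megatons TNT

    # Crude blast-damage radius via cube-root yield scaling; a stand-in
    # for the model's energy-deposition and ground-damage treatment.
    damage_radius_km = 2.0 * energy_mt ** (1.0 / 3.0)

    # Affected population for a sampled (hypothetical) local density.
    pop_density = random.uniform(0.0, 500.0)      # persons/km^2
    affected = math.pi * damage_radius_km ** 2 * pop_density
    results.append(affected)

# Aggregate the scenarios into a distribution of potential outcomes.
results.sort()
median_affected = results[N // 2]
```

A risk tolerance posture can then be read off the sorted distribution, e.g. the smallest diameter for which an exceedance quantile passes a casualty threshold.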

  7. The Development of an Empirical Model of Mental Health Stigma in Adolescents.

    PubMed

    Silke, Charlotte; Swords, Lorraine; Heary, Caroline

    2016-08-30

    Research on mental health stigma in adolescents is hampered by a lack of empirical investigation into the theoretical conceptualisation of stigma, as well as by the lack of validated stigma measures. This research aims to develop a model of public stigma toward depression in adolescents and to use this model to empirically examine whether stigma is composed of three separate dimensions (Stereotypes, Prejudice and Discrimination), as is theoretically proposed. Adolescents completed self-report measures assessing their stigmatising responses toward a fictional peer with depression. An exploratory factor analysis (EFA; N=332) was carried out on 58 items proposed to measure aspects of stigma. A confirmatory factor analysis (CFA; N=236) was then carried out to evaluate the validity of the observed stigma model. Finally, higher-order CFAs were conducted in order to assess whether the observed model supported the tripartite conceptualisation of stigma. The EFA returned a seven-factor model of stigma. These factors were designated as Dangerousness, Warmth & Competency, Responsibility, Negative Attributes, Prejudice, Classroom Discrimination and Friendship Discrimination. The CFA supported the goodness-of-fit of this seven-factor model. The higher-order CFAs indicated that these seven factors represented the latent constructs of Stereotypes, Prejudice and Discrimination, which in turn represented Stigma. Overall, results support the tripartite conceptualisation of stigma and suggest that measurements of mental health stigma in adolescents should include assessments of all three dimensions. These results also highlight the importance of establishing valid and reliable measures for assessing stigma in adolescents. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Scan-To Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    NASA Astrophysics Data System (ADS)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.

  9. Summative Self-Assessment in Higher Education: Implications of Its Counting towards the Final Mark

    ERIC Educational Resources Information Center

    Tejeiro, Ricardo A.; Gomez-Vallecillo, Jorge L.; Romero, Antonio F.; Pelegrina, Manuel; Wallace, Agustin; Emberley, Enrique

    2012-01-01

    Introduction: Our study aims at assessing the validity of summative criteria-referenced self-assessment in higher education, and in particular, if that validity varies when the professor counts self-assessment toward the final mark. Method: One hundred and twenty-two first year students from two groups in Teacher Education at the Universidad de…

  10. Causal Assessment of Biological Impairment in the Bogue Homo River, Mississippi Using the U.S. EPA’s Stressor Identification Methodology (Final)

    EPA Science Inventory

    EPA announced the availability of the final report, Causal Assessment of Biological Impairment in the Bogue Homo River, Mississippi Using the U.S. EPA’s Stressor Identification Methodology. This assessment is taken from more than 700 court ordered assessments of the cau...

  11. Comprehensive Environmental Assessment Applied to Multiwalled Carbon Nanotube Flame-Retardant Coatings in Upholstery Textiles: A Case Study Presenting Priority Research Gaps for Future Risk Assessments (Final Report)

    EPA Science Inventory

    In September 2013, EPA announced the availability of the final report, Comprehensive Environmental Assessment Applied to Multiwalled Carbon Nanotube Flame-Retardant Coatings in Upholstery Textiles: A Case Study Presenting Priority Research Gaps for Future Risk Assessments...

  12. Introduction to the JPA special issue: Can the Psychodynamic Diagnostic Manual put the complex person back at the center-stage of personality assessment?

    PubMed

    Huprich, Steven K; Meyer, Gregory J

    2011-03-01

    We briefly introduce this special issue, which focuses both on the Psychodynamic Diagnostic Manual (PDM) and the practice of idiographic, depth-oriented personality assessment. The 7 articles in this issue are diverse in scope but all address these 2 important topics. To set the stage, the special issue opens with a description of the history behind, the purposes of, and the steps taken to develop the PDM, and the next article provides a compelling illustration of depth-oriented personality assessment in the context of a long-term course of psychodynamic treatment. The third and fourth articles describe how the PDM model fosters attention to dynamic processes, not just overt symptoms, and they articulate the challenges and benefits of integrating this model into both the revitalized practice of assessment and diagnosis and the research avenues that will evaluate its validity and utility. The fifth article provides a broad overview of interesting experimental research on implicit processes from personality, social, and cognitive psychology, with implications for understanding and assessing dynamic processes. The sixth article illustrates how a PDM-based assessment of an adolescent boy helpfully contributed to his psychodynamic therapy. Finally, the issue closes with an illuminating article describing a PDM-based training model for the graduated development of assessment and diagnosis skills in a doctoral program. Overall, this special issue helps show how the PDM can invigorate multimethod personality assessment by placing the complex idiographic understanding of a person at the center-stage in the assessment process.

  13. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling.

    PubMed

    Aerts, Sam; Deschrijver, Dirk; Verloock, Leen; Dhaene, Tom; Martens, Luc; Joseph, Wout

    2013-10-01

    In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF), or hotspots, in an extensive urban region, and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots deemed to contain the most valuable information (inside hotspots or in search of them), based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium with a size of approximately 1 km². In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (factor 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96. Copyright © 2013 Elsevier Inc. All rights reserved.
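
    The sampling loop at the heart of this methodology (predict everywhere, then measure where the surrogate is either high or uncertain) can be sketched without the full kriging machinery. This sketch substitutes an inverse-distance-weighted surrogate, with distance to the nearest sample as a crude uncertainty proxy; the coordinates and field values are hypothetical.

```python
import math

# Hypothetical field measurements: (x_km, y_km, field_V_per_m).
samples = [(0.1, 0.2, 0.6), (0.8, 0.3, 1.9), (0.4, 0.9, 0.8), (0.6, 0.6, 2.8)]

def predict(x, y):
    """Inverse-distance-weighted surrogate (a simple stand-in for kriging).
    Returns (predicted field, distance to nearest sample as uncertainty)."""
    num = den = 0.0
    nearest = float("inf")
    for sx, sy, v in samples:
        d = math.hypot(x - sx, y - sy)
        nearest = min(nearest, d)
        if d < 1e-9:
            return v, 0.0  # exact hit on a measured location
        w = 1.0 / d ** 2
        num += w * v
        den += w
    return num / den, nearest

def next_measurement(kappa=1.0, grid=21):
    """Pick the candidate maximizing predicted field + kappa * uncertainty,
    i.e. search inside suspected hotspots or in poorly sampled areas."""
    best, best_score = None, -float("inf")
    for i in range(grid):
        for j in range(grid):
            x, y = i / (grid - 1), j / (grid - 1)
            mu, sigma = predict(x, y)
            score = mu + kappa * sigma
            if score > best_score:
                best, best_score = (x, y), score
    return best
```

A true kriging surrogate would instead return a statistically grounded prediction variance, but the select-measure-refit loop is the same.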

  14. FlexPepDock lessons from CAPRI peptide-protein rounds and suggested new criteria for assessment of model quality and utility.

    PubMed

    Marcu, Orly; Dodson, Emma-Joy; Alam, Nawsad; Sperber, Michal; Kozakov, Dima; Lensink, Marc F; Schueler-Furman, Ora

    2017-03-01

    CAPRI rounds 28 and 29 included, for the first time, peptide-receptor targets of three different systems, reflecting increased appreciation of the importance of peptide-protein interactions. The CAPRI rounds allowed us to objectively assess the performance of Rosetta FlexPepDock, one of the first protocols to explicitly include peptide flexibility in docking, accounting for peptide conformational changes upon binding. We discuss here successes and challenges in modeling these targets: we obtain top-performing, high-resolution models of the peptide motif for cases with known binding sites, but there is a need for better modeling of flanking regions, as well as better selection criteria, in particular for unknown binding sites. These rounds have also provided us the opportunity to reassess the success criteria, to better reflect the quality of a peptide-protein complex model. Using all models submitted to CAPRI, we analyze the correlation between current classification criteria and the ability to retrieve critical interface features, such as hydrogen bonds and hotspots. We find that loosening the backbone (and ligand) RMSD threshold, together with a restriction on the side chain RMSD measure, allows us to improve the selection of high-accuracy models. We also suggest a new measure to assess interface hydrogen bond recovery, which is not assessed by the current CAPRI criteria. Finally, we find that surprisingly much can be learned from rather inaccurate models about binding hotspots, suggesting that the current status of peptide-protein docking methods, as reflected by the submitted CAPRI models, can already have a significant impact on our understanding of protein interactions. Proteins 2017; 85:445-462. © 2016 Wiley Periodicals, Inc.
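
    The backbone and ligand RMSD measures discussed above presuppose an optimal superposition of model and reference coordinates. A minimal numpy sketch of that step, the Kabsch algorithm, assuming two already-paired (N, 3) coordinate arrays (atom matching and interface selection, which CAPRI assessment also requires, are omitted):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal superposition
    (Kabsch algorithm). P and Q must contain paired atoms in the same order."""
    P = P - P.mean(axis=0)          # center both sets on their centroids
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                     # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])      # guard against an improper rotation
    R = Vt.T @ D @ U.T              # optimal rotation mapping P onto Q
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))
```

Restricting P and Q to backbone atoms, ligand atoms, or side chains yields the different RMSD flavours the classification criteria combine.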

  15. Modeling the growth of Listeria monocytogenes on the surface of smear- or mold-ripened cheese.

    PubMed

    Schvartzman, M Sol; Gonzalez-Barron, Ursula; Butler, Francis; Jordan, Kieran

    2014-01-01

    Surface-ripened cheeses are matured by means of manual or mechanical technologies, posing a risk of cross-contamination if any cheeses are contaminated with Listeria monocytogenes. In predictive microbiology, primary models are used to describe microbial responses, such as growth rate over time, and secondary models explain how those responses change with environmental factors. In this way, primary models were used to assess the growth rate of L. monocytogenes during ripening of the cheeses, and secondary models to test how much the growth rate was affected by the pH and/or the water activity (aw) of the cheeses. The two models combined can be used to predict outcomes. The purpose of these experiments was to test three primary (the modified Gompertz equation, the Baranyi and Roberts model, and the Logistic model) and three secondary (the Cardinal model, the Ratkowsky model, and the Presser model) mathematical models in order to define which combination of models would best predict the growth of L. monocytogenes on the surface of artificially contaminated surface-ripened cheeses. Growth on the surface of the cheese was assessed and modeled. The primary models were first fitted to the data, and the effects of pH and aw on the growth rate (μmax) were incorporated and assessed one by one with the secondary models. The Logistic primary model by itself did not show a better fit of the data than the other primary models tested, but the inclusion of the Cardinal secondary model improved the final fit. The aw was not related to the growth of Listeria. This study suggests that surface-ripened cheese should be separately regulated within EU microbiological food legislation, and results expressed as counts per surface area rather than per gram.
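
    Two of the primary models named above have compact closed forms. The sketch below uses the Zwietering reparameterizations of the modified Gompertz and Logistic equations, with asymptote A (log-count increase), maximum specific growth rate mu and lag time lam; the parameter values in the usage comments are illustrative, not fitted to cheese data, and the Baranyi and Roberts model is omitted because it requires an additional adjustment function.

```python
import math

E = math.e

def gompertz(t, A, mu, lam):
    """Modified Gompertz (Zwietering parameterization): predicted log-count
    increase at time t, with asymptote A, max growth rate mu, lag time lam."""
    return A * math.exp(-math.exp(mu * E / A * (lam - t) + 1.0))

def logistic(t, A, mu, lam):
    """Logistic primary model in the same (A, mu, lam) parameterization."""
    return A / (1.0 + math.exp(4.0 * mu / A * (lam - t) + 2.0))

# Illustrative parameters: A = 3 log CFU, mu = 0.5 log CFU/day, 5-day lag.
# During the lag phase the predicted increase stays near zero; at long
# times both curves approach the asymptote A.
```

Fitting either curve to ripening data (e.g. by nonlinear least squares) gives the stage-one μmax estimates that the secondary models then relate to pH and aw.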

  16. Sex-biased prevalence in infections with heterosexual, direct, and vector-mediated transmission: A theoretical analysis.

    PubMed

    Pugliese, Andrea; Gumel, Abba B; Milner, Fabio A; Velasco-Hernandez, Jorge X

    2018-02-01

    Three deterministic Kermack-McKendrick-type models for studying the transmission dynamics of an infection in a two-sex closed population are analyzed here. In each model it is assumed that infection can be transmitted through heterosexual contacts, and that there is a higher probability of transmission from one sex to the other than vice versa. The study is focused on understanding whether and how this bias in transmission is reflected in sex differences in final attack ratios (i.e. the fraction of individuals of each sex that eventually gets infected). In the first model, where the other two transmission modes are not considered, the attack ratios can be obtained as solutions of a system of two nonlinear equations, which has a unique solution if the net reproduction number exceeds unity. It is also shown that the ratio of attack ratios depends solely on the ratio of gender-specific susceptibilities and on the basic reproduction number of the epidemic, R0, and that the gender-specific final attack ratio is biased in the same direction as the gender-specific susceptibilities. The second model also allows for infection transmission through direct, non-sexual contacts. In this case too, an analytical expression is derived from which the attack ratios can be obtained. The qualitative results are similar to those obtained for the previous model, but another important parameter for determining the value of the ratio between the attack ratios in the two sexes emerges: the relative weight of direct vs. heterosexual transmission (namely, ρ). Quantitatively, the ratio of final attack ratios generally will not exceed 1.5 if non-sexual transmission accounts for most transmission events (ρ≥0.6) and the ratio of gender-specific susceptibilities is not too large (say, 5 at most). The third model considers vector-borne, instead of direct, transmission. In this case, we were not able to find an analytical expression for the final attack ratios, and used numerical simulations instead. The results on final attack ratios are actually quite similar to those obtained with the second model. It is interesting to note that transient patterns can differ from final attack ratios, as new cases will tend to occur more often in the more susceptible sex, while later depletion of susceptibles may bias the ratio in the opposite direction. The analysis of these simple models, despite their lack of realism, can help in providing insight into, and assessment of, the potential role of gender-specific transmission in infections with multiple modes of transmission, such as Zika virus (ZIKV), by gauging what can be expected from epidemiological reports of new cases, disease incidence and seroprevalence surveys.
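
    For the first model, the two nonlinear final-size equations can be solved numerically. The sketch below is a simplified stand-in (one reproduction parameter per direction of transmission, hypothetical values, plain fixed-point iteration rather than the paper's analysis) that reproduces the qualitative result: the attack ratio is biased toward the sex with the higher susceptibility.

```python
import math

def attack_ratios(r_mf, r_fm, iters=200):
    """Solve the two-sex final-size system by fixed-point iteration:
        z_f = 1 - exp(-r_mf * z_m),   z_m = 1 - exp(-r_fm * z_f),
    where r_mf is the mean number of female infections caused by one
    infected male (and vice versa). Starting near 1 converges to the
    nonzero root when r_mf * r_fm > 1."""
    z_m = z_f = 0.99
    for _ in range(iters):
        z_f = 1.0 - math.exp(-r_mf * z_m)
        z_m = 1.0 - math.exp(-r_fm * z_f)
    return z_m, z_f

# Hypothetical values with transmission biased toward females (r_mf > r_fm):
# the female attack ratio ends up larger, matching the biased direction.
z_m, z_f = attack_ratios(3.0, 1.5)
```

The same iteration with equal reproduction parameters recovers the symmetric single-population final-size equation.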

  17. Application of remote sensing data and GIS for landslide risk assessment as an environmental threat to Izmir city (west Turkey).

    PubMed

    Akgun, Aykut; Kıncal, Cem; Pradhan, Biswajeet

    2012-09-01

    In this study, landslide risk assessment for Izmir city (west Turkey) was carried out, and the environmental effects of landslides on further urban development were evaluated using geographical information systems and remote sensing techniques. For this purpose, two different data groups, namely conditioning and triggering data, were produced. With the help of conditioning data such as lithology, slope gradient, slope aspect, distance from roads, distance from faults and distance from drainage lines, a landslide susceptibility model was constructed by using a logistic regression modelling approach. The accuracy assessment of the susceptibility map was carried out by the area under the curve (AUC) approach, and a 0.810 AUC value was obtained. This value shows that the map obtained is successful. Due to the fact that the study area is located in an active seismic region, earthquake data were considered as the primary triggering factor contributing to landslide occurrence. In addition to this, precipitation data were also taken into account as a secondary triggering factor. Considering the susceptibility data and triggering factors, a landslide hazard index was obtained. Furthermore, using the Aster data, a land-cover map was produced with an overall kappa value of 0.94. From this map, settlement areas were extracted, and these extracted data were assessed as elements at risk in the study area. Next, a vulnerability index was created by using these data. Finally, the hazard index and the vulnerability index were combined, and a landslide risk map for Izmir city was obtained. Based on this final risk map, it was observed that especially the south and north parts of the Izmir Bay, where urbanization is dense, are threatened by future landsliding. This result can be used for preliminary land use planning by local governmental authorities.
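
    The AUC used above to validate the susceptibility map has a simple rank interpretation: the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen non-landslide cell. A sketch computing it directly on hypothetical score lists (real validation would use the mapped susceptibility values at observed landslide and non-landslide locations):

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC: probability that a randomly chosen positive
    (landslide) cell scores higher than a randomly chosen negative
    (non-landslide) cell; ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 corresponds to a map no better than chance; the study's 0.810 sits well above that benchmark.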

  18. Methodology for the development and calibration of the SCI-QOL item banks

    PubMed Central

    Tulsky, David S.; Kisala, Pamela A.; Victorson, David; Choi, Seung W.; Gershon, Richard; Heinemann, Allen W.; Cella, David

    2015-01-01

Objective: To develop a comprehensive, psychometrically sound, and conceptually grounded patient reported outcomes (PRO) measurement system for individuals with spinal cord injury (SCI). Methods: Individual interviews (n = 44) and focus groups (n = 65 individuals with SCI and n = 42 SCI clinicians) were used to select key domains for inclusion and to develop PRO items. Verbatim items from other cutting-edge measurement systems (i.e. PROMIS, Neuro-QOL) were included to facilitate linkage and cross-population comparison. Items were field tested in a large sample of individuals with traumatic SCI (n = 877). Dimensionality was assessed with confirmatory factor analysis. Local item dependence and differential item functioning were assessed, and items were calibrated using the item response theory (IRT) graded response model. Finally, computer adaptive tests (CATs) and short forms were administered in a new sample (n = 245) to assess test-retest reliability and stability. Participants and Procedures: A calibration sample of 877 individuals with traumatic SCI across five SCI Model Systems sites and one Department of Veterans Affairs medical center completed SCI-QOL items in interview format. Results: We developed 14 unidimensional calibrated item banks and 3 calibrated scales across physical, emotional, and social health domains. When combined with the five Spinal Cord Injury – Functional Index physical function banks, the final SCI-QOL system consists of 22 IRT-calibrated item banks/scales. Item banks may be administered as CATs or short forms. Scales may be administered in a fixed-length format only. Conclusions: The SCI-QOL measurement system provides SCI researchers and clinicians with a comprehensive, relevant and psychometrically robust system for measurement of physical-medical, physical-functional, emotional, and social outcomes. All SCI-QOL instruments are freely available on Assessment Center(SM). PMID:26010963

  19. Methodology for the development and calibration of the SCI-QOL item banks.

    PubMed

    Tulsky, David S; Kisala, Pamela A; Victorson, David; Choi, Seung W; Gershon, Richard; Heinemann, Allen W; Cella, David

    2015-05-01

To develop a comprehensive, psychometrically sound, and conceptually grounded patient reported outcomes (PRO) measurement system for individuals with spinal cord injury (SCI). Individual interviews (n=44) and focus groups (n=65 individuals with SCI and n=42 SCI clinicians) were used to select key domains for inclusion and to develop PRO items. Verbatim items from other cutting-edge measurement systems (i.e. PROMIS, Neuro-QOL) were included to facilitate linkage and cross-population comparison. Items were field tested in a large sample of individuals with traumatic SCI (n=877). Dimensionality was assessed with confirmatory factor analysis. Local item dependence and differential item functioning were assessed, and items were calibrated using the item response theory (IRT) graded response model. Finally, computer adaptive tests (CATs) and short forms were administered in a new sample (n=245) to assess test-retest reliability and stability. A calibration sample of 877 individuals with traumatic SCI across five SCI Model Systems sites and one Department of Veterans Affairs medical center completed SCI-QOL items in interview format. We developed 14 unidimensional calibrated item banks and 3 calibrated scales across physical, emotional, and social health domains. When combined with the five Spinal Cord Injury – Functional Index physical function banks, the final SCI-QOL system consists of 22 IRT-calibrated item banks/scales. Item banks may be administered as CATs or short forms. Scales may be administered in a fixed-length format only. The SCI-QOL measurement system provides SCI researchers and clinicians with a comprehensive, relevant and psychometrically robust system for measurement of physical-medical, physical-functional, emotional, and social outcomes. All SCI-QOL instruments are freely available on Assessment Center(SM).
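Where the abstract above mentions calibrating items with the IRT graded response model, the category probabilities follow from differences of adjacent cumulative logistic curves. A minimal numpy sketch, with invented discrimination and threshold values:

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """P(response = k | theta) for an item with discrimination a and
    ordered category thresholds b (length m-1, increasing)."""
    b = np.asarray(b, dtype=float)
    # Cumulative curves P(X >= k), bracketed by the boundary terms 1 and 0.
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    cum = np.concatenate(([1.0], p_star, [0.0]))
    return cum[:-1] - cum[1:]          # adjacent differences, one per category

# Illustrative 4-category item at a trait level of 0.5.
probs = grm_category_probs(theta=0.5, a=1.8, b=[-1.0, 0.0, 1.2])
print(probs, probs.sum())              # category probabilities sum to 1
```

Calibration then amounts to choosing a and b per item so that these probabilities best fit the observed response frequencies.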

  20. An introduction to multidimensional measurement using Rasch models.

    PubMed

    Briggs, Derek C; Wilson, Mark

    2003-01-01

The act of constructing a measure requires a number of important assumptions. Principal among these assumptions is that the construct is unidimensional. In practice there are many instances in which the assumption of unidimensionality does not hold, and where the application of a multidimensional measurement model is both technically appropriate and substantively advantageous. In this paper we illustrate the usefulness of a multidimensional approach to measurement with the Multidimensional Random Coefficient Multinomial Logit (MRCML) model, an extension of the unidimensional Rasch model. An empirical example is taken from a collection of embedded assessments administered to 541 students enrolled in middle school science classes with a hands-on science curriculum. Student achievement on these assessments is multidimensional in nature, but can also be treated as consecutive unidimensional estimates, or, as is most common, as a composite unidimensional estimate. Structural parameters are estimated for each model using ConQuest, and model fit is compared. Student achievement in science is also compared across models. The multidimensional approach has the best fit to the data and provides more reliable estimates of student achievement than the consecutive unidimensional approach. Finally, at an interpretational level, the multidimensional approach may well provide richer information to the classroom teacher about the nature of student achievement.
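As a toy companion to the multidimensional case described above, a response probability under a simplified between-item multidimensional Rasch model can be sketched as follows. This is an illustrative stand-in for, not an implementation of, the MRCML model; the ability values, dimension assignments, and difficulties are invented:

```python
import numpy as np

def rasch_prob(theta, item_dim, difficulty):
    """P(correct) for a dichotomous item loading on a single latent
    dimension: logistic in (ability on that dimension - difficulty)."""
    theta = np.asarray(theta, dtype=float)
    return 1.0 / (1.0 + np.exp(-(theta[item_dim] - difficulty)))

# One student with abilities on two latent dimensions.
theta = [0.8, -0.3]
p1 = rasch_prob(theta, item_dim=0, difficulty=0.0)   # item on dimension 1
p2 = rasch_prob(theta, item_dim=1, difficulty=0.0)   # item on dimension 2
print(p1, p2)
```

Treating the two dimensions jointly (rather than as two separate unidimensional fits) lets their correlation inform each ability estimate, which is the source of the reliability gain the paper reports.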

  1. Assessment of MARMOT Grain Growth Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fromm, B.; Zhang, Y.; Schwen, D.

    2015-12-01

This report assesses the MARMOT grain growth model by comparing modeling predictions with experimental results from thermal annealing. The purpose here is threefold: (1) to demonstrate the validation approach of using thermal annealing experiments with non-destructive characterization, (2) to test the reconstruction capability and computational efficiency in MOOSE, and (3) to validate the grain growth model and the associated parameters that are implemented in MARMOT for UO2. To ensure a rigorous comparison, the 2D and 3D initial microstructures of the UO2 samples were characterized non-destructively using synchrotron X-rays. The same samples were then annealed at 2273 K for grain growth, and their initial microstructures were used as initial conditions for simulated annealing at the same temperature using MARMOT. After annealing, the final experimental microstructures were characterized again for comparison with the simulation results. So far, the comparison between modeling and experiments has been done for 2D microstructures, and the 3D comparison is underway. The preliminary results demonstrate the usefulness of the non-destructive characterization method for MARMOT grain growth model validation. A detailed analysis of the 3D microstructures is in progress to fully validate the current model in MARMOT.

  2. Building equity in: strategies for integrating equity into modelling for a 1.5°C world.

    PubMed

Klinsky, Sonja; Winkler, Harald

    2018-05-13

Emission pathways consistent with limiting temperature increase to 1.5°C raise pressing questions from an equity perspective. These pathways would limit impacts and benefit vulnerable communities but also present trade-offs that could increase inequality. Meanwhile, rapid mitigation could exacerbate political debates in which equity has played a central role. In this paper, we first develop a set of elements we suggest are essential for evaluating the equity implications of policy actions consistent with 1.5°C. These elements include (i) assess climate impacts, adaptation, loss and damage; (ii) be sensitive to context; (iii) compare costs of mitigation and adaptation policy action; (iv) incorporate human development and poverty; (v) integrate inequality dynamics; and (vi) be clear about normative assumptions and responsive to users. We then assess the ability of current modelling practices to address each element, focusing on global integrated assessment models augmented by national modelling and scenarios. We find that current practices face serious limitations across all six dimensions, although the severity of these varies. Finally, based on our assessment, we identify strategies that may be best suited to generating insights into each of the six elements in the context of assessing pathways for a 1.5°C world. This article is part of the theme issue 'The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'. © 2018 The Author(s).

  3. An investigation of client mood in the initial and final sessions of cognitive-behavioral therapy and psychodynamic-interpersonal therapy.

    PubMed

McClintock, Andrew S; Stiles, William B; Himawan, Lina; Anderson, Timothy; Barkham, Michael; Hardy, Gillian E

    2016-01-01

Our aim was to examine client mood in the initial and final sessions of cognitive-behavioral therapy (CBT) and psychodynamic-interpersonal therapy (PIT) and to determine how client mood is related to therapy outcomes. Hierarchical linear modeling was applied to data from a clinical trial comparing CBT with PIT. In this trial, client mood was assessed before and after sessions with the Session Evaluation Questionnaire-Positivity Subscale (SEQ-P). In the initial sessions, CBT clients had higher pre-session and post-session SEQ-P ratings and greater pre-to-post session mood change than did clients in PIT. In the final sessions, these pre-session, post-session, and change scores were generally equivalent across CBT and PIT. CBT outcome was predicted by pre- and post-session SEQ-P ratings from both the initial sessions and the final sessions of CBT. However, PIT outcome was predicted by pre- and post-session SEQ-P ratings from the final sessions only. Pre-to-post session mood change was unrelated to outcome in both treatments. These results suggest that different change processes are at work in CBT and PIT.

  4. Medicare program; prospective payment system and consolidated billing for skilled nursing facilities for FY 2010; minimum data set, version 3.0 for skilled nursing facilities and Medicaid nursing facilities. Final rule.

    PubMed

    2009-08-11

    This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs), for fiscal year (FY) 2010. In addition, it recalibrates the case-mix indexes so that they more accurately reflect parity in expenditures related to the implementation of case-mix refinements in January 2006. It also discusses the results of our ongoing analysis of nursing home staff time measurement data collected in the Staff Time and Resource Intensity Verification project, as well as a new Resource Utilization Groups, version 4 case-mix classification model for FY 2011 that will use the updated Minimum Data Set 3.0 resident assessment for case-mix classification. In addition, this final rule discusses the public comments that we have received on these and other issues, including a possible requirement for the quarterly reporting of nursing home staffing data, as well as on applying the quality monitoring mechanism in place for all other SNF PPS facilities to rural swing-bed hospitals. Finally, this final rule revises the regulations to incorporate certain technical corrections.

  5. A climate model projection weighting scheme accounting for performance and interdependence

    NASA Astrophysics Data System (ADS)

    Knutti, Reto; Sedláček, Jan; Sanderson, Benjamin M.; Lorenz, Ruth; Fischer, Erich M.; Eyring, Veronika

    2017-02-01

    Uncertainties of climate projections are routinely assessed by considering simulations from different models. Observations are used to evaluate models, yet there is a debate about whether and how to explicitly weight model projections by agreement with observations. Here we present a straightforward weighting scheme that accounts both for the large differences in model performance and for model interdependencies, and we test reliability in a perfect model setup. We provide weighted multimodel projections of Arctic sea ice and temperature as a case study to demonstrate that, for some questions at least, it is meaningless to treat all models equally. The constrained ensemble shows reduced spread and a more rapid sea ice decline than the unweighted ensemble. We argue that the growing number of models with different characteristics and considerable interdependence finally justifies abandoning strict model democracy, and we provide guidance on when and how this can be achieved robustly.
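A weighting scheme of the general form described above gives each model a weight that grows with its agreement with observations and shrinks when the model is nearly a duplicate of others. The Gaussian-kernel form with shape parameters sigma_d and sigma_s, and all distances below, are illustrative assumptions, not values from the paper:

```python
import numpy as np

def model_weights(D, S, sigma_d, sigma_s):
    """D[i]: model-observation distance; S[i, j]: model-model distance.
    Weight ~ performance / (1 + summed similarity to other models)."""
    D, S = np.asarray(D, float), np.asarray(S, float)
    perf = np.exp(-(D / sigma_d) ** 2)          # reward skill
    sim = np.exp(-(S / sigma_s) ** 2)
    np.fill_diagonal(sim, 0.0)                  # exclude self-similarity
    dependence = 1.0 + sim.sum(axis=1)          # near-duplicates share weight
    w = perf / dependence
    return w / w.sum()                          # normalize to 1

D = [0.2, 0.3, 0.3, 0.9]                        # model 4 far from observations
S = [[0.0, 1.0, 1.0, 1.0],
     [1.0, 0.0, 0.1, 1.0],                      # models 2 and 3 near-identical
     [1.0, 0.1, 0.0, 1.0],
     [1.0, 1.0, 1.0, 0.0]]
w = model_weights(D, S, sigma_d=0.5, sigma_s=0.5)
print(w)   # models 2 and 3 split what one independent model would receive
```

Choosing sigma_d and sigma_s controls how aggressively model democracy is abandoned: large values recover near-equal weights, small values concentrate weight on skillful, independent models.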

  6. Climate Change Vulnerability Assessments: Four Case Studies of Water Utility Practices (2011 Final)

    EPA Science Inventory

    EPA has released the final report titled, Climate Change Vulnerability Assessments: Four Case Studies of Water Utility Practices. This report was prepared by the National Center for Environmental Assessment's Global Climate Research Staff in the Office of Research and D...

  7. [The Quality of the Family Physician-Patient Relationship. Patient-Related Predictors in a Sample Representative for the German Population].

    PubMed

    Dinkel, Andreas; Schneider, Antonius; Schmutzer, Gabriele; Brähler, Elmar; Henningsen, Peter; Häuser, Winfried

    2016-03-01

Patient-centeredness and a strong working alliance are core elements of family medicine. Surveys in Germany showed that most people are satisfied with the quality of the family physician-patient relationship. However, the factors that determine the quality of the family physician-patient relationship remain unclear. This study aimed at identifying patient-related predictors of the quality of this relationship. Participants of a cross-sectional survey representative for the general German population were assessed using standardized questionnaires. The perceived quality of the family physician-patient relationship was measured with the German version of the Patient-Doctor Relationship Questionnaire (PDRQ-9). Associations of demographic and clinical variables (comorbidity, somatic symptom burden, psychological distress) with the quality of the family physician-patient relationship were assessed by applying hierarchical linear regression. 2278 participants (91.9%) reported having a family physician. The mean total score of the PDRQ-9 was high (M=4.12, SD=0.70). The final regression model showed that higher age, being female, and most notably fewer somatic and depressive symptoms predicted a higher quality of the family physician-patient relationship. Comorbidity lost significance when somatic symptom burden was added to the regression model. The final model explained 11% of the variance, indicating a small effect. Experiencing somatic and depressive symptoms emerged as the most relevant patient-related predictors of the quality of the family physician-patient relationship. © Georg Thieme Verlag KG Stuttgart · New York.

  8. On the use of drawing tasks in neuropsychological assessment.

    PubMed

    Smith, Alastair D

    2009-03-01

    Drawing tasks have attained a central position in neuropsychological assessment and are considered a rich source of information about the presence (or absence) of cognitive and perceptuo-motor abilities. However, unlike other tests of cognitive impairment, drawing tasks are often administered without reference to normative models of graphic production, and their results are often analyzed qualitatively. I begin this article by delineating the different ways in which drawing errors have been used to indicate particular functional deficits in neurological patients. I then describe models of drawing that have been explicitly based on the errors observed in patient drawings. Finally, the case is made for developing a more sensitive set of metrics in order to quantitatively assess patient performance. By providing a finer grain of analysis to assessment we will not only be better able to characterize the consequences of cognitive dysfunction, but may also be able to more subtly characterize and dissociate patients who would otherwise have been placed in the same broad category of impairment. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  9. Associations of Maternal Weight Gain in Pregnancy With Offspring Cognition in Childhood and Adolescence: Findings From the Avon Longitudinal Study of Parents and Children

    PubMed Central

    Gage, Suzanne H.; Lawlor, Debbie A.; Tilling, Kate; Fraser, Abigail

    2013-01-01

    An association of gestational weight gain (GWG) with offspring cognition has been postulated. We used data from the Avon Longitudinal Study of Parents and Children, a United Kingdom prospective cohort (1990 through the present) with a median of 10 maternal weight measurements in pregnancy. These were used to allocate participants to 2009 Institute of Medicine weight-gain categories and in random effect linear spline models. Outcomes were School Entry Assessment score (age, 4 years; n = 5,832), standardized intelligence quotient assessed by Wechsler Intelligence Scale for Children (age, 8 years; n = 5,191), and school final-examination results (age, 16 years; n = 7,339). Offspring of women who gained less weight than recommended had a 0.075 standard deviation lower mean School Entry Assessment score (95% confidence interval: −0.127, −0.023) and were less likely to achieve adequate final-examination results (odds ratio = 0.88, 95% confidence interval: 0.78, 0.99) compared with offspring of women who gained as recommended. GWG in early pregnancy (defined as 0–18 weeks on the basis of a knot point at 18 weeks) and midpregnancy (defined as 18–28 weeks on the basis of knot points at 18 and 28 weeks) was positively associated with School Entry Assessment score and intelligence quotient. GWG in late pregnancy (defined as 28 weeks onward on the basis of a knot point at 28 weeks) was positively associated with offspring intelligence quotient and with increased odds of offspring achieving adequate final-examination results in mothers who were overweight prepregnancy. Findings support small positive associations between GWG and offspring cognitive development, which may have lasting effects on educational attainment up to age 16 years. PMID:23388581

  10. Associations of maternal weight gain in pregnancy with offspring cognition in childhood and adolescence: findings from the Avon Longitudinal Study of Parents and Children.

    PubMed

    Gage, Suzanne H; Lawlor, Debbie A; Tilling, Kate; Fraser, Abigail

    2013-03-01

    An association of gestational weight gain (GWG) with offspring cognition has been postulated. We used data from the Avon Longitudinal Study of Parents and Children, a United Kingdom prospective cohort (1990 through the present) with a median of 10 maternal weight measurements in pregnancy. These were used to allocate participants to 2009 Institute of Medicine weight-gain categories and in random effect linear spline models. Outcomes were School Entry Assessment score (age, 4 years; n = 5,832), standardized intelligence quotient assessed by Wechsler Intelligence Scale for Children (age, 8 years; n = 5,191), and school final-examination results (age, 16 years; n = 7,339). Offspring of women who gained less weight than recommended had a 0.075 standard deviation lower mean School Entry Assessment score (95% confidence interval: -0.127, -0.023) and were less likely to achieve adequate final-examination results (odds ratio = 0.88, 95% confidence interval: 0.78, 0.99) compared with offspring of women who gained as recommended. GWG in early pregnancy (defined as 0-18 weeks on the basis of a knot point at 18 weeks) and midpregnancy (defined as 18-28 weeks on the basis of knot points at 18 and 28 weeks) was positively associated with School Entry Assessment score and intelligence quotient. GWG in late pregnancy (defined as 28 weeks onward on the basis of a knot point at 28 weeks) was positively associated with offspring intelligence quotient and with increased odds of offspring achieving adequate final-examination results in mothers who were overweight prepregnancy. Findings support small positive associations between GWG and offspring cognitive development, which may have lasting effects on educational attainment up to age 16 years.
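The random-effect linear spline exposure described above, with knot points at 18 and 28 weeks, amounts to a "broken stick" basis in gestational age. A minimal sketch of that basis (the test ages are illustrative; in the study the per-segment rates are random effects per woman):

```python
import numpy as np

def spline_basis(weeks, knots=(18.0, 28.0)):
    """Columns: gestational weeks accrued on each linear segment
    (0-18, 18-28, 28+), so each column's coefficient is a segment rate."""
    t = np.asarray(weeks, dtype=float)
    k1, k2 = knots
    return np.column_stack([
        np.minimum(t, k1),                 # weeks accrued before knot 1
        np.clip(t - k1, 0.0, k2 - k1),     # weeks between the two knots
        np.maximum(t - k2, 0.0),           # weeks after knot 2
    ])

B = spline_basis([10, 18, 24, 28, 36])
print(B)
# Multiplying each column by a segment-specific weight-gain rate gives the
# predicted cumulative gain at that gestational age.
```

This parameterization is what lets early-, mid-, and late-pregnancy gain enter the outcome models as three separate predictors.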

  11. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    PubMed

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these tests, for example the power to detect an environmental change of a given magnitude, before the start of an experiment. Such a prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model supports completely randomized and randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
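As an illustration of the kind of excess-zero count data such a framework generates, here is a minimal zero-inflated negative binomial sampler (a gamma-Poisson mixture plus structural zeros). The parameter values and sample layout are invented and this is not the paper's Supplementary Material code:

```python
import numpy as np

rng = np.random.default_rng(42)

def zinb_counts(mean, dispersion, p_zero, size):
    """Zero-inflated negative binomial: with probability p_zero a plot
    yields a structural zero; otherwise an NB count with the given mean
    and dispersion k (variance = mean + mean**2 / k)."""
    k = dispersion
    lam = rng.gamma(shape=k, scale=mean / k, size=size)   # gamma-Poisson mix
    counts = rng.poisson(lam)
    zeros = rng.random(size) < p_zero                     # excess zeros
    counts[zeros] = 0
    return counts

# One synthetic trial: 100 counts per treatment (GM vs comparator).
gm = zinb_counts(mean=10.0, dispersion=2.0, p_zero=0.2, size=100)
comparator = zinb_counts(mean=12.0, dispersion=2.0, p_zero=0.2, size=100)
print(gm.mean(), comparator.mean())
```

A prospective power analysis then repeats such draws many times and records how often the difference or equivalence test reaches the intended conclusion.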

  12. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    PubMed Central

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-01-01

Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these tests, for example the power to detect an environmental change of a given magnitude, before the start of an experiment. Such a prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model supports completely randomized and randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided. PMID:24834325

  13. Item Response Theory and Health Outcomes Measurement in the 21st Century

    PubMed Central

    Hays, Ron D.; Morales, Leo S.; Reise, Steve P.

    2006-01-01

    Item response theory (IRT) has a number of potential advantages over classical test theory in assessing self-reported health outcomes. IRT models yield invariant item and latent trait estimates (within a linear transformation), standard errors conditional on trait level, and trait estimates anchored to item content. IRT also facilitates evaluation of differential item functioning, inclusion of items with different response formats in the same scale, and assessment of person fit and is ideally suited for implementing computer adaptive testing. Finally, IRT methods can be helpful in developing better health outcome measures and in assessing change over time. These issues are reviewed, along with a discussion of some of the methodological and practical challenges in applying IRT methods. PMID:10982088

  14. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  15. 78 FR 28873 - Availability of Final Environmental Assessment and Finding of No Significant Impact for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2012-1091] Availability of Final Environmental Assessment and Finding of No Significant Impact for the Proposed Modification of the Bayonne... view Final EA and FONSI go to http://www.regulations.gov , insert ``USCG-2012-1091'' in the Search box...

  16. 50 CFR Table 2b to Part 660... - 2012, and beyond, Allocations by Species or Species Group (final 2012 allocations for assessed...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... or Species Group (final 2012 allocations for assessed flatfish are contingent upon potential changes to flatfish status determination criteria and the harvest control rule, and, for overfished species... Part 660, Subpart C—2012, and beyond, Allocations by Species or Species Group (final 2012 allocations...

  17. A spatial risk assessment of bighorn sheep extirpation by grazing domestic sheep on public lands.

    PubMed

    Carpenter, Tim E; Coggins, Victor L; McCarthy, Clinton; O'Brien, Chans S; O'Brien, Joshua M; Schommer, Timothy J

    2014-04-01

    Bighorn sheep currently occupy just 30% of their historic distribution, and persist in populations less than 5% as abundant overall as their early 19th century counterparts. Present-day recovery of bighorn sheep populations is in large part limited by periodic outbreaks of respiratory disease, which can be transmitted to bighorn sheep via contact with domestic sheep grazing in their vicinity. In order to assess the viability of bighorn sheep populations on the Payette National Forest (PNF) under several alternative proposals for domestic sheep grazing, we developed a series of interlinked models. Using telemetry and habitat data, we characterized herd home ranges and foray movements of bighorn sheep from their home ranges. Combining foray model movement estimates with known domestic sheep grazing areas (allotments), a Risk of Contact Model estimated bighorn sheep contact rates with domestic sheep allotments. Finally, we used demographic and epidemiologic data to construct population and disease transmission models (Disease Model), which we used to estimate bighorn sheep persistence under each alternative grazing scenario. Depending on the probability of disease transmission following interspecies contact, extirpation probabilities for the seven bighorn sheep herds examined here ranged from 20% to 100%. The Disease Model allowed us to assess the probabilities that varied domestic sheep management scenarios would support persistent populations of free-ranging bighorn sheep. Copyright © 2014 Elsevier B.V. All rights reserved.
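The interlinked contact-and-disease logic described above can be caricatured as a small Monte Carlo: each year a herd may contact an allotment, transmission may follow contact, and persistence is scored against a quasi-extinction threshold. Every rate, the growth and die-off factors, and the threshold below are invented for illustration and are not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(7)

def extirpation_probability(n0, contact_rate, p_transmission,
                            growth=1.04, outbreak_survival=0.4,
                            years=50, threshold=20, reps=2000):
    """Fraction of simulated herds falling below a quasi-extinction
    threshold within the horizon, given annual contact and transmission."""
    n = np.full(reps, float(n0))
    extinct = np.zeros(reps, dtype=bool)
    for _ in range(years):
        n = np.where(extinct, n, n * growth)              # normal-year growth
        contact = rng.random(reps) < contact_rate         # allotment contact
        outbreak = contact & (rng.random(reps) < p_transmission)
        n = np.where(outbreak, n * outbreak_survival, n)  # disease die-off
        extinct |= n < threshold
    return extinct.mean()

p_ext = extirpation_probability(n0=150, contact_rate=0.3, p_transmission=0.25)
print(p_ext)
```

Varying contact_rate across grazing alternatives, as the Risk of Contact Model does with real telemetry-based estimates, is what turns such a simulation into a comparison of management scenarios.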

  18. Towards the ecotourism: a decision support model for the assessment of sustainability of mountain huts in the Alps.

    PubMed

    Stubelj Ars, Mojca; Bohanec, Marko

    2010-12-01

    This paper studies mountain hut infrastructure in the Alps as an important element of ecotourism in the Alpine region. To improve the decision-making process regarding the implementation of future infrastructure and improvement of existing infrastructure in the vulnerable natural environment of mountain ecosystems, a new decision support model has been developed. The methodology is based on qualitative multi-attribute modelling supported by the DEXi software. The integrated rule-based model is hierarchical and consists of two submodels that cover the infrastructure of the mountain huts and that of the huts' surroundings. The final goal for the designed tool is to help minimize the ecological footprint of tourists in environmentally sensitive and undeveloped mountain areas and contribute to mountain ecotourism. The model has been tested in the case study of four mountain huts in Triglav National Park in Slovenia. Study findings provide a new empirical approach to evaluating existing mountain infrastructure and predicting improvements for the future. The assessment results are of particular interest for decision makers in protected areas, such as Alpine national parks managers and administrators. In a way, this model proposes an approach to the management assessment of mountain huts with the main aim of increasing the quality of life of mountain environment visitors as well as the satisfaction of tourists who may eventually become ecotourists. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Aporrectodea caliginosa, a relevant earthworm species for a posteriori pesticide risk assessment: current knowledge and recommendations for culture and experimental design.

    PubMed

    Bart, Sylvain; Amossé, Joël; Lowe, Christopher N; Mougin, Christian; Péry, Alexandre R R; Pelosi, Céline

    2018-06-21

    Ecotoxicological tests with earthworms are widely used and are mandatory for the risk assessment of pesticides prior to registration and commercial use. The current model species for standardized tests are Eisenia fetida and Eisenia andrei. However, these species are absent from agricultural soils and often less sensitive to pesticides than other earthworm species found in mineral soils. To move towards a better assessment of pesticide effects on non-target organisms, there is a need to perform a posteriori tests using relevant species. The endogeic species Aporrectodea caliginosa (Savigny, 1826) is representative of cultivated fields in temperate regions and is suggested as a relevant model test species. After providing information on its taxonomy, biology, and ecology, we review current knowledge concerning its sensitivity to pesticides. Moreover, we highlight research gaps and promising perspectives. Finally, advice and recommendations are given for the establishment of laboratory cultures and experiments using this soil-dwelling earthworm species.

  20. Environmental compatibility of closed landfills - assessing future pollution hazards.

    PubMed

    Laner, David; Fellner, Johann; Brunner, Paul H

    2011-01-01

    Municipal solid waste landfills need to be managed after closure. This so-called aftercare comprises the treatment and monitoring of residual emissions as well as the maintenance and control of landfill elements. These measures can be terminated once a landfill no longer poses a threat to the environment. Consequently, the evaluation of landfill environmental compatibility includes an estimation of future pollution hazards as well as an assessment of the vulnerability of the affected environment. An approach to assessing future emission rates is presented and discussed in view of long-term environmental compatibility. The suggested method consists of (a) a continuous model to predict emissions under the assumption of constant landfill conditions, and (b) different scenarios to evaluate the effects of changing conditions within and around the landfill. Because the model takes into account the actual status of the landfill, different methods of gaining information about landfill characteristics have to be applied. Finally, assumptions, uncertainties, and limitations of the methodology are discussed, and the need for future research is outlined.
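
    The two-part structure described above (a continuous model under constant conditions, plus scenarios for changed conditions) can be sketched with a simple first-order decline of an emission load. The functional form and all parameter names are assumptions for illustration, not the authors' model.

    ```python
    import math

    def emission_rate(e0, k, t):
        """Continuous model sketch under constant landfill conditions:
        first-order decline e(t) = e0 * exp(-k * t), with initial
        emission rate e0 and decay constant k (per year)."""
        return e0 * math.exp(-k * t)

    def scenario_emission_rate(e0, k, t, infiltration_factor=1.0):
        """Scenario sketch: changed conditions (e.g. increased water
        infiltration after cover degradation) are represented here by
        scaling the decay constant, accelerating the washout of the
        remaining emission potential."""
        return e0 * math.exp(-k * infiltration_factor * t)
    ```

    Comparing `emission_rate` against `scenario_emission_rate` for a range of factors gives the kind of bracketed estimate of future emission rates that the aftercare evaluation requires.
    
    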
