Sample records for current assessment models

  1. Evolving PBPK applications in regulatory risk assessment: current situation and future goals

    EPA Science Inventory

The presentation covers current applications of PBPK modeling in regulatory risk assessment and discusses the tension between ensuring consistency with experimental data in the current situation and the desire for animal-free model development.

  2. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  3. Comparing personality disorder models: cross-method assessment of the FFM and DSM-IV-TR.

    PubMed

Samuel, Douglas B; Widiger, Thomas A

    2010-12-01

    The current edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR; American Psychiatric Association, 2000) defines personality disorders as categorical entities that are distinct from each other and from normal personality traits. However, many scientists now believe that personality disorders are best conceptualized using a dimensional model of traits that span normal and abnormal personality, such as the Five-Factor Model (FFM). However, if the FFM or any dimensional model is to be considered as a credible alternative to the current model, it must first demonstrate an increment in the validity of the assessment offered within a clinical setting. Thus, the current study extended previous research by comparing the convergent and discriminant validity of the current DSM-IV-TR model to the FFM across four assessment methodologies. Eighty-eight individuals receiving ongoing psychotherapy were assessed for the FFM and the DSM-IV-TR personality disorders using self-report, informant report, structured interview, and therapist ratings. The results indicated that the FFM had an appreciable advantage over the DSM-IV-TR in terms of discriminant validity and, at the domain level, convergent validity. Implications of the findings and directions for future research are discussed.

  4. Comparing Personality Disorder Models: Cross-Method Assessment of the FFM and DSM-IV-TR

    PubMed Central

    Samuel, Douglas B.; Widiger, Thomas A.

    2010-01-01

    The current edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR; American Psychiatric Association, 2000) defines personality disorders as categorical entities that are distinct from each other and from normal personality traits. However, many scientists now believe that personality disorders are best conceptualized using a dimensional model of traits that span normal and abnormal personality, such as the Five-Factor Model (FFM). However, if the FFM or any dimensional model is to be considered as a credible alternative to the current model, it must first demonstrate an increment in the validity of the assessment offered within a clinical setting. Thus, the current study extended previous research by comparing the convergent and discriminant validity of the current DSM-IV-TR model to the FFM across four assessment methodologies. Eighty-eight individuals receiving ongoing psychotherapy were assessed for the FFM and the DSM-IV-TR personality disorders using self-report, informant report, structured interview, and therapist ratings. The results indicated that the FFM had an appreciable advantage over the DSM-IV-TR in terms of discriminant validity and, at the domain level, convergent validity. Implications of the findings and directions for future research are discussed. PMID:21158596

  5. Aerothermal modeling program, phase 2

    NASA Technical Reports Server (NTRS)

    Mongia, H. C.; Patankar, S. V.; Murthy, S. N. B.; Sullivan, J. P.; Samuelsen, G. S.

    1985-01-01

The main objectives of the Aerothermal Modeling Program, Phase 2 are: to develop an improved numerical scheme for incorporation in a 3-D combustor flow model; to conduct a benchmark-quality experiment to study the interaction of a primary jet with a confined swirling crossflow and to assess current and advanced turbulence and scalar transport models; and to conduct an experimental evaluation of the air swirler interaction with fuel injectors, assessments of current two-phase models, and verification of the improved spray evaporation/dispersion models.

  6. The System of Systems Architecture Feasibility Assessment Model

    DTIC Science & Technology

    2016-06-01

This dissertation by Stephen E. Gillespie (June 2016; dissertation supervisor Eugene Paulo) presents the System of Systems (SoS) Architecture Feasibility Assessment Model (SoS-AFAM), which extends current model-based systems engineering (MBSE) and SoS engineering

  7. State Higher Education Funding Models: An Assessment of Current and Emerging Approaches

    ERIC Educational Resources Information Center

    Layzell, Daniel T.

    2007-01-01

    This article provides an assessment of the current and emerging approaches used by state governments in allocating funding for higher education institutions and programs. It reviews a number of desired characteristics or outcomes for state higher education funding models, including equity, adequacy, stability, and flexibility. Although there is…

  8. Modeling current climate conditions for forest pest risk assessment

    Treesearch

    Frank H. Koch; John W. Coulston

    2010-01-01

    Current information on broad-scale climatic conditions is essential for assessing potential distribution of forest pests. At present, sophisticated spatial interpolation approaches such as the Parameter-elevation Regressions on Independent Slopes Model (PRISM) are used to create high-resolution climatic data sets. Unfortunately, these data sets are based on 30-year...

  9. Leveraging Strengths Assessment and Intervention Model (LeStAIM): A Theoretical Strength-Based Assessment Framework

    ERIC Educational Resources Information Center

    Laija-Rodriguez, Wilda; Grites, Karen; Bouman, Doug; Pohlman, Craig; Goldman, Richard L.

    2013-01-01

    Current assessments in the schools are based on a deficit model (Epstein, 1998). "The National Association of School Psychologists (NASP) Model for Comprehensive and Integrated School Psychological Services" (2010), federal initiatives and mandates, and experts in the field of assessment have highlighted the need for the comprehensive…

  10. Toward refined environmental scenarios for ecological risk assessment of down-the-drain chemicals in freshwater environments.

    PubMed

    Franco, Antonio; Price, Oliver R; Marshall, Stuart; Jolliet, Olivier; Van den Brink, Paul J; Rico, Andreu; Focks, Andreas; De Laender, Frederik; Ashauer, Roman

    2017-03-01

Current regulatory practice for chemical risk assessment suffers from the lack of realism in conventional frameworks. Despite significant advances in exposure and ecological effect modeling, the implementation of novel approaches as high-tier options for prospective regulatory risk assessment remains limited, particularly among general chemicals such as down-the-drain ingredients. While reviewing the current state of the art in environmental exposure and ecological effect modeling, we propose a scenario-based framework that enables a better integration of exposure and effect assessments in a tiered approach. Global- to catchment-scale spatially explicit exposure models can be used to identify areas of higher exposure and to generate ecologically relevant exposure information for input into effect models. Numerous examples of mechanistic ecological effect models demonstrate that it is technically feasible to extrapolate from individual-level effects to effects at higher levels of biological organization and from laboratory to environmental conditions. However, the data required to parameterize effect models that can embrace the complexity of ecosystems are large and require a targeted approach. Experimental efforts should, therefore, focus on vulnerable species and/or traits and ecological conditions of relevance. We outline key research needs to address the challenges that currently hinder the practical application of advanced model-based approaches to risk assessment of down-the-drain chemicals. Integr Environ Assess Manag 2017;13:233-248. © 2016 SETAC.

  11. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  12. Activity Patterns in Response to Symptoms in Patients Being Treated for Chronic Fatigue Syndrome: An Experience Sampling Methodology Study

    PubMed Central

    2016-01-01

    Objective: Cognitive–behavioral models of chronic fatigue syndrome (CFS) propose that patients respond to symptoms with 2 predominant activity patterns—activity limitation and all-or-nothing behaviors—both of which may contribute to illness persistence. The current study investigated whether activity patterns occurred at the same time as, or followed on from, patient symptom experience and affect. Method: Twenty-three adults with CFS were recruited from U.K. CFS services. Experience sampling methodology (ESM) was used to assess fluctuations in patient symptom experience, affect, and activity management patterns over 10 assessments per day for a total of 6 days. Assessments were conducted within patients’ daily life and were delivered through an app on touchscreen Android mobile phones. Multilevel model analyses were conducted to examine the role of self-reported patient fatigue, pain, and affect as predictors of change in activity patterns at the same and subsequent assessment. Results: Current experience of fatigue-related symptoms and pain predicted higher patient activity limitation at the current and subsequent assessments whereas subjective wellness predicted higher all-or-nothing behavior at both times. Current pain predicted less all-or-nothing behavior at the subsequent assessment. In contrast to hypotheses, current positive affect was predictive of current activity limitation whereas current negative affect was predictive of current all-or-nothing behavior. Both activity patterns varied at the momentary level. Conclusions: Patient symptom experiences appear to be driving patient activity management patterns in line with the cognitive–behavioral model of CFS. ESM offers a useful method for examining multiple interacting variables within the context of patients’ daily life. PMID:27819461

  13. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses.

    PubMed

    Fuller, Robert William; Wong, Tony E; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
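The coupled probabilistic/Bayesian inversion the abstract describes can be illustrated, in a drastically simplified form, by a one-dimensional grid-based Bayesian update: an expert-informed prior over a single model parameter is confronted with one observation through a toy forward model. This is a minimal sketch of the general idea only; the forward model, prior, and numbers below are hypothetical, not the authors' AIS model.

```python
import numpy as np

theta = np.linspace(0.0, 2.0, 2001)                  # parameter grid
dtheta = theta[1] - theta[0]

# Expert-informed prior: Normal(mean 1.2, sd 0.4), normalized on the grid
prior = np.exp(-0.5 * ((theta - 1.2) / 0.4) ** 2)
prior /= prior.sum() * dtheta

def forward(th):
    """Toy forward model: predicted observable is 3 * theta."""
    return 3.0 * th

obs, sigma = 3.3, 0.3                                # synthetic observation and its error
likelihood = np.exp(-0.5 * ((forward(theta) - obs) / sigma) ** 2)

posterior = prior * likelihood
posterior /= posterior.sum() * dtheta                # Bayes' rule, normalized

mean = (theta * posterior).sum() * dtheta
print(round(mean, 3))  # ≈ 1.106: pulled from the prior mean 1.2 toward the data-implied 1.1
```

The "tighter hindcasts and projections" in the abstract correspond to the posterior being narrower than the expert prior once the observational constraint is applied.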

  14. Assessment of Alternative Student Aid Delivery Systems: Assessment of the Current Delivery System.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The effects of the current system for delivering federal financial assistance to students under the Pell Grant, Guaranteed Student Loan (GSL), and campus-based programs are analyzed. Information is included on the use of the assessment model, which combines program evaluation, systems research, and policy analysis methodologies.…

  15. (De)Constructing the Risk Categories in the Aim Assessment Model for Children with Sexually Harmful Behaviour

    ERIC Educational Resources Information Center

    Myers, Steve

    2007-01-01

    This article critically analyses the AIM Assessment Model for children who have sexually harmful behaviour, exploring the underpinning knowledge and the processes involved. The model reflects current trends in the assessment of children, in child welfare and criminal justice services, producing categories of risk that lead to levels of…

  16. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  17. Assessing the vertical structure of baroclinic tidal currents in a global model

    NASA Astrophysics Data System (ADS)

    Timko, Patrick; Arbic, Brian; Scott, Robert

    2010-05-01

Tidal forcing plays an important role in many aspects of oceanography. Mixing, transport of particulates and internal wave generation are just three examples of local phenomena that may depend on the strength of local tidal currents. Advances in satellite altimetry have made an assessment of the global barotropic tide possible. However, the vertical structure of the tide may only be observed by deployment of instruments throughout the water column. Typically these observations are conducted at pre-determined depths based upon the interest of the observer. The high cost of such observations often limits both the number and the length of the observations, limiting our knowledge of the vertical structure of tidal currents. One way to expand our insight into the baroclinic structure of the ocean is through the use of numerical models. We compare the vertical structure of the global baroclinic tidal velocities in 1/12 degree HYCOM (HYbrid Coordinate Ocean Model) to a global database of current meter records. The model output is a subset of a 5 year global simulation that resolves the eddying general circulation, barotropic tides and baroclinic tides using 32 vertical layers. The density structure within the simulation is both vertically and horizontally non-uniform. In addition to buoyancy forcing, the model is forced by astronomical tides and winds. We estimate the dominant semi-diurnal (M2) and diurnal (K1) tidal constituents of the model data using classical harmonic analysis. In regions where current meter record coverage is adequate, the model skill in replicating the vertical structure of the dominant diurnal and semi-diurnal tidal currents is assessed based upon the strength, orientation and phase of the tidal ellipses. We also present a global estimate of the baroclinic tidal energy at fixed depths estimated from the model output.
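The classical harmonic analysis mentioned in the abstract amounts to a least-squares fit of sinusoids at the known constituent frequencies to a current-velocity time series. A minimal sketch, using a synthetic record rather than HYCOM output (the data and amplitudes below are invented for illustration):

```python
import numpy as np

# Angular frequencies (rad/hour) from the standard M2 and K1 tidal periods
OMEGA = {"M2": 2 * np.pi / 12.4206, "K1": 2 * np.pi / 23.9345}

def harmonic_fit(t, u):
    """Least-squares amplitude and phase of each constituent.

    t: sample times in hours; u: one velocity component of the current.
    Returns {constituent: (amplitude, phase_radians)}.
    """
    cols = [np.ones_like(t)]                 # mean flow term
    for w in OMEGA.values():
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, u, rcond=None)
    out = {}
    for i, name in enumerate(OMEGA):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        out[name] = (np.hypot(a, b), np.arctan2(b, a))
    return out

# Synthetic 30-day hourly record with known M2 and K1 signals
t = np.arange(0.0, 24 * 30, 1.0)
u = 0.3 * np.cos(OMEGA["M2"] * t - 0.5) + 0.1 * np.cos(OMEGA["K1"] * t + 1.0)
print(harmonic_fit(t, u))  # recovers amplitudes ≈ 0.3 (M2) and 0.1 (K1)
```

Fitting both velocity components this way yields the tidal ellipse parameters (strength, orientation, phase) against which the abstract says model skill is assessed.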

  18. The Command and Control of Communications in Joint and Combined Operations

    DTIC Science & Technology

    1994-06-03

The Joint Task Force structure is used as the model for command and control relationships. The first part of the thesis assesses the current doctrine and...

  19. The use of music therapy within the SCERTS model for children with Autism Spectrum Disorder.

    PubMed

    Walworth, Darcy DeLoach

    2007-01-01

The SCERTS model is a new, comprehensive curriculum designed to assess and identify treatment goals and objectives within a multidisciplinary team of clinicians and educators for children with Autism Spectrum Disorders (ASD). This model is an ongoing assessment tool with resulting goals and objectives derived therefrom. Because music therapy offers a unique interaction setting for children with ASD to elicit communication skills, music therapists will need to be an integral part of the multidisciplinary assessment team using the SCERTS model, which is projected to become the primary nationwide curriculum for children with ASD. The purpose of this paper is to assist music therapists in transitioning to this model by providing an overview and explanation of the SCERTS model and by identifying how music therapists are currently providing clinical services incorporated in the SCERTS model for children with ASD. In order to formulate comprehensive transitional suggestions, a national survey of music therapists working with clients at risk or diagnosed with ASD was conducted to: (a) identify the areas of the SCERTS assessment model that music therapists are currently addressing within their written goals for clients with ASD, (b) identify current music therapy activities that address various SCERTS goals and objectives, and (c) provide demographic information about settings, length, and tools used in music therapy interventions for clients with ASD.

  20. Practical examples of modeling choices and their consequences for risk assessment

    EPA Science Inventory

    Although benchmark dose (BMD) modeling has become the preferred approach to identifying a point of departure (POD) over the No Observed Adverse Effect Level, there remain challenges to its application in human health risk assessment. BMD modeling, as currently implemented by the...

  1. Biodiversity in environmental assessment-current practice and tools for prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gontier, Mikael; Balfors, Berit; Moertberg, Ulla

Habitat loss and fragmentation are major threats to biodiversity. Environmental impact assessment and strategic environmental assessment are essential instruments used in physical planning to address such problems. Yet there are no well-developed methods for quantifying and predicting impacts of fragmentation on biodiversity. In this study, a literature review was conducted on GIS-based ecological models that have potential as prediction tools for biodiversity assessment. Further, a review of environmental impact statements for road and railway projects from four European countries was performed, to study how impact prediction concerning biodiversity issues was addressed. The results of the study showed the existing gap between research in GIS-based ecological modelling and current practice in biodiversity assessment within environmental assessment.

  2. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses

    PubMed Central

    Wong, Tony E.; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095

  3. Motor Vehicle Demand Models : Assessment of the State of the Art and Directions for Future Research

    DOT National Transportation Integrated Search

    1981-04-01

    The report provides an assessment of the current state of motor vehicle demand modeling. It includes a detailed evaluation of one leading large-scale econometric vehicle demand model, which is tested for both logical consistency and forecasting accur...

  4. INCORPORATING CATASTROPHES INTO INTEGRATED ASSESSMENT: SCIENCE, IMPACTS, AND ADAPTATION

    EPA Science Inventory

    Incorporating potential catastrophic consequences into integrated assessment models of climate change has been a top priority of policymakers and modelers alike. We review the current state of scientific understanding regarding three frequently mentioned geophysical catastrophes,...

  5. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
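At its smallest scale, the kind of Bayesian network inference the abstract describes reduces to Bayes' rule applied over a conditional probability table. A two-node toy sketch, with entirely hypothetical probabilities (nothing here reflects the laboratory's actual models or data):

```python
# Prior that a state has an active program, and a conditional probability
# table P(indicator observed | program). All numbers are invented.
p_program = 0.10
p_indicator_given_program = {True: 0.70, False: 0.05}

def posterior_program():
    """P(program | indicator observed), by enumerating both hypotheses."""
    num = p_program * p_indicator_given_program[True]
    den = num + (1 - p_program) * p_indicator_given_program[False]
    return num / den

print(round(posterior_program(), 3))  # 0.07 / 0.115 ≈ 0.609
```

A full network chains many such tables, so observing several weak indicators can jointly shift the state-level assessment far more than any single one.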

  6. System for assessing Aviation's Global Emissions (SAGE), part 1 : model description and inventory results

    DOT National Transportation Integrated Search

    2007-07-01

    In early 2001, the US Federal Aviation Administration embarked on a multi-year effort to develop a new computer model, the System for assessing Aviation's Global Emissions (SAGE). Currently at Version 1.5, the basic use of the model has centered on t...

  7. Preclinical QSP Modeling in the Pharmaceutical Industry: An IQ Consortium Survey Examining the Current Landscape

    PubMed Central

    Wu, Fan; Bansal, Loveleena; Bradshaw‐Pierce, Erica; Chan, Jason R.; Liederer, Bianca M.; Mettetal, Jerome T.; Schroeder, Patricia; Schuck, Edgar; Tsai, Alice; Xu, Christine; Chimalakonda, Anjaneya; Le, Kha; Penney, Mark; Topp, Brian; Yamada, Akihiro

    2018-01-01

    A cross‐industry survey was conducted to assess the landscape of preclinical quantitative systems pharmacology (QSP) modeling within pharmaceutical companies. This article presents the survey results, which provide insights on the current state of preclinical QSP modeling in addition to future opportunities. Our results call attention to the need for an aligned definition and consistent terminology around QSP, yet highlight the broad applicability and benefits preclinical QSP modeling is currently delivering. PMID:29349875

  8. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    ERIC Educational Resources Information Center

    Weiss, Brandi A.; Dardick, William

    2016-01-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
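The core idea, entropy of predicted class probabilities as a measure of how "fuzzy" a logistic regression's classifications are, can be sketched in a few lines. This is a minimal illustration of the general entropy idea, not the specific measure Weiss and Dardick propose:

```python
import numpy as np

def classification_entropy(p, eps=1e-12):
    """Mean Shannon entropy of predicted probabilities of class 1.

    Near 0: confident (crisp) classifications; near log(2) ≈ 0.693 for
    two classes: maximally fuzzy ones.
    """
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    # Entropy of each Bernoulli prediction, then average over cases
    h = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return float(h.mean())

# A confident model vs. an uncertain model on five cases each
print(classification_entropy([0.95, 0.02, 0.99, 0.01, 0.9]))   # low entropy
print(classification_entropy([0.55, 0.48, 0.51, 0.5, 0.45]))   # near log(2)
```

Low entropy indicates the fitted model separates the classes cleanly; entropy near the maximum flags a model whose predictions hover around 0.5 despite acceptable global fit statistics.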

  9. Model Calculations with Excited Nuclear Fragmentations and Implications of Current GCR Spectra

    NASA Astrophysics Data System (ADS)

    Saganti, Premkumar

As a result of the fragmentation process in nuclei, energy from the excited states may also contribute to radiation damage to cell structure. Radiation-induced damage to the human body from the excited states of oxygen and several other nuclei and their fragments is of concern in the context of the measured abundances of the current galactic cosmic ray (GCR) environment. Nuclear shell model based calculations of the Selective-Core (Saganti-Cucinotta) approach, being expanded for O-16 fragmentation into N-15 with a proton knockout and O-15 with a neutron knockout, are very promising. In our ongoing expansion of these nuclear fragmentation model calculations and assessments, we present some of the prominent nuclei interactions from a total of 190 isotopes identified for the current model expansion, based on the Quantum Multiple Scattering Fragmentation Model (QMSFRG) of Cucinotta. Radiation transport model calculations implementing these energy-level spectral characteristics are expected to enhance the understanding of radiation damage at the cellular level. Implications of these excited-energy spectral calculations for the assessment of radiation damage to the human body may provide enhanced understanding of space radiation risk.

  10. Mental Models of Elementary and Middle School Students in Analyzing Simple Battery and Bulb Circuits

    ERIC Educational Resources Information Center

    Jabot, Michael; Henry, David

    2007-01-01

    Written assessment items were developed to probe students' understanding of a variety of direct current (DC) resistive electric circuit concepts. The items were used to explore the mental models that grade 3-8 students use in explaining the direction of electric current and how electric current is affected by different configurations of simple…

  11. A simple simulation model as a tool to assess alternative health care provider payment reform options in Vietnam.

    PubMed

    Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi

    2015-01-01

    Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based, micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level with less than half at the district level. There is also a high degree of financial risk on district hospitals with the current fund-holding arrangement. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk currently borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
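The spreadsheet-style micro-simulation the abstract describes can be sketched as a comparison of provider revenues under alternative payment rules. The providers, fees, and capitation rate below are invented for illustration and do not reflect Vietnam PSS data:

```python
# Hypothetical providers with baseline utilization and enrollment figures
providers = {
    "provincial_hospital": {"visits": 40_000, "fee_per_visit": 12.0, "enrollees": 100_000},
    "district_hospital":   {"visits": 60_000, "fee_per_visit": 5.0,  "enrollees": 300_000},
}

def revenue_fee_for_service(p):
    """Revenue when each visit is reimbursed at the provider's fee."""
    return p["visits"] * p["fee_per_visit"]

def revenue_capitation(p, rate_per_enrollee=1.5):
    """Revenue when payment is a flat rate per enrolled person."""
    return p["enrollees"] * rate_per_enrollee

for name, p in providers.items():
    print(name, revenue_fee_for_service(p), revenue_capitation(p))
```

Even this toy version shows the policy-relevant behavior: switching rules reallocates spending between provincial and district levels without changing the payer's total outlay formulaically, which is the kind of scenario comparison the study's model supports.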

  12. The atmospheric effects of stratospheric aircraft: A current consensus

    NASA Technical Reports Server (NTRS)

    Douglass, A. R.; Carroll, M. A.; Demore, W. B.; Holton, J. R.; Isaksen, I. S. A.; Johnston, H. S.; Ko, M. K. W.

    1991-01-01

In the early 1970's, a fleet of supersonic aircraft flying in the lower stratosphere was proposed. A large fleet was never built for economic, political, and environmental reasons. Technological improvements may make it economically feasible to develop supersonic aircraft for current markets. Some key results of earlier scientific programs designed to assess the impact of aircraft emissions on stratospheric ozone are reviewed, and factors that must be considered to assess the environmental impact of aircraft exhaust are discussed. These include the amount of nitrogen oxides injected into the stratosphere, horizontal transport, and stratosphere/troposphere exchange; assessment models are presented. Areas in which improvements in scientific understanding and model representation must be made to reduce the uncertainty in model calculations are identified.

  13. An Exploratory Study Examining Current Assessment Supervisory Practices in Professional Psychology.

    PubMed

    Iwanicki, Sierra; Peterson, Catherine

    2017-01-01

    The extant literature reveals a considerable amount of research examining course work or technical training in psychological assessment, but a dearth of empirical research on assessment supervision. This study examined perspectives on current assessment supervisory practices in professional psychology through an online survey. Descriptive and qualitative data were collected from 125 survey respondents who were members of assessment-focused professional organizations and who had at least 1 year of supervision experience. Responses indicated a general recognition of the need for formal training in assessment supervision, ongoing training opportunities, and adherence to supervision competencies. Responses indicated more common use of developmental and skill-based models, although most did not regard any one model of assessment supervision as superior. Despite the recommended use of a supervision contract, only 65.6% (n = 80) of respondents use one. Discussion, directed readings, modeling, role-play, and case presentations were the most common supervisory interventions. Although conclusions are constrained by low survey response rate, results yielded rich data that might guide future examination of multiple perspectives on assessment supervision and ultimately contribute to curriculum advances and the development of supervision "best practices."

  14. USE OF BIOLOGICALLY BASED COMPUTATIONAL MODELING IN MODE OF ACTION-BASED RISK ASSESSMENT – AN EXAMPLE OF CHLOROFORM

    EPA Science Inventory

    The objective of current work is to develop a new cancer dose-response assessment for chloroform using a physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model. The PBPK/PD model is based on a mode of action in which the cytolethality of chloroform occurs when the ...

  15. On Applications of Rasch Models in International Comparative Large-Scale Assessments: A Historical Review

    ERIC Educational Resources Information Center

    Wendt, Heike; Bos, Wilfried; Goy, Martin

    2011-01-01

    Several current international comparative large-scale assessments of educational achievement (ICLSA) make use of "Rasch models", to address functions essential for valid cross-cultural comparisons. From a historical perspective, ICLSA and Georg Rasch's "models for measurement" emerged at about the same time, half a century ago. However, the…

  16. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  17. COST VS. QUALITY IN DEMOGRAPHIC MODELLING: WHEN IS A VITAL RATE GOOD ENOUGH?

    EPA Science Inventory

    This presentation will focus on the assessment of quality for demographic parameters to be used in population-level risk assessment. Current population models can handle genetic, demographic, and environmental stochasticity, density dependence, and multiple stressors. However, cu...

  18. The Research and Evaluation of Road Environment in the Block of City Based on 3-D Streetscape Data

    NASA Astrophysics Data System (ADS)

    Guan, L.; Ding, Y.; Ge, J.; Yang, H.; Feng, X.; Chen, P.

    2018-04-01

    This paper focuses on the problem of assessing the street environment at the block-unit level. After clarifying the acquisition mode and characteristics of 3D streetscape data, the paper designs an assessment model for regional block units based on these data. 3D streetscape data, acquired with oblique photogrammetry and mobile survey equipment, greatly improve the efficiency and accuracy of urban regional assessment and expand its scope. Combining the latest urban regional assessment model with a street environment assessment model of the current situation, this paper analyzes street form and the current street environment in a typical area of Beijing. Through block-unit street environment assessment, we found that in megacities an assessment model based on 3D streetscape data substantially improves assessment efficiency and accuracy. At the same time, problems with motor vehicle lanes, insufficient green shade, broken railings, and degraded streets remain serious in Beijing, and improving the block-unit street environment is still a heavy task. The research results will provide data support for fine-grained urban management and urban design, and a solid foundation for improving the city's image.

  19. Modeling approaches for characterizing and evaluating environmental exposure to engineered nanomaterials in support of risk-based decision making.

    PubMed

    Hendren, Christine Ogilvie; Lowry, Michael; Grieger, Khara D; Money, Eric S; Johnston, John M; Wiesner, Mark R; Beaulieu, Stephen M

    2013-02-05

    As the use of engineered nanomaterials becomes more prevalent, the likelihood of unintended exposure to these materials also increases. Given the current scarcity of experimental data regarding fate, transport, and bioavailability, determining potential environmental exposure to these materials requires an in depth analysis of modeling techniques that can be used in both the near- and long-term. Here, we provide a critical review of traditional and emerging exposure modeling approaches to highlight the challenges that scientists and decision-makers face when developing environmental exposure and risk assessments for nanomaterials. We find that accounting for nanospecific properties, overcoming data gaps, realizing model limitations, and handling uncertainty are key to developing informative and reliable environmental exposure and risk assessments for engineered nanomaterials. We find methods suited to recognizing and addressing significant uncertainty to be most appropriate for near-term environmental exposure modeling, given the current state of information and the current insufficiency of established deterministic models to address environmental exposure to engineered nanomaterials.

  20. Comparative Validity of the Shedler and Westen Assessment Procedure-200

    ERIC Educational Resources Information Center

    Mullins-Sweatt, Stephanie N.; Widiger, Thomas A.

    2008-01-01

    A predominant dimensional model of general personality structure is the five-factor model (FFM). Quite a number of alternative instruments have been developed to assess the domains of the FFM. The current study compares the validity of 2 alternative versions of the Shedler and Westen Assessment Procedure (SWAP-200) FFM scales, 1 that was developed…

  1. Modeling and Dynamic Analysis of Paralleled dc/dc Converters With Master-Slave Current Sharing Control

    NASA Technical Reports Server (NTRS)

    Rajagopalan, J.; Xing, K.; Guo, Y.; Lee, F. C.; Manners, Bruce

    1996-01-01

    A simple, application-oriented, transfer function model of paralleled converters employing Master-Slave Current-sharing (MSC) control is developed. Dynamically, the Master converter retains its original design characteristics; all the Slave converters are forced to depart significantly from their original design characteristics into current-controlled current sources. Five distinct loop gains to assess system stability and performance are identified and their physical significance is described. A design methodology for the current share compensator is presented. The effect of this current sharing scheme on 'system output impedance' is analyzed.

  2. Animal behavioral assessments in current research of Parkinson's disease.

    PubMed

    Asakawa, Tetsuya; Fang, Huan; Sugiyama, Kenji; Nozaki, Takao; Hong, Zhen; Yang, Yilin; Hua, Fei; Ding, Guanghong; Chao, Dongman; Fenoy, Albert J; Villarreal, Sebastian J; Onoe, Hirotaka; Suzuki, Katsuaki; Mori, Norio; Namba, Hiroki; Xia, Ying

    2016-06-01

    Parkinson's disease (PD), a neurodegenerative disorder, is traditionally classified as a movement disorder, and patients typically suffer from many motor dysfunctions. Clinicians and scientists now recognize that many non-motor symptoms are also associated with PD, and there is increasing interest in both motor and non-motor symptoms in clinical studies of PD patients and in laboratory research on animal models that imitate the pathophysiologic features and symptoms of PD. Appropriate behavioral assessments are therefore crucial for correctly understanding the mechanisms of PD and accurately evaluating the efficacy and safety of novel therapies. This article systematically reviews the behavioral assessments, for both motor and non-motor symptoms, in the various animal models used in current PD research. We address the strengths and weaknesses of these behavioral tests and their appropriate applications, discuss potential mechanisms behind them, and caution readers against potential experimental bias. Since most of the behavioral assessments currently used for non-motor symptoms were not specifically designed for animals with PD, it is of the utmost importance to improve experimental design and evaluation in PD research with animal models; in particular, it is essential to develop assessments for non-motor symptoms based on the characteristics of PD animals. We conclude with a prospective view of behavioral assessment using real-time evaluation via mobile internet and wearable devices in future PD research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Real Patient and its Virtual Twin: Application of Quantitative Systems Toxicology Modelling in the Cardiac Safety Assessment of Citalopram.

    PubMed

    Patel, Nikunjkumar; Wiśniowska, Barbara; Jamei, Masoud; Polak, Sebastian

    2017-11-27

    A quantitative systems toxicology (QST) model for citalopram was established to simulate, in silico, a 'virtual twin' of a real patient to predict the occurrence of cardiotoxic events previously reported in patients under various clinical conditions. The QST model considers the effects of citalopram and its most notable electrophysiologically active primary (desmethylcitalopram) and secondary (didesmethylcitalopram) metabolites, on cardiac electrophysiology. The in vitro cardiac ion channel current inhibition data was coupled with the biophysically detailed model of human cardiac electrophysiology to investigate the impact of (i) the inhibition of multiple ion currents (I Kr , I Ks , I CaL ); (ii) the inclusion of metabolites in the QST model; and (iii) unbound or total plasma as the operating drug concentration, in predicting clinically observed QT prolongation. The inclusion of multiple ion channel current inhibition and metabolites in the simulation with unbound plasma citalopram concentration provided the lowest prediction error. The predictive performance of the model was verified with three additional therapeutic and supra-therapeutic drug exposure clinical cases. The results indicate that considering only the hERG ion channel inhibition of only the parent drug is potentially misleading, and the inclusion of active metabolite data and the influence of other ion channel currents should be considered to improve the prediction of potential cardiac toxicity. Mechanistic modelling can help bridge the gaps existing in the quantitative translation from preclinical cardiac safety assessment to clinical toxicology. Moreover, this study shows that the QST models, in combination with appropriate drug and systems parameters, can pave the way towards personalised safety assessment.
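    The multi-channel inhibition described in this abstract is commonly driven by Hill-type concentration-response curves. A minimal sketch of that idea (the function name and parameters are illustrative assumptions, not the paper's QST implementation):

```python
def fraction_blocked(conc, ic50, hill=1.0):
    """Fraction of an ion channel current inhibited at free drug
    concentration `conc`, using the standard Hill inhibition model."""
    return conc ** hill / (conc ** hill + ic50 ** hill)

# At conc == IC50 the current is 50% blocked, regardless of Hill coefficient.
half_block = fraction_blocked(1.0, 1.0)
```

    In a QST setting, one such curve per channel (IKr, IKs, ICaL) for the parent drug and each active metabolite would feed the cardiac electrophysiology model.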

  4. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    NASA Astrophysics Data System (ADS)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Aleluia Reis, Lara; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    We present a model comparison study that combines multiple integrated assessment models with a reduced-form global air quality model to assess the potential co-benefits of global climate mitigation policies in relation to the World Health Organization (WHO) goals on air quality and health. We include in our assessment a range of alternative assumptions on the implementation of current and planned pollution control policies. The resulting air pollution emission ranges significantly extend those in the Representative Concentration Pathways. Climate mitigation policies complement current efforts on air pollution control through technology and fuel transformations in the energy system. A combination of stringent policies on air pollution control and climate change mitigation results in 40% of the global population exposed to PM levels below the WHO air quality guideline, with the largest improvements estimated for India, China, and the Middle East. Our results stress the importance of integrated multisector policy approaches to achieve the Sustainable Development Goals.

  5. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    EPA Pesticide Factsheets

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  6. National Assessment of Writing: Useless and Uninteresting?

    ERIC Educational Resources Information Center

    Maxwell, John C.

    1973-01-01

    Points out flaws in the current National Assessment of Writing model and its results, but concludes that the National Assessment is a step in the right direction. (RB) Aspect of National Assessment (NAEP) dealt with in this document: Procedures (Exercise Development).

  7. ERRORS IN APPLYING LOW IONIC-STRENGTH ACTIVITY COEFFICIENT ALGORITHMS TO HIGHER IONIC-STRENGTH AQUATIC MEDIA

    EPA Science Inventory

    The toxicological and regulatory communities are currently exploring the use of the free-ion-activity (FIA) model both alone and in conjunction with the biotic ligand model (BLM) as a means of reducing uncertainties in current methods for assessing metals bioavailability from aqu...

  8. Using Item Response Theory to Conduct a Distracter Analysis on Conceptual Inventory of Natural Selection

    ERIC Educational Resources Information Center

    Battisti, Bryce Thomas; Hanegan, Nikki; Sudweeks, Richard; Cates, Rex

    2010-01-01

    Concept inventories are often used to assess current student understanding although conceptual change models are problematic. Due to controversies with conceptual change models and the realities of student assessment, it is important that concept inventories are evaluated using a variety of theoretical models to improve quality. This study used a…

  9. PBPK and population modelling to interpret urine cadmium concentrations of the French population

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Béchaux, Camille, E-mail: Camille.bechaux@anses.fr; Bodin, Laurent; Clémençon, Stéphan

    As cadmium accumulates mainly in the kidney, urinary concentrations are considered relevant data for assessing the risk related to cadmium. The French Nutrition and Health Survey (ENNS) recorded the concentration of cadmium in the urine of the French population. However, as with all biomonitoring data, these measurements need to be linked to external exposure before they can be interpreted in terms of sources of exposure and used for risk management. The objective of this work is thus to interpret the cadmium biomonitoring data of the French population in terms of dietary and cigarette smoke exposures. Dietary and smoking habits recorded in the ENNS study were combined with contamination levels in food and cigarettes to assess individual exposures. A PBPK model was used in a Bayesian population model to link this external exposure with the measured urinary concentrations. In this model, the level of past exposure was corrected by a scaling function that accounts for a trend in French dietary exposure. The resulting model was able to explain the current urinary concentrations measured in the French population through current and past exposure levels. Risk related to cadmium exposure in the general French population was then assessed from external and internal critical values corresponding to kidney effects. The model was also applied to predict the possible urinary concentrations of the French population in 2030, assuming no further changes in exposure levels; this scenario leads to significantly lower concentrations and consequently lower related risk. - Highlights: • Interpretation of urine cadmium concentrations in France • PBPK and Bayesian population modelling of cadmium exposure • Assessment of the historic time-trend of the cadmium exposure in France • Risk assessment from current and future external and internal exposure.
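    The paper's PBPK/Bayesian machinery is far richer, but the underlying accumulation idea (a slowly eliminated toxicant approaching a steady-state body burden over decades) can be sketched with a one-compartment model. All names and numbers here are illustrative assumptions, not the study's parameters:

```python
import math

def body_burden(daily_intake_ug, half_life_years, years):
    """One-compartment accumulation sketch: with first-order elimination
    rate k, the burden approaches intake/k as exposure duration grows."""
    k = math.log(2) / (half_life_years * 365.0)   # elimination rate, per day
    days = years * 365.0
    return daily_intake_ug / k * (1.0 - math.exp(-k * days))
```

    With cadmium's multi-decade half-life, such a model makes urinary levels reflect decades of past intake, which is why the authors needed a scaling function for the historical exposure trend.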

  10. A framework for global river flood risk assessments

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2012-08-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
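    The framework's final step, combining hazard probability distributions with impact estimates into an annual expected damage indicator, can be sketched as a trapezoidal integration of damage over exceedance probability. This is a generic sketch with hypothetical numbers, not the PCR-GLOBWB/DynRout implementation:

```python
def expected_annual_damage(return_periods, damages):
    """Approximate expected annual damage (EAD) by trapezoidal integration
    of damage over annual exceedance probability p = 1/T."""
    probs = [1.0 / t for t in return_periods]   # exceedance probabilities
    pairs = sorted(zip(probs, damages))         # ascending probability
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical damages (million USD) for 10-, 50-, and 100-year floods:
ead = expected_annual_damage([10, 50, 100], [5.0, 40.0, 80.0])
```

    In the full framework, the same integration would be carried out per 1 km2 grid cell, separately for damage, affected GDP, and affected population.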

  11. Assessing habitat risk from human activities to inform coastal and marine spatial planning: a demonstration in Belize

    NASA Astrophysics Data System (ADS)

    Arkema, Katie K.; Verutes, Gregory; Bernhardt, Joanna R.; Clarke, Chantalle; Rosado, Samir; Canto, Maritza; Wood, Spencer A.; Ruckelshaus, Mary; Rosenthal, Amy; McField, Melanie; de Zegher, Joann

    2014-11-01

    Integrated coastal and ocean management requires transparent and accessible approaches for understanding the influence of human activities on marine environments. Here we introduce a model for assessing the combined risk to habitats from multiple ocean uses. We apply the model to coral reefs, mangrove forests and seagrass beds in Belize to inform the design of the country’s first Integrated Coastal Zone Management (ICZM) Plan. Based on extensive stakeholder engagement, review of existing legislation and data collected from diverse sources, we map the current distribution of coastal and ocean activities and develop three scenarios for zoning these activities in the future. We then estimate ecosystem risk under the current and three future scenarios. Current levels of risk vary spatially among the nine coastal planning regions in Belize. Empirical tests of the model are strong—three-quarters of the measured data for coral reef health lie within the 95% confidence interval of interpolated model data and 79% of the predicted mangrove occurrences are associated with observed responses. The future scenario that harmonizes conservation and development goals results in a 20% reduction in the area of high-risk habitat compared to the current scenario, while increasing the extent of several ocean uses. Our results are a component of the ICZM Plan for Belize that will undergo review by the national legislature in 2015. This application of our model to marine spatial planning in Belize illustrates an approach that can be used broadly by coastal and ocean planners to assess risk to habitats under current and future management scenarios.

  12. NATIONAL GEODATABASE OF TIDAL STREAM POWER RESOURCE IN USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T; Neary, Vincent S; Stewart, Kevin M

    2012-01-01

    A geodatabase of tidal constituents is developed to present the regional assessment of tidal stream power resource in the USA. Tidal currents are numerically modeled with the Regional Ocean Modeling System (ROMS) and calibrated with the available measurements of tidal current speeds and water level surfaces. The performance of the numerical model in predicting tidal currents and water levels is assessed by an independent validation. The geodatabase is published in the public domain via a spatial database engine with interactive tools to select, query, and download the data. Regions with a maximum average kinetic power density exceeding 500 W/m2 (corresponding to a current speed of ~1 m/s), a total surface area larger than 0.5 km2, and a depth greater than 5 m are defined as hotspots and documented. The regional assessment indicates that the state of Alaska (AK) has the largest number of locations with considerably high kinetic power density, followed by Maine (ME), Washington (WA), Oregon (OR), California (CA), New Hampshire (NH), Massachusetts (MA), New York (NY), New Jersey (NJ), North and South Carolina (NC, SC), Georgia (GA), and Florida (FL).
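    The hotspot threshold quoted above follows directly from the kinetic power density relation P = ½ρv³. A minimal check (assuming a nominal seawater density of 1025 kg/m³):

```python
def kinetic_power_density(speed_m_s, rho=1025.0):
    """Kinetic power density of a current in W/m^2.

    rho: fluid density in kg/m^3 (nominal seawater value assumed).
    """
    return 0.5 * rho * speed_m_s ** 3

# A current speed of ~1 m/s gives roughly 500 W/m^2, matching the
# hotspot criterion stated in the abstract.
p_at_1ms = kinetic_power_density(1.0)  # ~512.5 W/m^2
```

    The cubic dependence on speed is why small calibration errors in modeled current speed translate into large errors in the resource estimate.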

  13. Finite geometry effects of field-aligned currents

    NASA Technical Reports Server (NTRS)

    Fung, Shing F.; Hoffman, R. A.

    1992-01-01

    Results are presented of model calculations of the magnetic field produced by finite current regions that would be measured by a spaceborne magnetometer. Conditions were examined under which the infinite current sheet approximation can be applied to the calculation of the field-aligned current (FAC) density, using satellite magnetometer data. The accuracy of the three methods used for calculating the current sheet normal direction with respect to the spacecraft trajectory was assessed. It is shown that the model can be used to obtain the position and the orientation of the spacecraft trajectory through the FAC region.

  14. A 20-Year High-Resolution Wave Resource Assessment of Japan with Wave-Current Interactions

    NASA Astrophysics Data System (ADS)

    Webb, A.; Waseda, T.; Kiyomatsu, K.

    2016-02-01

    Energy harvested from surface ocean waves and tidal currents has the potential to be a significant source of green energy, particularly for countries with extensive coastlines such as Japan. As part of a larger marine renewable energy project*, The University of Tokyo (in cooperation with JAMSTEC) has conducted a state-of-the-art wave resource assessment (with uncertainty estimates) to assist with wave generator site identification and construction in Japan. This assessment will be publicly available and is based on a large-scale NOAA WAVEWATCH III (version 4.18) simulation using NCEP and JAMSTEC forcings. It includes several key components to improve model skill: a 20-year simulation to reduce aleatory uncertainty, a four-nested-layer approach to resolve a 1 km shoreline, and finite-depth and current effects included in all wave power density calculations. This latter component is particularly important for regions near strong currents such as the Kuroshio. Here, we will analyze the different wave power density equations, discuss the model setup, and present results from the 20-year assessment (with a focus on the role of wave-current interactions). Time permitting, a comparison will also be made with simulations using JMA MSM 5 km winds. *New Energy and Industrial Technology Development Organization (NEDO): "Research on the Framework and Infrastructure of Marine Renewable Energy; an Energy Potential Assessment"

  15. Best Practices for Evaluating the Capability of Nondestructive Evaluation (NDE) and Structural Health Monitoring (SHM) Techniques for Damage Characterization (Post-Print)

    DTIC Science & Technology

    2016-02-10

    …a wide range of part, environmental, and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing … to assess the reliability of NDE and SHM characterization capability. … An eddy current crack sizing case study is presented to highlight examples of some of these complex characteristics of …

  16. Making to Measure? Reconsidering Assessment in Professional Continuing Education

    ERIC Educational Resources Information Center

    Fenwick, Tara

    2009-01-01

    Drawing on studies of teachers, accountants and pharmacists conducted in Canada, this essay examines models for assessing professional learning that currently enjoy widespread use in continuing education. These models include professional growth plans, self-administered tests and learning logs, and they are often used for regulatory as well as…

  17. An Integrated Ecological Modeling System for Assessing Impacts of Multiple Stressors on Stream and Riverine Ecosystem Services Within River Basins

    EPA Science Inventory

    We demonstrate a novel, spatially explicit assessment of the current condition of aquatic ecosystem services, with limited sensitivity analysis for the atmospheric contaminant mercury. The Integrated Ecological Modeling System (IEMS) forecasts water quality and quantity, habitat ...

  18. Modeling and Dynamic Analysis of Paralleled dc/dc Converters with Master-Slave Current Sharing Control

    NASA Technical Reports Server (NTRS)

    Rajagopalan, J.; Xing, K.; Guo, Y.; Lee, F. C.; Manners, Bruce

    1996-01-01

    A simple, application-oriented, transfer function model of paralleled converters employing Master-Slave Current-sharing (MSC) control is developed. Dynamically, the Master converter retains its original design characteristics; all the Slave converters are forced to depart significantly from their original design characteristics into current-controlled current sources. Five distinct loop gains to assess system stability and performance are identified and their physical significance is described. A design methodology for the current share compensator is presented. The effect of this current sharing scheme on 'system output impedance' is analyzed.

  19. FlexPepDock lessons from CAPRI peptide-protein rounds and suggested new criteria for assessment of model quality and utility.

    PubMed

    Marcu, Orly; Dodson, Emma-Joy; Alam, Nawsad; Sperber, Michal; Kozakov, Dima; Lensink, Marc F; Schueler-Furman, Ora

    2017-03-01

    CAPRI rounds 28 and 29 included, for the first time, peptide-receptor targets of three different systems, reflecting increased appreciation of the importance of peptide-protein interactions. The CAPRI rounds allowed us to objectively assess the performance of Rosetta FlexPepDock, one of the first protocols to explicitly include peptide flexibility in docking, accounting for peptide conformational changes upon binding. We discuss here successes and challenges in modeling these targets: we obtain top-performing, high-resolution models of the peptide motif for cases with known binding sites but there is a need for better modeling of flanking regions, as well as better selection criteria, in particular for unknown binding sites. These rounds have also provided us the opportunity to reassess the success criteria, to better reflect the quality of a peptide-protein complex model. Using all models submitted to CAPRI, we analyze the correlation between current classification criteria and the ability to retrieve critical interface features, such as hydrogen bonds and hotspots. We find that loosening the backbone (and ligand) RMSD threshold, together with a restriction on the side chain RMSD measure, allows us to improve the selection of high-accuracy models. We also suggest a new measure to assess interface hydrogen bond recovery, which is not assessed by the current CAPRI criteria. Finally, we find that surprisingly much can be learned from rather inaccurate models about binding hotspots, suggesting that the current status of peptide-protein docking methods, as reflected by the submitted CAPRI models, can already have a significant impact on our understanding of protein interactions. Proteins 2017; 85:445-462. © 2016 Wiley Periodicals, Inc.
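    The RMSD thresholds discussed above all rest on one simple quantity. A minimal sketch of coordinate RMSD (assuming the structures are already superposed; CAPRI's actual criteria distinguish backbone, ligand, and side-chain RMSD, which this sketch does not):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equally sized sets of
    (x, y, z) coordinates. No superposition/alignment is performed."""
    assert len(coords_a) == len(coords_b) and coords_a
    sq = sum((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2
             for (xa, ya, za), (xb, yb, zb) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))
```

    Restricting which atom subset (backbone vs. side chain) enters the sum is exactly how the different CAPRI measures are obtained from the same formula.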

  20. Assessing pesticide risks to threatened and endangered species using population models: Findings and recommendations from a CropLife America Science Forum.

    PubMed

    Forbes, V E; Brain, R; Edwards, D; Galic, N; Hall, T; Honegger, J; Meyer, C; Moore, D R J; Nacci, D; Pastorok, R; Preuss, T G; Railsback, S F; Salice, C; Sibly, R M; Tenhumberg, B; Thorbek, P; Wang, M

    2015-07-01

    This brief communication reports on the main findings and recommendations from the 2014 Science Forum organized by CropLife America. The aim of the Forum was to gain a better understanding of the current status of population models and how they could be used in ecological risk assessments for threatened and endangered species potentially exposed to pesticides in the United States. The Forum panelists' recommendations are intended to assist the relevant government agencies with implementation of population modeling in future endangered species risk assessments for pesticides. The Forum included keynote presentations that provided an overview of current practices, highlighted the findings of a recent National Academy of Sciences report and its implications, reviewed the main categories of existing population models and the types of risk expressions that can be produced as model outputs, and provided examples of how population models are currently being used in different legislative contexts. The panel concluded that models developed for listed species assessments should provide quantitative risk estimates, incorporate realistic variability in environmental and demographic factors, integrate complex patterns of exposure and effects, and use baseline conditions that include present factors that have caused the species to be listed (e.g., habitat loss, invasive species) or have resulted in positive management action. 
Furthermore, the panel advocates for the formation of a multipartite advisory committee to provide best available knowledge and guidance related to model implementation and use, to address such needs as more systematic collection, digitization, and dissemination of data for listed species; consideration of the newest developments in good modeling practice; comprehensive review of existing population models and their applicability for listed species assessments; and development of case studies using a few well-tested models for particular species to demonstrate proof of concept. To advance our common goals, the panel recommends the following as important areas for further research and development: quantitative analysis of the causes of species listings to guide model development; systematic assessment of the relative role of toxicity versus other factors in driving pesticide risk; additional study of how interactions between density dependence and pesticides influence risk; and development of pragmatic approaches to assessing indirect effects of pesticides on listed species. © 2015 SETAC.
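    Many of the population models discussed for listed-species assessments are stage-structured projection matrices whose dominant eigenvalue gives the asymptotic population growth rate. A generic power-iteration sketch (the matrix values below are hypothetical, not from any assessment):

```python
def dominant_eigenvalue(matrix, iters=200):
    """Estimate the dominant eigenvalue (asymptotic growth rate, lambda)
    of a non-negative projection matrix by power iteration."""
    n = len(matrix)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)       # infinity-norm normalization
        v = [x / lam for x in w]
    return lam

# Hypothetical 2-stage matrix: juvenile fecundity 0, adult fecundity 4,
# juvenile survival 0.5, adult survival 0.9.
leslie = [[0.0, 4.0],
          [0.5, 0.9]]
growth_rate = dominant_eigenvalue(leslie)  # lambda > 1 means growth
```

    In a pesticide risk context, one would compare lambda under baseline conditions against lambda with exposure-driven reductions in survival or fecundity.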

  1. How well do terrestrial biosphere models simulate coarse-scale runoff in the contiguous United States?

    Treesearch

    C.R. Schwalm; D.N. Huntzinger; R.B. Cook; Y. Wei; I.T. Baker; R.P. Neilson; B. Poulter; Peter Caldwell; G. Sun; H.Q. Tian; N. Zeng

    2015-01-01

    Significant changes in the water cycle are expected under current global environmental change. Robust assessment of present-day water cycle dynamics at continental to global scales is confounded by shortcomings in the observed record. Modeled assessments also yield conflicting results which are linked to differences in model structure and simulation protocol. Here we...

  2. Systems engineering approach to environmental risk management: A case study of depleted uranium at test area C-64, Eglin Air Force Base, Florida. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, C.M.; Fortmann, K.M.; Hill, S.W.

    1994-12-01

    Environmental restoration is an area of concern in an environmentally conscious world. Much effort is required to clean up the environment and promote environmentally sound methods for managing current land use. In light of the public consciousness of the latter topic, the United States Air Force must also take an active role in addressing these environmental issues with respect to current and future USAF base land use. This thesis uses the systems engineering technique to assess human health risks and to evaluate risk management options with respect to depleted uranium contamination in the sampled region of Test Area (TA) C-64 at Eglin Air Force Base (AFB). The research combines the disciplines of environmental data collection, DU soil concentration distribution modeling, ground water modeling, particle resuspension modeling, exposure assessment, health hazard assessment, and uncertainty analysis to characterize the test area. These disciplines are required to quantify current and future health risks, as well as to recommend cost-effective ways to increase confidence in health risk assessment and remediation options.

  3. Model-independent assessment of current direct searches for spin-dependent dark matter.

    PubMed

    Giuliani, F

    2004-10-15

    I evaluate the current results of spin-dependent weakly interacting massive particle searches within a model-independent framework, showing that the most restrictive limits to date derive from the combination of xenon and sodium iodide experiments. The extension of this analysis to the case of positive-signal experiments is elaborated.

  4. Expanding the "CBAL"™ Mathematics Assessments to Elementary Grades: The Development of a Competency Model and a Rational Number Learning Progression. Research Report. ETS RR-14-08

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Cayton-Hodges, Gabrielle

    2014-01-01

    Prior work on the "CBAL"™ mathematics competency model resulted in an initial competency model for middle school grades with several learning progressions (LPs) that elaborate central ideas in the competency model and provide a basis for connecting summative and formative assessment. In the current project, we created a competency model…

  5. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.

  6. Ecological conceptual models: a framework and case study on ecosystem management for South Florida sustainability

    USGS Publications Warehouse

    Gentile, J.H.; Harwell, M.A.; Cropper, W.; Harwell, C. C.; DeAngelis, Donald L.; Davis, S.; Ogden, J.C.; Lirman, D.

    2001-01-01

    The Everglades and South Florida ecosystems are the focus of national and international attention because of their current degraded and threatened state. Ecological risk assessment, sustainability and ecosystem and adaptive management principles and processes are being used nationally as a decision and policy framework for a variety of types of ecological assessments. The intent of this study is to demonstrate the application of these paradigms and principles at a regional scale. The effects-directed assessment approach used in this study consists of a retrospective, eco-epidemiological phase to determine the causes for the current conditions and a prospective predictive risk-based assessment using scenario analysis to evaluate future options. Embedded in these assessment phases is a process that begins with the identification of goals and societal preferences which are used to develop an integrated suite of risk-based and policy relevant conceptual models. Conceptual models are used to illustrate the linkages among management (societal) actions, environmental stressors, and societal/ecological effects, and provide the basis for developing and testing causal hypotheses. These models, developed for a variety of landscape units and their drivers, stressors, and endpoints, are used to formulate hypotheses to explain the current conditions. They are also used as the basis for structuring management scenarios and analyses to project the temporal and spatial magnitude of risk reduction and system recovery. Within the context of recovery, the conceptual models are used in the initial development of performance criteria for those stressors that are determined to be most important in shaping the landscape, and to guide the use of numerical models used to develop quantitative performance criteria in the scenario analysis. The results will be discussed within an ecosystem and adaptive management framework that provides the foundation for decision making.

  7. Ecological conceptual models: a framework and case study on ecosystem management for South Florida sustainability.

    PubMed

    Gentile, J H; Harwell, M A; Cropper, W; Harwell, C C; DeAngelis, D; Davis, S; Ogden, J C; Lirman, D

    2001-07-02

    The Everglades and South Florida ecosystems are the focus of national and international attention because of their current degraded and threatened state. Ecological risk assessment, sustainability, and ecosystem and adaptive management principles and processes are being used nationally as a decision and policy framework for a variety of types of ecological assessments. The intent of this study is to demonstrate the application of these paradigms and principles at a regional scale. The effects-directed assessment approach used in this study consists of a retrospective, eco-epidemiological phase to determine the causes for the current conditions and a prospective predictive risk-based assessment using scenario analysis to evaluate future options. Embedded in these assessment phases is a process that begins with the identification of goals and societal preferences which are used to develop an integrated suite of risk-based and policy relevant conceptual models. Conceptual models are used to illustrate the linkages among management (societal) actions, environmental stressors, and societal/ecological effects, and provide the basis for developing and testing causal hypotheses. These models, developed for a variety of landscape units and their drivers, stressors, and endpoints, are used to formulate hypotheses to explain the current conditions. They are also used as the basis for structuring management scenarios and analyses to project the temporal and spatial magnitude of risk reduction and system recovery. Within the context of recovery, the conceptual models are used in the initial development of performance criteria for those stressors that are determined to be most important in shaping the landscape, and to guide the use of numerical models used to develop quantitative performance criteria in the scenario analysis. The results will be discussed within an ecosystem and adaptive management framework that provides the foundation for decision making.

  8. A new assessment method of pHEMT models by comparing relative errors of drain current and its derivatives up to the third order

    NASA Astrophysics Data System (ADS)

    Dobeš, Josef; Grábner, Martin; Puričer, Pavel; Vejražka, František; Míchal, Jan; Popp, Jakub

    2017-05-01

    Nowadays, relatively precise pHEMT models are available for computer-aided design, and they are frequently compared to each other. However, such comparisons are mostly based on absolute errors of drain-current equations and their derivatives. In this paper, a novel method is suggested based on relative root-mean-square errors of both the drain current and its derivatives up to the third order. Moreover, the relative errors are subsequently relativized to the best model in each category to further clarify the obtained accuracies of both the drain current and its derivatives. Furthermore, one of our older models and two newly suggested ones are included in the comparison alongside the traditionally precise Ahmed, TOM-2, and Materka models. The assessment is performed using measured characteristics of a pHEMT operating up to 110 GHz. Finally, the usability of the proposed models, including the higher-order derivatives, is illustrated by s-parameter analysis and measurement at multiple operating points, as well as by computation and measurement of the IP3 points of a low-noise amplifier of a multi-constellation satellite navigation receiver with the ATF-54143 pHEMT.
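The ranking scheme described in the abstract (relative root-mean-square errors, then relativization to the best model in each category) can be sketched as follows; the quantity names and data layout are hypothetical, not taken from the paper:

```python
import numpy as np

def relative_rmse(measured, modeled):
    """Relative RMS error: RMS of (model - data) normalized by the RMS of
    the data, making errors of the current and of its derivatives comparable
    despite their very different magnitudes."""
    return np.sqrt(np.mean((modeled - measured) ** 2)) / np.sqrt(np.mean(measured ** 2))

def rank_models(measured_curves, model_curves):
    """measured_curves: dict mapping a quantity name (e.g. 'Id', 'gm', 'gm2',
    'gm3') to a measured array; model_curves: dict mapping a model name to
    such a dict. Each error is relativized to the best model in its category,
    so the best model scores exactly 1.0 there."""
    scores = {name: {q: relative_rmse(measured_curves[q], curves[q])
                     for q in measured_curves}
              for name, curves in model_curves.items()}
    for q in measured_curves:
        best = min(scores[m][q] for m in scores)
        for m in scores:
            scores[m][q] /= best
    return scores
```

The relativization step is what lets one table compare models across the current and all three derivative orders at once.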

  9. Algal Supply System Design - Harmonized Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abodeely, Jared; Stevens, Daniel; Ray, Allison

    2013-03-01

    The objective of this design report is to provide an assessment of current technologies used for production, dewatering, and converting microalgae cultivated in open-pond systems to biofuel. The original draft design was created in 2011 and has subsequently been brought into agreement with the DOE harmonized model. The design report extends beyond this harmonized model to discuss some of the challenges with assessing algal production systems, including the ability to (1) quickly assess alternative algal production system designs, (2) assess spatial and temporal variability, and (3) perform large-scale assessments considering multiple scenarios for thousands of potential sites. The Algae Logistics Model (ALM) was developed to address each of these limitations of current modeling efforts and to enable assessment of the economic feasibility of algal production systems across the United States. The ALM enables (1) dynamic assessments using spatiotemporal conditions, (2) exploration of algal production system design configurations, (3) investigation of algal production system operating assumptions, and (4) trade-off assessments with technology decisions and operating assumptions. The report discusses results from the ALM, which is used to assess the baseline design determined by harmonization efforts between U.S. DOE national laboratories. Productivity and resource assessment data are provided by coupling the ALM with the Biomass Assessment Tool developed at PNNL. This high-fidelity data is dynamically passed to the ALM and used to help better understand the impacts of spatial and temporal constraints on algal production systems by providing a cost for producing extracted algal lipids annually for each potential site.

  10. AIR QUALITY MODELING OF PM AND AIR TOXICS AT NEIGHBORHOOD SCALES

    EPA Science Inventory

    The current interest in fine particles and toxic pollutants provides an impetus for extending air quality modeling capability towards improving exposure modeling and assessments. Human exposure models require information on concentration derived from interpolation of observati...

  11. Risk stratification following acute myocardial infarction.

    PubMed

    Singh, Mandeep

    2007-07-01

    This article reviews the current risk assessment models available for patients presenting with myocardial infarction (MI). These practical tools enhance the health care provider's ability to rapidly and accurately assess patient risk from the event or revascularization therapy, and are of paramount importance in managing patients presenting with MI. This article highlights the models used for ST-elevation MI (STEMI) and non-ST elevation MI (NSTEMI) and provides an additional description of models used to assess risks after primary angioplasty (ie, angioplasty performed for STEMI).

  12. The SIETTE Automatic Assessment Environment

    ERIC Educational Resources Information Center

    Conejo, Ricardo; Guzmán, Eduardo; Trella, Monica

    2016-01-01

    This article describes the evolution and current state of the domain-independent Siette assessment environment. Siette supports different assessment methods--including classical test theory, item response theory, and computer adaptive testing--and integrates them with multidimensional student models used by intelligent educational systems.…

  13. Towards an Integrated Model for Developing Sustainable Assessment Skills

    ERIC Educational Resources Information Center

    Fastre, Greet M. J.; van der Klink, Marcel R.; Sluijsmans, Dominique; van Merrienboer, Jeroen J. G.

    2013-01-01

    One of the goals of current education is to ensure that graduates can act as independent lifelong learners. Graduates need to be able to assess their own learning and interpret assessment results. The central question in this article is how to acquire sustainable assessment skills, enabling students to assess their performance and learning…

  14. Integrated earth system dynamic modeling for life cycle impact assessment of ecosystem services.

    PubMed

    Arbault, Damien; Rivière, Mylène; Rugani, Benedetto; Benetto, Enrico; Tiruta-Barna, Ligia

    2014-02-15

    Despite the increasing awareness of our dependence on Ecosystem Services (ES), Life Cycle Impact Assessment (LCIA) does not explicitly and fully assess the damages caused by human activities on ES generation. Recent improvements in LCIA focus on specific cause-effect chains, mainly related to land use changes, leading to Characterization Factors (CFs) at the midpoint assessment level. However, despite the complexity and temporal dynamics of ES, current LCIA approaches consider the environmental mechanisms underneath ES to be independent from each other and devoid of dynamic character, leading to constant CFs whose representativeness is debatable. This paper takes a step forward and is aimed at demonstrating the feasibility of using an integrated earth system dynamic modeling perspective to retrieve time- and scenario-dependent CFs that consider the complex interlinkages between natural processes delivering ES. The GUMBO (Global Unified Metamodel of the Biosphere) model is used to quantify changes in ES production in physical terms - leading to midpoint CFs - and changes in human welfare indicators, which are considered here as endpoint CFs. The interpretation of the obtained results highlights the key methodological challenges to be solved to consider this approach as a robust alternative to the mainstream rationale currently adopted in LCIA. Further research should focus on increasing the granularity of environmental interventions in the modeling tools to match current standards in LCA and on adapting the conceptual approach to a spatially-explicit integrated model. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Dark Zones of Solid Propellant Flames: Critical Assessment and Quantitative Modeling of Experimental Datasets With Analysis of Chemical Pathways and Sensitivities

    DTIC Science & Technology

    2011-01-01

    This report provides an extensive, definitive review critically assessing our current understanding of dark-zone (DZ) structure and chemistry in solid propellant flames.

  16. EPA'S ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM: AVAILABILITY OF BROAD-SCALE ENVIRONMENTAL DATA AND OPPORTUNITIES FOR USE IN ENVIRONMENTAL MODELING APPLICATIONS

    EPA Science Inventory

    The Environmental Monitoring and Assessment Program (EMAP) has collected a suite of environmental data over a four-year period from estuarine systems in the mid-Atlantic and Gulf of Mexico. Data are currently available for secondary users, including environmental modelers. The data w...

  17. Generating Multiple Imputations for Matrix Sampling Data Analyzed with Item Response Models.

    ERIC Educational Resources Information Center

    Thomas, Neal; Gan, Nianci

    1997-01-01

    Describes and assesses missing data methods currently used to analyze data from matrix sampling designs implemented by the National Assessment of Educational Progress. Several improved methods are developed, and these models are evaluated using an EM algorithm to obtain maximum likelihood estimates followed by multiple imputation of complete data…

  18. Modelling consequences of change in biodiversity and ...

    EPA Pesticide Factsheets

    This chapter offers an assessment of the rapidly changing landscape of methods for assessing and forecasting the benefits that people receive from nature and how these benefits are shaped by institutions and various anthropogenic assets. There has been an explosion of activity in understanding and modeling the benefits that people receive from nature, and this explosion has provided a diversity of approaches that are both complementary and contradictory. However, there remain major gaps in what current models can do. They are not well suited to estimating most types of benefits at national, regional, or global scales. They are focused on decision analysis but have not focused on implementation, learning, or dialogue. This gap in particular means that current models are not well suited to bridging among multiple knowledge systems; however, initial efforts have been made towards this goal. Furthermore, while participatory social-ecological scenarios are able to bridge multiple knowledge systems in their assessment and analysis of multiple ecosystem services, the social-ecological scenarios community is fragmented and not well connected. Consequently, IPBES has an excellent knowledge base to build upon, but real investment in building a more integrated modeling and scenarios community of practice is needed to produce a more complete and useful toolbox of approaches to meet the needs of IPBES assessments and other assessments of nature's benefits. This Chapter describes

  19. Case Study of a Computer Based Examination System

    ERIC Educational Resources Information Center

    Fluck, Andrew; Pullen, Darren; Harper, Colleen

    2009-01-01

    Electronic supported assessment or e-Assessment is a field of growing importance, but it has yet to make a significant impact in the Australian higher education sector (Byrnes & Ellis, 2006). Current computer based assessment models focus on the assessment of knowledge rather than deeper understandings, using multiple choice type questions,…

  20. Medical Updates Number 5 to the International Space Station Probability Risk Assessment (PRA) Model Using the Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Doug; Bauman, David; Johnson-Throop, Kathy

    2011-01-01

    The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.

  1. A Bayesian Hierarchical Model for Large-Scale Educational Surveys: An Application to the National Assessment of Educational Progress. Research Report. ETS RR-04-38

    ERIC Educational Resources Information Center

    Johnson, Matthew S.; Jenkins, Frank

    2005-01-01

    Large-scale educational assessments such as the National Assessment of Educational Progress (NAEP) sample examinees to whom an exam will be administered. In most situations the sampling design is not a simple random sample and must be accounted for in the estimating model. After reviewing the current operational estimation procedure for NAEP, this…

  2. Predicting in-patient falls in a geriatric clinic: a clinical study combining assessment data and simple sensory gait measurements.

    PubMed

    Marschollek, M; Nemitz, G; Gietzelt, M; Wolf, K H; Meyer Zu Schwabedissen, H; Haux, R

    2009-08-01

    Falls are among the predominant causes for morbidity and mortality in elderly persons and occur most often in geriatric clinics. Despite several studies that have identified parameters associated with elderly patients' fall risk, prediction models -- e.g., based on geriatric assessment data -- are currently not used on a regular basis. Furthermore, technical aids to objectively assess mobility-associated parameters are currently not used. To assess group differences in clinical as well as common geriatric assessment data and sensory gait measurements between fallers and non-fallers in a geriatric sample, and to derive and compare two prediction models based on assessment data alone (model #1) and added sensory measurement data (model #2). For a sample of n=110 geriatric in-patients (81 women, 29 men) the following fall risk-associated assessments were performed: Timed 'Up & Go' (TUG) test, STRATIFY score and Barthel index. During the TUG test the subjects wore a triaxial accelerometer, and sensory gait parameters were extracted from the data recorded. Group differences between fallers (n=26) and non-fallers (n=84) were compared using Student's t-test. Two classification tree prediction models were computed and compared. Significant differences between the two groups were found for the following parameters: time to complete the TUG test, transfer item (Barthel), recent falls (STRATIFY), pelvic sway while walking and step length. Prediction model #1 (using common assessment data only) showed a sensitivity of 38.5% and a specificity of 97.6%, prediction model #2 (assessment data plus sensory gait parameters) performed with 57.7% and 100%, respectively. Significant differences between fallers and non-fallers among geriatric in-patients can be detected for several assessment subscores as well as parameters recorded by simple accelerometric measurements during a common mobility test. 
Existing geriatric assessment data may be used for falls prediction on a regular basis. Adding sensory data improves the specificity of our test markedly.
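As a rough illustration of how the reported sensitivity and specificity figures (e.g. 38.5% / 97.6% for model #1) are computed from a fitted classifier's predictions on labeled patients (the classification trees themselves are not reproduced here):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = share of actual fallers the model flags;
    specificity = share of non-fallers the model correctly clears.
    Labels: 1 = faller, 0 = non-faller."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

The asymmetry in the study's results (high specificity, modest sensitivity) means the trees rarely mislabel a non-faller but miss a substantial share of true fallers.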

  3. Operations Assessment of Launch Vehicle Architectures using Activity Based Cost Models

    NASA Technical Reports Server (NTRS)

    Ruiz-Torres, Alex J.; McCleskey, Carey

    2000-01-01

    The growing emphasis on affordability for space transportation systems requires the assessment of new space vehicles for all life cycle activities, from design and development, through manufacturing and operations. This paper addresses the operational assessment of launch vehicles, focusing on modeling the ground support requirements of a vehicle architecture, and estimating the resulting costs and flight rate. This paper proposes the use of Activity Based Costing (ABC) modeling for this assessment. The model uses expert knowledge to determine the activities, the activity times and the activity costs based on vehicle design characteristics. The approach provides several advantages to current approaches to vehicle architecture assessment including easier validation and allowing vehicle designers to understand the cost and cycle time drivers.
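A minimal sketch of the activity-based costing idea described above, rolling activity times and costs up from vehicle design characteristics; the activities, durations, rates, and the single complexity factor are entirely hypothetical:

```python
# Hypothetical ground-processing activity catalogue: durations (hours) and
# labor rates, to be scaled by design-driven factors supplied by experts.
ACTIVITIES = {
    "vehicle_inspection":  {"base_hours": 120.0, "rate_per_hour": 85.0},
    "propellant_loading":  {"base_hours": 36.0,  "rate_per_hour": 110.0},
    "payload_integration": {"base_hours": 200.0, "rate_per_hour": 95.0},
}

def per_flight_cost_and_time(complexity_factor=1.0):
    """Sum activity costs and times for one flight; complexity_factor > 1
    models a design with heavier servicing needs."""
    hours = sum(a["base_hours"] * complexity_factor for a in ACTIVITIES.values())
    cost = sum(a["base_hours"] * complexity_factor * a["rate_per_hour"]
               for a in ACTIVITIES.values())
    return cost, hours

def annual_flight_rate(complexity_factor=1.0, crew_hours_per_year=2000.0, crews=4):
    """Achievable flight rate follows from the ground cycle time and the
    available workforce hours."""
    _, hours = per_flight_cost_and_time(complexity_factor)
    return crews * crew_hours_per_year / hours
```

This is what makes the cost and cycle-time drivers visible to designers: each design change maps to specific activities rather than to an opaque top-level estimate.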

  4. A Review: The Current In Vivo Models for the Discovery and Utility of New Anti-leishmanial Drugs Targeting Cutaneous Leishmaniasis

    PubMed Central

    Mears, Emily Rose; Modabber, Farrokh; Don, Robert; Johnson, George E.

    2015-01-01

    The current in vivo models for the utility and discovery of new potential anti-leishmanial drugs targeting Cutaneous Leishmaniasis (CL) differ vastly in their immunological responses to the disease and clinical presentation of symptoms. Animal models that show similarities to the human form of CL after infection with Leishmania should be more representative as to the effect of the parasite within a human. Thus, these models are used to evaluate the efficacy of new anti-leishmanial compounds before human clinical trials. Current animal models aim to investigate (i) host–parasite interactions, (ii) pathogenesis, (iii) biochemical changes/pathways, (iv) in vivo maintenance of parasites, and (v) clinical evaluation of drug candidates. This review focuses on the trends of infection observed between Leishmania parasites, the predictability of different strains, and the determination of parasite load. These factors were used to investigate the overall effectiveness of the current animal models. The main aim was to assess the efficacy and limitations of the various CL models and their potential for drug discovery and evaluation. In conclusion, we found that the following models are the most suitable for the assessment of anti-leishmanial drugs: L. major–C57BL/6 mice (or–vervet monkey, or–rhesus monkeys), L. tropica–CsS-16 mice, L. amazonensis–CBA mice, L. braziliensis–golden hamster (or–rhesus monkey). We also provide in-depth guidance for which models are not suitable for these investigations. PMID:26334763

  5. A framework for global river flood risk assessments

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2013-05-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. 
Validation of modelled damage estimates was performed using observed damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
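One common way to collapse per-return-period damage estimates into the "annual expected damage" risk indicator mentioned above is to integrate damage over annual exceedance probability; the sketch below uses a simple trapezoidal rule and is not the paper's exact procedure:

```python
def expected_annual_damage(return_periods, damages):
    """Annual expected damage from damage estimates at a few return periods.
    return_periods: ascending list in years, e.g. [2, 10, 50, 100, 1000];
    damages: corresponding damage estimates (same currency/units).
    Integrates damage over annual exceedance probability (p = 1/T)."""
    probs = [1.0 / t for t in return_periods]  # exceedance probabilities
    ead = 0.0
    for i in range(len(probs) - 1):
        ead += 0.5 * (damages[i] + damages[i + 1]) * (probs[i] - probs[i + 1])
    return ead
```

The same integration applies to affected population or affected GDP; only the impact layer combined with the hazard maps changes.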

  6. Using Multigroup Confirmatory Factor Analysis to Test Measurement Invariance in Raters: A Clinical Skills Examination Application

    ERIC Educational Resources Information Center

    Kahraman, Nilufer; Brown, Crystal B.

    2015-01-01

    Psychometric models based on structural equation modeling framework are commonly used in many multiple-choice test settings to assess measurement invariance of test items across examinee subpopulations. The premise of the current article is that they may also be useful in the context of performance assessment tests to test measurement invariance…

  7. Governance of Higher Education in Britain: The Significance of the Research Assessment Exercises for the Funding Council Model

    ERIC Educational Resources Information Center

    Tapper, Ted; Salter, Brian

    2004-01-01

    This article uses the political struggles that have enveloped the research assessment exercises (RAEs) to interpret the UK's current funding council model of governance. Ironically, the apparently widespread improvement in the research performance of British universities, as demonstrated by RAE 2001, has made it more difficult to distribute…

  8. Bridging the gap between habitat-modeling research and bird conservation with dynamic landscape and population models

    Treesearch

    Frank R. Thompson, III

    2009-01-01

    Habitat models are widely used in bird conservation planning to assess current habitat or populations and to evaluate management alternatives. These models include species-habitat matrix or database models, habitat suitability models, and statistical models that predict abundance. While extremely useful, these approaches have some limitations.

  9. Drivers and rates of stock assessments in the United States

    PubMed Central

    Thorson, James T.; Melnychuk, Michael C.; Methot, Richard; Blackhart, Kristan

    2018-01-01

    Fisheries management is most effective when based on scientific estimates of sustainable fishing rates. While some simple approaches allow estimation of harvest limits, more data-intensive stock assessments are generally required to evaluate the stock’s biomass and fishing rates relative to sustainable levels. Here we evaluate how stock characteristics relate to the rate of new assessments in the United States. Using a statistical model based on time-to-event analysis and 569 coastal marine fish and invertebrate stocks landed in commercial fisheries, we quantify the impact of region, habitat, life-history, and economic factors on the annual probability of being assessed. Although the majority of landings come from assessed stocks in all regions, less than half of the regionally-landed species currently have been assessed. As expected, our time-to-event model identified landed tonnage and ex-vessel price as the dominant factors determining increased rates of new assessments. However, we also found that after controlling for landings and price, there has been a consistent bias towards assessing larger-bodied species. A number of vulnerable groups such as rockfishes (Scorpaeniformes) and groundsharks (Carcharhiniformes) have a relatively high annual probability of being assessed after controlling for their relatively small tonnage and low price. Due to relatively low landed tonnage and price of species that are currently unassessed, our model suggests that the number of assessed stocks will increase more slowly in future decades. PMID:29750789
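A discrete-time version of the time-to-event idea in this study can be sketched as a logistic annual hazard. The coefficient names and values below are illustrative only; the signs mirror the reported findings (higher landings, price, and body size raise the chance of a new assessment), but they are not the fitted estimates:

```python
import math

# Illustrative coefficients, not the study's estimates.
COEFS = {"intercept": -3.0, "landings": 0.4, "price": 0.3, "body_size": 0.2}

def annual_assessment_probability(log_landings, log_price, log_body_size,
                                  coefs=COEFS):
    """Discrete-time hazard: probability that a still-unassessed stock
    receives its first assessment in a given year (logistic link)."""
    eta = (coefs["intercept"]
           + coefs["landings"] * log_landings
           + coefs["price"] * log_price
           + coefs["body_size"] * log_body_size)
    return 1.0 / (1.0 + math.exp(-eta))

def prob_assessed_within(years, p_annual):
    """Probability of at least one assessment within `years`, assuming the
    annual hazard stays constant."""
    return 1.0 - (1.0 - p_annual) ** years
```

Under this formulation, low-tonnage, low-price stocks have small annual hazards, which is why the study expects the count of assessed stocks to grow slowly in future decades.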

  10. Spatio-temporal pattern clustering for skill assessment of the Korea Operational Oceanographic System

    NASA Astrophysics Data System (ADS)

    Kim, J.; Park, K.

    2016-12-01

    To evaluate the performance of operational forecast models in the Korea Operational Oceanographic System (KOOS), developed by the Korea Institute of Ocean Science and Technology (KIOST), a skill assessment (SA) tool has been developed that provides multiple skill metrics, including not only correlation and error statistics comparing predictions with observations but also pattern clustering across numerical models, satellite data, and in situ observations. KOOS produces 72-hour forecasts of atmospheric and hydrodynamic variables (wind, pressure, current, tide, wave, temperature, and salinity) twice daily by operating numerical models such as WRF, ROMS, MOM5, MOHID, WW-III, and SWAN, and the SA is used to evaluate these forecasts. Quantitative assessment of an operational ocean forecast model is essential for providing accurate forecast information, both to the general public and in support of ocean-related problems. In this work, we propose a pattern clustering method that uses machine learning and GIS-based spatial analytics to evaluate the spatial distributions of numerical model output against spatial observation data such as satellite and HF radar measurements. For the clustering, we use 10- to 15-year-long reanalysis datasets computed by KOOS, ECMWF, and HYCOM to build best-matching clusters, classified by physical meaning and time variation, which are then compared with forecast data. Moreover, to evaluate currents, we develop a method for extracting the dominant flow and apply it to hydrodynamic models and HF radar sea surface current data. Applying the pattern clustering method allows more accurate and effective assessment of ocean forecast model performance, because comparisons cover not only specific observation stations but also the spatio-temporal distribution of the whole model domain. We believe the proposed method will be very useful for examining and evaluating large amounts of numerical modeling data as well as satellite data.
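    As an illustration of the pattern-clustering idea, here is a minimal k-means sketch over flattened spatial fields (each field given as an equal-length list of grid values). This is a generic sketch, not the KOOS implementation, and the tiny example data are hypothetical:

```python
import random

def kmeans(fields, k, iters=50, seed=0):
    """Cluster flattened spatial fields into k representative patterns."""
    rng = random.Random(seed)
    centers = [list(f) for f in rng.sample(fields, k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for f in fields:
            # assign each field to its nearest center (squared Euclidean distance)
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(f, centers[i])))
            groups[j].append(f)
        for i, g in enumerate(groups):
            if g:  # recompute each center as the mean of its members
                centers[i] = [sum(col) / len(g) for col in zip(*g)]
    return centers

def nearest_cluster(field, centers):
    """Match a forecast field to the closest learned pattern."""
    return min(range(len(centers)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(field, centers[i])))
```

    In the workflow described above, the clusters would be learned from long reanalysis records, and each new forecast field would then be matched to its nearest cluster for comparison against similarly matched observations.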

  11. An Overview of NASA's Orbital Debris Engineering Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    This slide presentation reviews the importance of Orbital debris engineering models. They are mathematical tools to assess orbital debris flux. It briefly reviews the history of the orbital debris engineering models, and reviews the new features in the current model (i.e., ORDEM2010).

  12. The interpersonal relationship in clinical practice. The Barrett-Lennard Relationship Inventory as an assessment instrument.

    PubMed

    Simmons, J; Roberge, L; Kendrick, S B; Richards, B

    1995-03-01

    The biomedical model that has long been central to medical practice is gradually being expanded to a broader biopsychosocial model. Relationship-building skills commensurate with the new paradigm need to be understood by educators and taught to medical practitioners. The person-centered, or humanistic, model of psychologist Carl Rogers provides a theoretical approach for the development of effective biopsychosocial relationships. The Barrett-Lennard Relationship Inventory (BLRI) was developed in 1962 as an assessment instrument for the person-centered model. In this article, the person-centered model and the use of the BLRI as an assessment instrument of this model are discussed. Current and potential uses of the BLRI are explored.

  13. A robust and flexible Geospatial Modeling Interface (GMI) for environmental model deployment and evaluation

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the GMI (Geospatial Modeling Interface) simulation framework for environmental model deployment and assessment. GMI currently provides access to multiple environmental models including AgroEcoSystem-Watershed (AgES-W), Nitrate Leaching and Economic Analysis 2 (NLEA...

  14. Risk assessment of vector-borne diseases for public health governance.

    PubMed

    Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J

    2014-12-01

    In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis, and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified into individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), drawing on literature, legislation, and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In current practice, a series of different models and analyses are often applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  15. Near Earth Asteroid Characterization for Threat Assessment

    NASA Technical Reports Server (NTRS)

    Dotson, Jessie; Mathias, Donovan; Wheeler, Lorien; Wooden, Diane; Bryson, Kathryn; Ostrowski, Daniel

    2017-01-01

    Physical characteristics of NEAs are an essential input to modeling behavior during atmospheric entry and to assessing the risk of impact, but determining these properties requires a non-trivial investment of time and resources. The characteristics relevant to these models include size, density, strength, and ablation coefficient. Some of these characteristics cannot be directly measured, but rather must be inferred from related measurements of asteroids and/or meteorites. Furthermore, for the majority of NEAs only the basic measurements exist, so properties must often be inferred from statistics of the population of more completely characterized objects. The Asteroid Threat Assessment Project at NASA Ames Research Center has developed a probabilistic asteroid impact risk (PAIR) model to assess the risk of asteroid impact. Our PAIR model and its use to develop probability distributions of impact risk are discussed in other contributions to PDC 2017 (e.g., Mathias et al.). Here we use PAIR to investigate which NEA characteristics are important for assessing the impact threat, by examining how changes in these characteristics alter the damage predicted by PAIR. We will also provide an assessment of the current state of knowledge of the NEA characteristics of importance for asteroid threat assessment. The relative importance of different properties as identified using PAIR will be combined with our assessment of the current state of knowledge to identify potentially high-impact investigations. In addition, we will discuss an ongoing effort to collate the existing measurements of NEA properties of interest to the planetary defense community into a readily accessible database.

  16. Ferrets as Models for Influenza Virus Transmission Studies and Pandemic Risk Assessments

    PubMed Central

    Barclay, Wendy; Barr, Ian; Fouchier, Ron A.M.; Matsuyama, Ryota; Nishiura, Hiroshi; Peiris, Malik; Russell, Charles J.; Subbarao, Kanta; Zhu, Huachen

    2018-01-01

    The ferret transmission model is extensively used to assess the pandemic potential of emerging influenza viruses, yet experimental conditions and reported results vary among laboratories. Such variation can be a critical consideration when contextualizing results from independent risk-assessment studies of novel and emerging influenza viruses. To streamline interpretation of data generated in different laboratories, we provide a consensus on experimental parameters that define risk-assessment experiments of influenza virus transmissibility, including disclosure of variables known or suspected to contribute to experimental variability in this model, and advocate adoption of more standardized practices. We also discuss current limitations of the ferret transmission model and highlight continued refinements and advances to this model ongoing in laboratories. Understanding, disclosing, and standardizing the critical parameters of ferret transmission studies will improve the comparability and reproducibility of pandemic influenza risk assessment and increase the statistical power and, perhaps, accuracy of this model. PMID:29774862

  17. Proton effects on low noise and high responsivity silicon-based photodiodes for space environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedroza, Guillaume; Gilard, Olivier; Bourqui, Marie-Lise

    A series of proton irradiations has been carried out on p-n silicon photodiodes for the purpose of assessing the suitability of these devices for the European Galileo space mission. The irradiations were performed at energies of 60, 100, and 150 MeV with proton fluences ranging from 1.7×10^10 to 1×10^11 protons/cm^2. Dark current, spectral responsivity, and dark current noise were measured before and after each irradiation step. We observed an increase in dark current, dark current noise, and noise equivalent power, and a drop in spectral responsivity, with increasing displacement damage dose. An analytical model has been developed to investigate proton damage effects through the modeling of the electro-optical characteristics of the photodiode. The experimental degradations were successfully explained by taking into account the degradation of the minority carrier diffusion length in the N-region of the photodiode. The degradation model was then applied to assess the end-of-life performance of these devices in the framework of the Galileo mission.

  18. Collection of empirical data for assessing 800MHz coverage models

    DOT National Transportation Integrated Search

    2004-12-01

    Wireless communications plays an important role in KDOT operations. Currently, decisions pertaining to KDOT's 800MHz radio system are made on the basis of coverage models that rely on antenna and terrain characteristics to model the coverage. W...

  19. Relationships between Lexical Processing Speed, Language Skills, and Autistic Traits in Children

    ERIC Educational Resources Information Center

    Abrigo, Erin

    2012-01-01

    According to current models of spoken word recognition listeners understand speech as it unfolds over time. Eye tracking provides a non-invasive, on-line method to monitor attention, providing insight into the processing of spoken language. In the current project a spoken lexical processing assessment (LPA) confirmed current theories of spoken…

  20. Framework for modelling the cost-effectiveness of systemic interventions aimed to reduce youth delinquency.

    PubMed

    Schawo, Saskia J; van Eeren, Hester; Soeteman, Djira I; van der Veldt, Marie-Christine; Noom, Marc J; Brouwer, Werner; Busschbach, Jan J V; Hakkaart, Leona

    2012-12-01

    Many interventions initiated within and financed from the health care sector are not necessarily primarily aimed at improving health. This poses important questions regarding the operationalisation of economic evaluations in such contexts. We investigated whether assessing cost-effectiveness using state-of-the-art methods commonly applied in health care evaluations is feasible and meaningful when evaluating interventions aimed at reducing youth delinquency. A probabilistic Markov model was constructed to create a framework for the assessment of the cost-effectiveness of systemic interventions in delinquent youth. For illustrative purposes, Functional Family Therapy (FFT), a systemic intervention aimed at improving family functioning and, primarily, reducing delinquent activity in youths, was compared to Treatment as Usual (TAU). "Criminal activity free years" (CAFYs) were introduced as the central outcome measure. Criminal activity may be based on, for example, police contacts or committed crimes. In the absence of extensive data, and for illustrative purposes, the current study based criminal activity on the available literature on recidivism. Furthermore, a literature search was performed to deduce the model's structure and parameters. Common cost-effectiveness methodology could be applied to interventions for youth delinquency. Model characteristics and parameters were derived from literature and ongoing trial data. The model resulted in an estimate of incremental costs/CAFY and included long-term effects. Illustrative model results point towards dominance of FFT compared to TAU. Using a probabilistic model and the CAFY outcome measure to assess cost-effectiveness of systemic interventions aimed to reduce delinquency is feasible. However, the model structure is limited to three states and the CAFY measure was defined rather crudely. Moreover, as the model parameters were retrieved from the literature, the model results are illustrative in the absence of empirical data. 
The current model provides a framework to assess the cost-effectiveness of systemic interventions, while taking into account parameter uncertainty and long-term effectiveness. The framework of the model could be used to assess the cost-effectiveness of systemic interventions alongside (clinical) trial data. Consequently, it is suitable to inform reimbursement decisions, since the value for money of systemic interventions can be demonstrated using a decision analytic model. Future research could be focussed on testing the current model based on extensive empirical data, improving the outcome measure and finding appropriate values for that outcome.
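    The Markov/CAFY calculation described above can be sketched as a simple cohort model. This sketch is simplified to two states (delinquent, crime-free) rather than the study's three, and all transition probabilities and costs below are hypothetical illustrative numbers, not parameters from the paper:

```python
def expected_cafys(p_free_stay, p_delinq_to_free, years):
    """Expected criminal-activity-free years (CAFYs) over a horizon.

    Two-state Markov cohort: everyone starts delinquent; each year a
    crime-free youth stays free with p_free_stay, and a delinquent youth
    becomes free with p_delinq_to_free. CAFYs accrue for time spent free."""
    free, delinq = 0.0, 1.0
    cafys = 0.0
    for _ in range(years):
        free, delinq = (free * p_free_stay + delinq * p_delinq_to_free,
                        free * (1 - p_free_stay) + delinq * (1 - p_delinq_to_free))
        cafys += free
    return cafys

# Hypothetical inputs: FFT improves both transition probabilities but costs more.
cafy_fft, cost_fft = expected_cafys(0.90, 0.50, 10), 12000.0
cafy_tau, cost_tau = expected_cafys(0.85, 0.35, 10), 7000.0
icer = (cost_fft - cost_tau) / (cafy_fft - cafy_tau)  # incremental cost per CAFY
```

    The incremental cost-effectiveness ratio (ICER) per CAFY is then the quantity a decision maker would weigh against willingness to pay; a probabilistic version would additionally draw the transition probabilities from distributions to propagate parameter uncertainty.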

  1. Modeling and Simulation Verification, Validation and Accreditation (VV&A): A New Undertaking for the Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Prill, Mark E.

    2005-01-01

    The overall presentation was focused to provide, for the Verification, Validation, and Accreditation (VV&A) session audience, a snapshot review of the Exploration Systems Mission Directorate's (ESMD) investigation into implementation of a modeling and simulation (M&S) VV&A program. The presentation provides some legacy ESMD reference material, including information on the then-current organizational structure and the M&S (Simulation Based Acquisition (SBA)) focus contained therein, to provide a context for the proposed M&S VV&A approach. This reference material briefly highlights the SBA goals and objectives, and outlines FY05 M&S development and implementation consistent with the Subjective Assessment, Constructive Assessment, Operator-in-the-Loop Assessment, Hardware-in-the-Loop Assessment, and In-Service Operations Assessment M&S construct, the NASA Exploration Information Ontology Model (NExIOM) data model, and integration with the Windchill-based Integrated Collaborative Environment (ICE). The presentation then addresses the ESMD team's initial conclusions regarding an M&S VV&A program, summarizes the general VV&A implementation approach anticipated, and outlines some of the recognized VV&A program challenges, all within a broader context of the overarching Integrated Modeling and Simulation (IM&S) environment at both the ESMD and Agency (NASA) levels. The presentation concludes with a status on the current M&S organization's progress to date relative to the recommended IM&S implementation activity.

  2. Aporrectodea caliginosa, a relevant earthworm species for a posteriori pesticide risk assessment: current knowledge and recommendations for culture and experimental design.

    PubMed

    Bart, Sylvain; Amossé, Joël; Lowe, Christopher N; Mougin, Christian; Péry, Alexandre R R; Pelosi, Céline

    2018-06-21

    Ecotoxicological tests with earthworms are widely used and are mandatory for the risk assessment of pesticides prior to registration and commercial use. The current model species for standardized tests is Eisenia fetida or Eisenia andrei. However, these species are absent from agricultural soils and often less sensitive to pesticides than other earthworm species found in mineral soils. To move towards a better assessment of pesticide effects on non-target organisms, there is a need to perform a posteriori tests using relevant species. The endogeic species Aporrectodea caliginosa (Savigny, 1826) is representative of cultivated fields in temperate regions and is suggested as a relevant model test species. After providing information on its taxonomy, biology, and ecology, we reviewed current knowledge concerning its sensitivity towards pesticides. Moreover, we highlighted research gaps and promising perspectives. Finally, advice and recommendations are given for the establishment of laboratory cultures and experiments using this soil-dwelling earthworm species.

  3. Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample.

    PubMed

    Lyon, Aaron R; Pullmann, Michael D; Dorsey, Shannon; Martin, Prerna; Grigore, Alexandra A; Becker, Emily M; Jensen-Doss, Amanda

    2018-05-11

    Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.

  4. Upcoming Environmental Modeling in Ground Water Public Meeting

    EPA Pesticide Factsheets

    This meeting provides a public forum for pesticide registrants, other stakeholders and EPA to discuss current issues related to modeling pesticide fate, transport, and exposure for pesticide risk assessments in a regulatory context.

  5. Climate change impact on the establishment and seasonal abundance of Invasive Mosquito Species: current state and future risk maps over southeast Europe

    NASA Astrophysics Data System (ADS)

    Tagaris, Efthimios; -Eleni Sotiropoulou, Rafaella; Sotiropoulos, Andreas; Spanos, Ioannis; Milonas, Panayiotis; Michaelakis, Antonios

    2017-04-01

    The establishment and seasonal abundance of Invasive Mosquito Species (IMS) in a region are related to climatic parameters such as temperature and precipitation. In this work, the current state is assessed using data from the European Climate Assessment and Dataset (ECA&D) project over Greece and Italy to develop current spatial risk databases for IMS. Results are validated against a prototype IMS monitoring device, designed and developed in the framework of the LIFE CONOPS project and installed at key points across the two countries. Since climate models suggest changes in future temperature and precipitation, the future potential for IMS establishment and spread over Greece and Italy is assessed using the climatic parameters for the 2050s provided by the NASA GISS GCM ModelE under the IPCC A1B emissions scenario. The need for regional climate projections on a finer grid is addressed by using the Weather Research and Forecasting (WRF) model to dynamically downscale the GCM simulations. The estimated changes in the future meteorological parameters are combined with the observational data to estimate the future levels of the climatic parameters of interest. The final product includes spatial distribution maps presenting the future suitability of a region for the establishment and seasonal abundance of IMS over Greece and Italy. Acknowledgement: LIFE CONOPS project "Development & demonstration of management plans against - the climate change enhanced - invasive mosquitoes in S. Europe" (LIFE12 ENV/GR/000466).

  6. Watershed Health Assessment Tools Investigating Fisheries WHAT IF Version 2: A Manager’s Guide to New Features

    EPA Pesticide Factsheets

    The CVI Watershed Health Assessment Tool Investigating Fisheries, WHAT IF version 2, currently contains five components: Regional Prioritization Tool, Hydrologic Tool, Clustering Tool, Habitat Suitability Tool, BASS model

  7. Effects of skeletal muscle anisotropy on induced currents from low-frequency magnetic fields

    NASA Astrophysics Data System (ADS)

    Tachas, Nikolaos J.; Samaras, Theodoros; Baskourelos, Konstantinos; Sahalos, John N.

    2009-12-01

    Studies which take into account the anisotropy of tissue dielectric properties for the numerical assessment of induced currents from low-frequency magnetic fields are scarce. In the present study, we compare the induced currents in two anatomical models, using the impedance method. In the first model, we assume that all tissues have isotropic conductivity, whereas in the second one, we assume anisotropic conductivity for the skeletal muscle. Results show that tissue anisotropy should be taken into account when investigating the exposure to low-frequency magnetic fields, because it leads to higher induced current values.

  8. Fostering and Assessing Creativity in Technology Education

    ERIC Educational Resources Information Center

    Buelin-Biesecker, Jennifer Katherine

    2012-01-01

    This study compared the creative outcomes in student work resulting from two pedagogical approaches to creative problem solving activities. A secondary goal was to validate the Consensual Assessment Technique (CAT) as a means of assessing creativity. Linear models for problem solving and design processes serve as the current paradigm in classroom…

  9. Online Faculty Development and Assessment System (OFDAS)

    ERIC Educational Resources Information Center

    Villar, Luis M.; Alegre, Olga M.

    2006-01-01

    The rapid growth of online learning has led to the development of faculty inservice evaluation models focused on quality improvement of degree programs. Based on current "best practices" of student online assessment, the Online Faculty Development and Assessment System (OFDAS), created at the Canary Islands, was designed to serve the…

  10. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    EPA Science Inventory

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  11. A Critical Analysis of the Child and Adolescent Wellness Scale (CAWS)

    ERIC Educational Resources Information Center

    Weller-Clarke, Alandra

    2006-01-01

    Current practices for assessing children and adolescents rely on objectively scored, deficit-based models and/or informal assessments to determine how maladaptive behaviors affect performance. Social-emotional assessment instruments are used in schools and typically provide information related to behavioral and emotional deficits, but provide little…

  12. Are We Teaching Them Anything?: A Model for Measuring Methodology Skills in the Political Science Major

    ERIC Educational Resources Information Center

    Siver, Christi; Greenfest, Seth W.; Haeg, G. Claire

    2016-01-01

    While the literature emphasizes the importance of teaching political science students methods skills, there currently exists little guidance for how to assess student learning over the course of their time in the major. To address this gap, we develop a model set of assessment tools that may be adopted and adapted by political science departments…

  13. Models for mapping potential habitat at landscape scales: an example using northern spotted owls.

    Treesearch

    William C. McComb; Michael T. McGrath; Thomas A. Spies; David Vesely

    2002-01-01

    We are assessing the potential for current and alternative policies in the Oregon Coast Range to affect habitat capability for a suite of forest resources. We provide an example of a spatially explicit habitat capability model for northern spotted owls (Strix occidentalis caurina) to illustrate the approach we are taking to assess potential changes...

  14. Reassessing the NTCTCS Staging Systems for Differentiated Thyroid Cancer, Including Age at Diagnosis

    PubMed Central

    McLeod, Donald S.A.; Jonklaas, Jacqueline; Brierley, James D.; Ain, Kenneth B.; Cooper, David S.; Fein, Henry G.; Haugen, Bryan R.; Ladenson, Paul W.; Magner, James; Ross, Douglas S.; Skarulis, Monica C.; Steward, David L.; Xing, Mingzhao; Litofsky, Danielle R.; Maxon, Harry R.

    2015-01-01

    Background: Thyroid cancer is unique for having age as a staging variable. Recently, the commonly used age cut-point of 45 years has been questioned. Objective: This study assessed alternate staging systems on the outcome of overall survival, and compared these with current National Thyroid Cancer Treatment Cooperative Study (NTCTCS) staging systems for papillary and follicular thyroid cancer. Methods: A total of 4721 patients with differentiated thyroid cancer were assessed. Five potential alternate staging systems were generated at age cut-points in five-year increments from 35 to 70 years, and tested for model discrimination (Harrell's C-statistic) and calibration (R2). The best five models for papillary and follicular cancer were further tested with bootstrap resampling and significance testing for discrimination. Results: The best five alternate papillary cancer systems had age cut-points of 45–50 years, with the highest scoring model using 50 years. No significant difference in C-statistic was found between the best alternate and current NTCTCS systems (p = 0.200). The best five alternate follicular cancer systems had age cut-points of 50–55 years, with the highest scoring model using 50 years. All five best alternate staging systems performed better compared with the current system (p = 0.003–0.035). There was no significant difference in discrimination between the best alternate system (cut-point age 50 years) and the best system of cut-point age 45 years (p = 0.197). Conclusions: No alternate papillary cancer systems assessed were significantly better than the current system. New alternate staging systems for follicular cancer appear to be better than the current NTCTCS system, although they require external validation. PMID:26203804
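    The discrimination measure used above, Harrell's C-statistic, can be sketched directly: it is the fraction of comparable patient pairs in which the model's higher-risk patient actually experiences the event earlier. This is a naive O(n²) sketch for right-censored survival data, not the study's code:

```python
def harrells_c(times, events, risk_scores):
    """Harrell's concordance statistic for right-censored survival data.

    times:       observed follow-up times
    events:      1 if the event (death) was observed, 0 if censored
    risk_scores: model-predicted risk (higher = worse prognosis)

    A pair (i, j) is comparable when i had the event and times[i] < times[j];
    it is concordant when risk_scores[i] > risk_scores[j]. Ties in risk
    count as half-concordant."""
    concordant = ties = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if i == j or not events[i] or not times[i] < times[j]:
                continue
            comparable += 1
            if risk_scores[i] > risk_scores[j]:
                concordant += 1
            elif risk_scores[i] == risk_scores[j]:
                ties += 1
    return (concordant + 0.5 * ties) / comparable if comparable else float("nan")
```

    A value of 1.0 means perfect discrimination, 0.5 is no better than chance, which is why comparing C-statistics across candidate age cut-points (with significance testing, as in the study) indicates which staging system orders patients by survival best.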

  15. Student Responses to an ICT-Based E-Assessment Application for the Teaching Practicum/Teaching Practice MODULE

    ERIC Educational Resources Information Center

    Davids, M. Noor

    2017-01-01

    Situated within the context of Initial Teacher Education (ITE) in South Africa, this study introduces the notion of an interactive e-assessment application for the Teaching Practicum/Teaching Practice module to replace the current model of assessment. At present students enrolled for an Initial Teacher…

  16. On the importance of incorporating sampling weights in occupancy model estimation

    EPA Science Inventory

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey des...
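    The core idea of an occupancy model, separating the probability that a site is occupied (psi) from the probability of detecting the species on a visit given occupancy (p), can be sketched as a likelihood over detection histories. This is a minimal single-season version for illustration, without the survey-weighting extension the record discusses:

```python
import math

def occupancy_loglik(histories, psi, p):
    """Log-likelihood of a basic single-season occupancy model.

    histories: detection histories, one per site, e.g. (1, 0, 1) = detected
               on visits 1 and 3 of 3.
    psi:       probability a site is occupied
    p:         probability of detection per visit, given occupancy

    An all-zero history is ambiguous: the site may be occupied but undetected,
    or truly unoccupied, so both terms enter its likelihood."""
    ll = 0.0
    for h in histories:
        n_det, n_vis = sum(h), len(h)
        like = psi * p ** n_det * (1 - p) ** (n_vis - n_det)
        if n_det == 0:
            like += 1.0 - psi  # imperfect detection: absence of evidence term
        ll += math.log(like)
    return ll
```

    Maximizing this likelihood over (psi, p) is what lets occupancy models "properly account for imperfect detection": a site never detected still contributes the possibility that it was occupied but missed.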

  17. The mental health care model in Brazil: analyses of the funding, governance processes, and mechanisms of assessment

    PubMed Central

    Trapé, Thiago Lavras; Campos, Rosana Onocko

    2017-01-01

    OBJECTIVE This study aims to analyze the current status of the mental health care model of the Brazilian Unified Health System, according to its funding, governance processes, and mechanisms of assessment. METHODS We have carried out a documentary analysis of the ordinances, technical reports, conference reports, normative resolutions, and decrees from 2009 to 2014. RESULTS This is a time of consolidation of the psychosocial model, with expansion of the health care network and inversion of the funding for community services with a strong emphasis on the area of crack cocaine and other drugs. Mental health is an underfunded area within the chronically underfunded Brazilian Unified Health System. The governance model constrains the progress of essential services, which creates the need for the incorporation of a process of regionalization of the management. The mechanisms of assessment are not incorporated into the health policy in the bureaucratic field. CONCLUSIONS There is a need to expand the global funding of the area of health, specifically mental health, which has been shown to be a successful policy. The current focus of the policy seems to be archaic in relation to the precepts of the psychosocial model. Mechanisms of assessment need to be expanded. PMID:28355335

  18. Modeling High-Impact Weather and Climate: Lessons From a Tropical Cyclone Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Done, James; Holland, Greg; Bruyere, Cindy

    2013-10-19

    Although the societal impact of a weather event increases with the rarity of the event, our current ability to assess extreme events and their impacts is limited not only by rarity but also by current model fidelity and a lack of understanding of the underlying physical processes. This challenge is driving fresh approaches to assessing high-impact weather and climate. Recent lessons learned in modeling high-impact weather and climate are presented using the case of tropical cyclones as an illustrative example. Through examples using the Nested Regional Climate Model to dynamically downscale large-scale climate data, the need to treat bias in the driving data is illustrated. Domain size, location, and resolution are also shown to be critical and should be guided by the need to include relevant regional climate physical processes, resolve key impact parameters, and accurately simulate the response to changes in external forcing. The notion of sufficient model resolution is introduced, together with the added value of combining dynamical and statistical assessments to fill out the parent distribution of high-impact parameters. Finally, through the example of a tropical cyclone damage index, direct impact assessments are presented as powerful tools that distill complex datasets into concise statements on likely impact, and as highly effective communication devices.

  19. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  20. Application of the cognitive therapy model to initial crisis assessment.

    PubMed

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  1. The assessment and treatment of prosodic disorders and neurological theories of prosody.

    PubMed

    Diehl, Joshua J; Paul, Rhea

    2009-08-01

    In this article, we comment on specific aspects of Peppé (2009). In particular, we address the assessment and treatment of prosody in clinical settings and discuss current theory on neurological models of prosody. We argue that for prosodic assessment instruments and treatment programs to be clinically effective, we need assessment instruments that: (1) have a representative normative comparison sample and strong psychometric properties; (2) are based on empirical information regarding the typical sequence of prosodic acquisition and are sensitive to developmental change; (3) meaningfully subcategorize various aspects of prosody; (4) use tasks that have ecological validity; and (5) have clinical properties, such as length and ease of administration, that allow them to become part of standard language assessment batteries. In addition, we argue that current theories of prosody processing in the brain are moving toward network models that involve multiple brain areas and are crucially dependent on cortical communication. The implications of these observations for future research and clinical practice are outlined.

  2. TEMPy: a Python library for assessment of three-dimensional electron microscopy density fits.

    PubMed

    Farabella, Irene; Vasishtan, Daven; Joseph, Agnel Praveen; Pandurangan, Arun Prasad; Sahota, Harpal; Topf, Maya

    2015-08-01

    Three-dimensional electron microscopy is currently one of the most promising techniques used to study macromolecular assemblies. Rigid and flexible fitting of atomic models into density maps is often essential to gain further insights into the assemblies they represent. Currently, tools that facilitate the assessment of fitted atomic models and maps are needed. TEMPy (template and electron microscopy comparison using Python) is a toolkit designed for this purpose. The library includes a set of methods to assess density fits in intermediate-to-low resolution maps, both globally and locally. It also provides procedures for single-fit assessment, ensemble generation of fits, clustering, and multiple and consensus scoring, as well as plots and output files for visualization purposes to help the user in analysing rigid and flexible fits. The modular nature of TEMPy helps the integration of scoring and assessment of fits into large pipelines, making it a tool suitable for both novice and expert structural biologists.

  3. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  4. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE PAGES

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...

    2016-07-28

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  5. Animal models of listeriosis: a comparative review of the current state of the art and lessons learned

    PubMed Central

    2012-01-01

    Listeriosis is a leading cause of hospitalization and death due to foodborne illness in the industrialized world. Animal models have played fundamental roles in elucidating the pathophysiology and immunology of listeriosis, and will almost certainly continue to be integral components of the research on listeriosis. Data derived from animal studies helped, for example, to characterize the importance of cell-mediated immunity in controlling infection, allowed evaluation of chemotherapeutic treatments for listeriosis, and contributed to quantitative assessments of the public health risk associated with L. monocytogenes-contaminated food commodities. Nonetheless, a number of pivotal questions remain unresolved, including dose-response relationships, which represent essential components of risk assessments. Newly emerging data about species-specific differences have recently raised concern about the validity of most traditional animal models of listeriosis. However, considerable uncertainty about the best choice of animal model remains. Here we review the available data on traditional and potential new animal models to summarize currently recognized strengths and limitations of each model. This knowledge is instrumental for devising future studies and for interpreting current data. We deliberately chose a historical, comparative and cross-disciplinary approach, striving to reveal clues that may help predict the ultimate value of each animal model in spite of incomplete data. PMID:22417207

  6. Animal models of listeriosis: a comparative review of the current state of the art and lessons learned.

    PubMed

    Hoelzer, Karin; Pouillot, Régis; Dennis, Sherri

    2012-03-14

    Listeriosis is a leading cause of hospitalization and death due to foodborne illness in the industrialized world. Animal models have played fundamental roles in elucidating the pathophysiology and immunology of listeriosis, and will almost certainly continue to be integral components of the research on listeriosis. Data derived from animal studies helped, for example, to characterize the importance of cell-mediated immunity in controlling infection, allowed evaluation of chemotherapeutic treatments for listeriosis, and contributed to quantitative assessments of the public health risk associated with L. monocytogenes-contaminated food commodities. Nonetheless, a number of pivotal questions remain unresolved, including dose-response relationships, which represent essential components of risk assessments. Newly emerging data about species-specific differences have recently raised concern about the validity of most traditional animal models of listeriosis. However, considerable uncertainty about the best choice of animal model remains. Here we review the available data on traditional and potential new animal models to summarize currently recognized strengths and limitations of each model. This knowledge is instrumental for devising future studies and for interpreting current data. We deliberately chose a historical, comparative and cross-disciplinary approach, striving to reveal clues that may help predict the ultimate value of each animal model in spite of incomplete data.
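The dose-response relationships that both listeriosis records flag as unresolved are often approximated in quantitative microbial risk assessment with a single-hit exponential model; the sketch below illustrates that form only, and the parameter value is a placeholder, not a published estimate for L. monocytogenes:

```python
import math

def p_infection(dose, r):
    """Exponential (single-hit) dose-response model used in quantitative
    microbial risk assessment: each ingested cell is assumed to cause
    infection independently with probability r."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical single-hit parameter; real r values are strain- and
# host-dependent and must come from fitted dose-response studies.
r = 1e-9
for dose in (1e6, 1e9, 1e12):
    print(f"dose {dose:.0e} cells -> P = {p_infection(dose, r):.4f}")
```

Because the model is single-parameter, the species-specific differences discussed above enter entirely through the fitted value of r.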

  7. Workshop overview: approaches to the assessment of the allergenic potential of food from genetically modified crops.

    PubMed

    Ladics, Gregory S; Holsapple, Michael P; Astwood, James D; Kimber, Ian; Knippels, Leon M J; Helm, Ricki M; Dong, Wumin

    2003-05-01

    There is a need to assess the safety of foods deriving from genetically modified (GM) crops, including the allergenic potential of novel gene products. Presently, there is no single in vitro or in vivo model that has been validated for the identification or characterization of potential food allergens. Instead, the evaluation focuses on risk factors such as source of the gene (i.e., allergenic vs. nonallergenic sources), physicochemical and genetic comparisons to known allergens, and exposure assessments. The purpose of this workshop was to gather together researchers working on various strategies for assessing protein allergenicity: (1) to describe the current state of knowledge and progress that has been made in the development and evaluation of appropriate testing strategies and (2) to identify critical issues that must now be addressed. This overview begins with a consideration of the current issues involved in assessing the allergenicity of GM foods. The second section presents information on in vitro models of digestibility, bioinformatics, and risk assessment in the context of clinical prevention and management of food allergy. Data on rodent models are presented in the next two sections. Finally, nonrodent models for assessing protein allergenicity are discussed. Collectively, these studies indicate that significant progress has been made in developing testing strategies. However, further efforts are needed to evaluate and validate the sensitivity, specificity, and reproducibility of many of these assays for determining the allergenicity potential of GM foods.

  8. Atmospheric Aerosol Properties and Climate Impacts

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Kahn, Ralph A.; Remer, Lorraine A.; Yu, Hongbin; Rind, David; Feingold, Graham; Quinn, Patricia K.; Schwartz, Stephen E.; Streets, David G.; DeCola, Phillip; hide

    2009-01-01

    This report critically reviews current knowledge about global distributions and properties of atmospheric aerosols, as they relate to aerosol impacts on climate. It assesses possible next steps aimed at substantially reducing uncertainties in aerosol radiative forcing estimates. Current measurement techniques and modeling approaches are summarized, providing context. As a part of the Synthesis and Assessment Product in the Climate Change Science Program, this assessment builds upon recent related assessments, including the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4, 2007) and other Climate Change Science Program reports. The objectives of this report are (1) to promote a consensus about the knowledge base for climate change decision support, and (2) to provide a synthesis and integration of the current knowledge of the climate-relevant impacts of anthropogenic aerosols for policy makers, policy analysts, and general public, both within and outside the U.S government and worldwide.

  9. Precipitation-runoff and streamflow-routing models for the Willamette River basin, Oregon

    USGS Publications Warehouse

    Laenen, Antonius; Risley, John C.

    1997-01-01

    With an input of current streamflow, precipitation, and air temperature data, the combined runoff and routing models can provide current estimates of streamflow at almost 500 locations on the main stem and major tributaries of the Willamette River with a high degree of accuracy. Relative contributions of surface runoff, subsurface flow, and ground-water flow can be assessed for 1 to 10 HRU classes in each of 253 subbasins identified for precipitation-runoff modeling. Model outputs were used with a water-quality model to simulate the movement of dye in the Pudding River as an example.

  10. Mexico’s Drug War and Its Unintended Regional Consequences

    DTIC Science & Technology

    2013-03-01

    multiple approaches are designed to solve the problem. Analysis of the current strategic environment, relying on the environmental assessment model...current environmental assessment, this paper will provide a brief description of a more desired environment and also a problem statement that depicts...the 1980s the U.S. focused its counter drug efforts in Peru and Bolivia, then the world’s leaders in coca leaf supply. In the meantime, Colombian

  11. USING TWO-DIMENSIONAL HYDRODYNAMIC MODELS AT SCALES OF ECOLOGICAL IMPORTANCE. (R825760)

    EPA Science Inventory

    Modeling of flow features that are important in assessing stream habitat conditions has been a long-standing interest of stream biologists. Recently, they have begun examining the usefulness of two-dimensional (2-D) hydrodynamic models in attaining this objective. Current modelin...

  12. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  13. Assessing cognitive processes related to insomnia: A review and measurement guide for Harvey's cognitive model for the maintenance of insomnia.

    PubMed

    Hiller, Rachel M; Johnston, Anna; Dohnt, Hayley; Lovato, Nicole; Gradisar, Michael

    2015-10-01

    Cognitive processes play an important role in the maintenance and treatment of sleep difficulties, including insomnia. In 2002, a comprehensive model was proposed by Harvey. Since its inception the model has received >300 citations, and provided researchers and clinicians with a framework for understanding and treating insomnia. The aim of this review is two-fold. First, we review the current literature investigating each factor proposed in Harvey's cognitive model of insomnia. Second, we summarise the psychometric properties of key measures used to assess the model's factors and mechanisms. From these aims, we demonstrate both strengths and limitations of the current knowledge of appropriate measurements associated with the model. This review aims to stimulate and guide future research in this area, and to provide an understanding of the resources available to measure, target, and resolve cognitive factors that may maintain chronic insomnia.

  14. Osteotomy models - the current status on pain scoring and management in small rodents.

    PubMed

    Lang, Annemarie; Schulz, Anja; Ellinghaus, Agnes; Schmidt-Bleek, Katharina

    2016-12-01

    Fracture healing is a complex regeneration process which produces new bone tissue without scar formation. However, fracture healing disorders occur in approximately 10% of human patients and cause severe pain and reduced quality of life. Recently, the development of more standardized, sophisticated and commercially available osteosynthesis techniques reflecting clinical approaches has dramatically increased the use of small rodents such as rats and mice in bone healing research. Nevertheless, there is no standard for pain assessment, especially in these species, and consequently limited information regarding the welfare aspects of osteotomy models. Moreover, the selection of analgesics is restricted for osteotomy models since non-steroidal anti-inflammatory drugs (NSAIDs) are known to affect the initial, inflammatory phase of bone healing. Therefore, opioids such as buprenorphine and tramadol are often used. However, dosage data in the literature are varied. Within this review, we clarify the background of osteotomy models, explain the current status and challenges of animal welfare assessment, and provide an example score sheet including model-specific parameters. Furthermore, we summarize current refinement options and present a brief outlook on further 3R research.

  15. ASSESSMENT OF DYNAMIC PRA TECHNIQUES WITH INDUSTRY AVERAGE COMPONENT PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yadav, Vaibhav; Agarwal, Vivek; Gribok, Andrei V.

    In the nuclear industry, risk monitors are intended to provide a point-in-time estimate of the system risk given the current plant configuration. Current risk monitors are limited in that they do not properly take into account the deteriorating states of plant equipment, which are unit-specific. Current approaches to computing risk monitors use probabilistic risk assessment (PRA) techniques, but the assessment is typically a snapshot in time. Living PRA models attempt to address limitations of traditional PRA models in a limited sense by including temporary changes in plant and system configurations. However, information on plant component health is not considered. This often leaves risk monitors using living PRA models incapable of conducting evaluations with dynamic degradation scenarios evolving over time. There is a need to develop enabling approaches to solidify risk monitors to provide time- and condition-dependent risk by integrating traditional PRA models with condition monitoring and prognostic techniques. This paper presents estimation of system risk evolution over time by integrating plant risk monitoring data with dynamic PRA methods incorporating aging and degradation. Several online, non-destructive approaches have been developed for diagnosing plant component conditions in the nuclear industry, e.g., a condition indication index using vibration analysis, current signatures, and operational history [1]. In this work the component performance measures at U.S. commercial nuclear power plants (NPP) [2] are incorporated within the various dynamic PRA methodologies [3] to provide better estimates of probability of failures. Aging and degradation are modeled within the Level-1 PRA framework and applied to several failure modes of pumps; the approach can be extended to a range of components, viz. valves, generators, batteries, and pipes.

  16. The NASA Space Radiobiology Risk Assessment Project

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee

    The current first phase (2006-2011) has three major goals: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life-table and radiation quality functions; 2) the integration of biophysical models of acute radiation syndromes; and 3) the development of new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) development of models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point on continuing NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a web-based data resource of NSRL results, and a space radiation Wikipedia are described.

  17. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    PubMed Central

    Weiss, Brandi A.; Dardick, William

    2015-01-01

    This article introduces an entropy-based measure of data–model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data–model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data–model fit to assess how well logistic regression models classify cases into observed categories. PMID:29795897

  18. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression.

    PubMed

    Weiss, Brandi A; Dardick, William

    2016-12-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data-model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data-model fit to assess how well logistic regression models classify cases into observed categories.
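The entropy index described in these two records can be illustrated with a small sketch; the normalization below follows the mixture-modeling convention the abstracts mention, but the function name and example values are illustrative assumptions, not the authors' published code:

```python
import numpy as np

def classification_entropy(probs):
    """Normalized entropy-based separation index for an (n, K) array of
    predicted class-membership probabilities.

    Returns a value in [0, 1]: 1 means every case is assigned to one
    class with certainty; 0 means maximally fuzzy (uniform) assignment.
    """
    probs = np.asarray(probs, dtype=float)
    n, k = probs.shape
    with np.errstate(divide="ignore", invalid="ignore"):
        # p * log(p), with the 0 * log(0) = 0 convention
        plogp = np.where(probs > 0, probs * np.log(probs), 0.0)
    return 1.0 + plogp.sum() / (n * np.log(k))

# For a binary logistic regression, stack fitted probabilities p and 1 - p:
p = np.array([0.95, 0.9, 0.1, 0.05])
probs = np.column_stack([p, 1 - p])
print(round(classification_entropy(probs), 3))  # -> 0.622
```

Used alongside conventional fit statistics, a value near 1 indicates the model separates the observed categories cleanly, while a value near 0 flags fuzzy classification.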

  19. The evolution of global disaster risk assessments: from hazard to global change

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2013-04-01

    The perception of disaster risk as a dynamic process interlinked with global change is a fairly recent concept. It gradually emerged as an evolution from new scientific theories, currents of thinking and lessons learned from large disasters since the 1970s. Interest was further heightened in the mid-1980s by the Chernobyl nuclear accident and the discovery of the ozone layer hole, both bringing awareness that dangerous hazards can generate global impacts. The creation of the UN International Decade for Natural Disaster Reduction (IDNDR) and the publication of the first IPCC report in 1990 reinforced interest in global risk assessment. The first global risk models including hazard, exposure and vulnerability components became available in the mid-2000s. Since then, increased computational power and more refined dataset resolutions have led to more numerous and sophisticated global risk models. This article presents a recent history of global disaster risk models, the current status of research for the Global Assessment Report on Disaster Risk Reduction (GAR 2013), and future challenges and limitations for the development of next-generation global disaster risk models.
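At their simplest, the hazard, exposure, and vulnerability components these models combine reduce to a single multiplication; a toy sketch with entirely hypothetical figures (real global models are far more refined):

```python
def annual_risk(hazard_freq, exposed_pop, vulnerability):
    """Toy version of the multiplicative decomposition
    risk = hazard x exposure x vulnerability used, in much more
    elaborate form, by global disaster risk models."""
    return hazard_freq * exposed_pop * vulnerability

# Hypothetical figures: a flood recurring every 20 years on average
# (frequency 0.05/yr), 200,000 people exposed, 2% affected per event.
print(annual_risk(0.05, 200_000, 0.02))  # -> 200.0 people/yr expected
```

The article's point is that each factor is dynamic: global change shifts hazard frequencies while urbanization shifts exposure and vulnerability, so none of the three inputs can be treated as constant.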

  20. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    PubMed

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    The objective was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  1. Vocational Assessment of Students with Disadvantages: Their Peculiar Needs.

    ERIC Educational Resources Information Center

    Nolte, Deborah

    A study examined the underlying factor structure of the aptitude tests and work samples being completed by students with educational disadvantages (limited reading and mathematics skills) who were assessed with the current assessment model in the Akron (Ohio) Public Schools. The amount of variance accounted for by the factors was also…

  2. Instructional Model of Natural Science in Junior High Schools, Batu-Malang

    ERIC Educational Resources Information Center

    Pantiwati, Yuni; Wahyuni, Sri; Permana, Fendy Hardian

    2017-01-01

    The instruction of the Natural Science subject in junior high schools, as regulated by the 2013 Curriculum, is to be taught in an integrated way, combining Biology, Physics, and Chemistry subjects. The form of assessment it prescribes is called authentic assessment. This current study described the instructional system, especially the assessment system, of Natural…

  3. Integrating Scaffolding Strategies into Technology-Enhanced Assessments of English Learners: Task Types and Measurement Models

    ERIC Educational Resources Information Center

    Wolf, Mikyung Kim; Guzman-Orth, Danielle; Lopez, Alexis; Castellano, Katherine; Himelfarb, Igor; Tsutagawa, Fred S.

    2016-01-01

    This article investigates ways to improve the assessment of English learner students' English language proficiency given the current movement of creating next-generation English language proficiency assessments in the Common Core era. In particular, this article discusses the integration of scaffolding strategies, which are prevalently utilized as…

  4. Assessing Understanding of Biological Processes: Elucidating Students' Models of Meiosis.

    ERIC Educational Resources Information Center

    Kindfield, Ann C.

    1994-01-01

    Presents a meiosis reasoning problem that provides direct access to students' current models of chromosomes and meiosis. Also included in the article are tips for classroom implementation and a summary of the solution evaluation. (ZWH)

  5. Conscientiousness and obsessive-compulsive personality disorder.

    PubMed

    Samuel, Douglas B; Widiger, Thomas A

    2011-07-01

    A dimensional perspective on personality disorder hypothesizes that the current diagnostic categories represent maladaptive variants of general personality traits. However, a fundamental foundation of this viewpoint is that dimensional models can adequately account for the pathology currently described by these categories. While most of the personality disorders have well established links to dimensional models that buttress this hypothesis, obsessive-compulsive personality disorder (OCPD) has obtained only inconsistent support. The current study administered multiple measures of 1) conscientiousness-related personality traits, 2) DSM-IV OCPD, and 3) specific components of OCPD (e.g., compulsivity and perfectionism) to a sample of 536 undergraduates who were oversampled for elevated OCPD scores. Six existing measures of conscientiousness-related personality traits converged strongly with each other supporting their assessment of a common trait. These measures of conscientiousness correlated highly with scales assessing specific components of OCPD, but obtained variable relationships with measures of DSM-IV OCPD. More specifically, there were differences within the conscientiousness instruments such that those designed to assess general personality functioning had small to medium relationships with OCPD, but those assessing more maladaptive variants obtained large effect sizes. These findings support the view that OCPD does represent a maladaptive variant of normal-range conscientiousness.

  6. Improvement of automatic control system for high-speed current collectors

    NASA Astrophysics Data System (ADS)

    Sidorov, O. A.; Goryunov, V. N.; Golubkov, A. S.

    2018-01-01

    The article considers ways of regulating pantographs to ensure the quality and reliability of current collection at high speeds. To assess the impact of regulation, an integral criterion of current-collection quality was proposed, taking into account the efficiency and reliability of pantograph operation. The study was carried out using a mathematical model of the interaction between the pantograph and the catenary system, allowing assessment of the contact force and the intensity of arcing in the contact zone at different speeds. The simulation results allowed the efficiency of different methods of pantograph regulation to be estimated and the best option to be determined.

  7. The Conservation Effects Assessment Project (CEAP): a national scale natural resources and conservation needs assessment and decision support tool

    NASA Astrophysics Data System (ADS)

    Johnson, M.-V. V.; Norfleet, M. L.; Atwood, J. D.; Behrman, K. D.; Kiniry, J. R.; Arnold, J. G.; White, M. J.; Williams, J.

    2015-07-01

    The Conservation Effects Assessment Project (CEAP) was initiated to quantify the impacts of agricultural conservation practices at the watershed, regional, and national scales across the United States. Representative cropland acres in all major U.S. watersheds were surveyed in 2003-2006 as part of the seminal CEAP Cropland National Assessment. Two process-based models, the Agricultural Policy Environmental eXtender (APEX) and the Soil and Water Assessment Tool (SWAT), were applied to the survey data to provide a quantitative assessment of current conservation practice impacts, establish a benchmark against which future conservation trends and efforts could be measured, and identify outstanding conservation concerns. The flexibility of these models and the unprecedented amount of data on current conservation practices across the country enabled Cropland CEAP to meet its Congressional mandate of quantifying the value of current conservation practices. It also enabled scientifically grounded exploration of a variety of conservation scenarios, empowering CEAP not only to inform on past successes and additional needs, but also to provide a decision support tool to help guide future policy development and conservation practice decision making. The CEAP effort will repeat the national survey in 2015-2016, enabling CEAP to provide analyses of emergent conservation trends, outstanding needs, and potential costs and benefits of pursuing various treatment scenarios for all agricultural watersheds across the United States.

  8. Modeling fuels and fire effects in 3D: Model description and applications

    Treesearch

    Francois Pimont; Russell Parsons; Eric Rigolot; Francois de Coligny; Jean-Luc Dupuy; Philippe Dreyfus; Rodman R. Linn

    2016-01-01

    Scientists and managers critically need ways to assess how fuel treatments alter fire behavior, yet few tools currently exist for this purpose. We present a spatially explicit fuel-modeling system, FuelManager, which models fuels, vegetation growth, fire behavior (using a physics-based model, FIRETEC), and fire effects. FuelManager's flexible approach facilitates...

  9. Operational prediction of rip currents using numerical model and nearshore bathymetry from video images

    NASA Astrophysics Data System (ADS)

    Sembiring, L.; Van Ormondt, M.; Van Dongeren, A. R.; Roelvink, J. A.

    2017-07-01

    Rip currents are one of the most dangerous coastal hazards for swimmers. To minimize the risk, an operational, process-based coastal model system can be used to provide forecasts of the nearshore waves and currents that may endanger beachgoers. In this paper, an operational model for rip current prediction using nearshore bathymetry obtained from video imagery is demonstrated. For the nearshore-scale model, XBeach is used, with which tidal currents and wave-induced currents (including the effect of wave groups) can be simulated simultaneously. Up-to-date bathymetry is obtained using the video-based technique cBathy. The system is tested for the beach of Egmond aan Zee, located in the northern part of the Dutch coastline. This paper tests the applicability of video-derived bathymetry as input for the numerical modelling system by comparing simulation results obtained using surveyed bathymetry with model results using video bathymetry. Results show that the video technique is able to produce bathymetry converging towards the ground-truth observations. This bathymetry validation is followed by an example of an operational forecasting type of simulation for predicting rip currents. Rip current flow fields simulated over measured and modeled bathymetries are compared in order to assess the performance of the proposed forecast system.
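
    A toy example of the kind of grid comparison underlying such a bathymetry validation: the root-mean-square error between a surveyed depth grid and a hypothetical video-derived grid (all depths invented).

```python
import numpy as np

# Surveyed vs (hypothetical) video-derived depth grids, metres.
surveyed = np.array([[2.0, 3.5, 5.0],
                     [2.5, 4.0, 6.0]])
video = np.array([[2.2, 3.3, 5.4],
                  [2.4, 4.3, 5.7]])

# Root-mean-square error as a simple convergence measure.
rmse = float(np.sqrt(np.mean((video - surveyed) ** 2)))
print(round(rmse, 3))  # → 0.268
```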

  10. Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data

    EPA Science Inventory

    The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...
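
    One standard way to handle such model-selection uncertainty is Bayesian model averaging: weight each candidate dose-response model's BMD by its BIC-approximated posterior probability. A minimal sketch with invented BIC and BMD values, not results from this work:

```python
import math

# Candidate dose-response fits (BIC and BMD values are placeholders).
fits = {
    "Hill":        {"bic": 104.2, "bmd": 1.8},
    "Exponential": {"bic": 105.1, "bmd": 2.3},
    "Power":       {"bic": 109.7, "bmd": 3.0},
}

# BMA weight of model i is proportional to exp(-BIC_i / 2).
raw = {m: math.exp(-0.5 * f["bic"]) for m, f in fits.items()}
total = sum(raw.values())
weights = {m: r / total for m, r in raw.items()}

# Model-averaged BMD: weighted mean over the candidate set.
bmd_ma = sum(weights[m] * fits[m]["bmd"] for m in fits)
print(weights, bmd_ma)
```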

  11. Some Issues in Item Response Theory: Dimensionality Assessment and Models for Guessing

    ERIC Educational Resources Information Center

    Smith, Jessalyn

    2009-01-01

    Currently, standardized tests are widely used as a method to measure how well schools and students meet academic standards. As a result, measurement issues have become an increasingly popular topic of study. Unidimensional item response models are used to model latent abilities and specific item characteristics. This class of models makes…

  12. New Elements To Consider When Modeling the Hazards Associated with Botulinum Neurotoxin in Food.

    PubMed

    Ihekwaba, Adaoha E C; Mura, Ivan; Malakar, Pradeep K; Walshaw, John; Peck, Michael W; Barker, G C

    2016-01-15

    Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, this process includes significant uncertainties, so that the resulting decision making is frequently conservative and inflexible. Progress requires encoding cellular processes into the models at a molecular level, especially the details of the genetic and molecular machinery. This addition strengthens the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production. It then outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making. Copyright © 2015 Ihekwaba et al.
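
    As an illustration of the empirical (secondary) models referred to here, a Ratkowsky-type square-root model relates microbial growth rate to temperature. The coefficients below are invented for illustration, not fitted to C. botulinum data:

```python
# Ratkowsky-type square-root model: sqrt(mu) = b * (T - T_min),
# i.e. growth rate mu = (b * (T - T_min))^2 above the minimum
# growth temperature, zero below it. b and T_min are placeholders.
def growth_rate(T, b=0.02, T_min=10.0):
    """Specific growth rate (1/h) at temperature T (deg C)."""
    if T <= T_min:
        return 0.0
    return (b * (T - T_min)) ** 2

print(round(growth_rate(30.0), 6))  # → 0.16
print(growth_rate(5.0))             # → 0.0 (below T_min)
```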

  13. Assessing the Liquidity of Firms: Robust Neural Network Regression as an Alternative to the Current Ratio

    NASA Astrophysics Data System (ADS)

    de Andrés, Javier; Landajo, Manuel; Lorca, Pedro; Labra, Jose; Ordóñez, Patricia

    Artificial neural networks have proven to be useful tools for solving financial analysis problems such as financial distress prediction and audit risk assessment. In this paper we focus on the performance of robust (least-absolute-deviation-based) neural networks in measuring the liquidity of firms. The problem of learning the bivariate relationship between the components of the so-called current ratio (namely, current liabilities and current assets) is analyzed, and the predictive performance of several modelling paradigms (linear and log-linear regressions, classical ratios, and neural networks) is compared. An empirical analysis is conducted on a representative database from the Spanish economy. Results indicate that classical ratio models are largely inadequate as a realistic description of the studied relationship, especially when used for predictive purposes. In a number of cases, especially when the analyzed firms are microenterprises, the linear specification is improved by considering the flexible non-linear structures provided by neural networks.
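
    The flavor of this comparison can be reproduced in miniature. The sketch below fits a robust (least-absolute-deviation) log-linear model of current assets versus current liabilities by subgradient descent, then compares it with a classical current-ratio predictor on synthetic firms. It is a simplified stand-in for the paper's LAD-based neural networks; all data and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic firms: current assets as a mildly nonlinear function of
# current liabilities with multiplicative noise (all figures invented).
L = rng.uniform(5.0, 100.0, 400)                         # liabilities
A = 1.4 * L ** 0.9 * np.exp(rng.laplace(0.0, 0.1, 400))  # assets

# Robust LAD fit of log A = alpha + beta * log L by subgradient descent.
x, y = np.log(L), np.log(A)
alpha, beta, lr = 0.0, 0.0, 0.005
for _ in range(5000):
    s = np.sign(alpha + beta * x - y)   # subgradient of the L1 loss
    alpha -= lr * s.mean()
    beta -= lr * (s * x).mean()

# Classical current-ratio benchmark: A ≈ (median ratio) × L.
ratio = np.median(A / L)
mae_fit = np.abs(np.exp(alpha + beta * x) - A).mean()
mae_ratio = np.abs(ratio * L - A).mean()
print(beta, mae_fit, mae_ratio)  # beta should recover ≈ 0.9
```

The ratio model forces a strictly proportional relationship (slope 1 in log space), which is exactly the restriction the paper finds inadequate.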

  14. Waterpipe tobacco smoking among sexual minorities in the United States: Evidence from the National Adult Tobacco Survey (2012-2014).

    PubMed

    Ortiz, Kasim; Mamkherzi, Jamal; Salloum, Ramzi; Matthews, Alicia K; Maziak, Wasim

    2017-11-01

    The current study examined differences in waterpipe tobacco smoking (WTS; both lifetime and current) between sexual minority populations - those identifying as lesbian, gay, or bisexual - and their heterosexual counterparts, using a nationally representative dataset: pooled data from the 2012-2013 and 2013-2014 National Adult Tobacco Survey (NATS). Log-Poisson multivariable regression models were used to estimate the prevalence of waterpipe smoking among sexual minority individuals, controlling for sociodemographic characteristics and stratified by current gender status. In fully adjusted models assessing lifetime WTS, lesbian/gay and bisexual respondents reported a higher prevalence of WTS than their heterosexual counterparts. This trend held in gender-stratified models among men (gay men: PR 1.25, 95% CI [1.06, 1.47]) and women (lesbians: PR 1.38, 95% CI [1.12, 1.69]; bisexual women: PR 1.69, 95% CI [1.45, 1.97]). In fully adjusted models assessing current WTS, lesbian/gay and bisexual respondents reported a higher risk of WTS than their heterosexual counterparts; in gender-stratified models this held only for gay men (PR 1.56, 95% CI [1.18, 2.05]) and bisexual women (PR 2.38, 95% CI [1.84, 3.09]). Among the US general adult population, sexual minorities exhibited an increased prevalence of current waterpipe smoking compared with their heterosexual counterparts. This pattern is also shaped by gender and by variation in sexual orientation identification (e.g., lesbian/gay vs. bisexual), warranting the development of tailored interventions aimed at decreasing waterpipe smoking among sexual minority populations. Copyright © 2017. Published by Elsevier Ltd.
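
    For intuition, the prevalence ratio (PR) reported by a log-Poisson model with a single binary covariate reduces to a simple ratio of group prevalences. A toy 2x2 illustration with invented counts (not the survey's data):

```python
# Toy 2x2 table (counts invented): group membership vs lifetime WTS.
sm_yes, sm_no = 120, 280        # sexual-minority respondents
het_yes, het_no = 900, 3100     # heterosexual respondents

prev_sm = sm_yes / (sm_yes + sm_no)      # 0.30
prev_het = het_yes / (het_yes + het_no)  # 0.225
pr = prev_sm / prev_het                  # prevalence ratio
print(round(pr, 2))  # → 1.33
```

The full models additionally adjust for sociodemographic covariates, so the published PRs are not raw prevalence ratios like this one.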

  15. A commercially viable virtual reality knee arthroscopy training system.

    PubMed

    McCarthy, A D; Hollands, R J

    1998-01-01

    Arthroscopy is a minimally invasive form of surgery used to inspect joints. It is complex to learn, yet current training methods appear inadequate, negating the potential benefits to the patient. This paper describes the development and initial assessment of a cost-effective virtual-reality-based system for training surgeons in arthroscopy of the knee. The system runs on a PC. Initial assessments by surgeons have been positive, and current developments in deformable models are described.

  16. User assessment of smoke-dispersion models for wildland biomass burning.

    Treesearch

    Steve Breyfogle; Sue A. Ferguson

    1996-01-01

    Several smoke-dispersion models, which currently are available for modeling smoke from biomass burns, were evaluated for ease of use, availability of input data, and output data format. The input and output components of all models are listed, and differences in model physics are discussed. Each model was installed and run on a personal computer with a simple-case...

  17. Space station ECLSS integration analysis: Simplified General Cluster Systems Model, ECLS System Assessment Program enhancements

    NASA Technical Reports Server (NTRS)

    Ferguson, R. E.

    1985-01-01

    The database verification of the ECLS Systems Assessment Program (ESAP) is documented, along with the changes made to enhance the flexibility of the water-recovery subsystem simulations. All changes made to database values are described, as are the software enhancements performed. The refined model documented herein constitutes the submittal of the General Cluster Systems Model. A source listing of the current version of ESAP is provided in Appendix A.

  18. Simulated wind-generated inertial oscillations compared to current measurements in the northern North Sea

    NASA Astrophysics Data System (ADS)

    Bruserud, Kjersti; Haver, Sverre; Myrhaug, Dag

    2018-06-01

    Measured current speed data show that episodes of wind-generated inertial oscillations dominate the current conditions in parts of the northern North Sea. In order to acquire current data of sufficient duration for robust estimation of joint metocean design conditions, such as wind, waves, and currents, a simple model for episodes of wind-generated inertial oscillations is adapted for the northern North Sea. The model is validated with and compared against measured current data at one location in the northern North Sea and found to reproduce the measured maximum current speed in each episode with considerable accuracy. The comparison is further improved when a small general background current is added to the simulated maximum current speeds. Extreme values of measured and simulated current speed are estimated and found to compare well. To assess the robustness of the model and the sensitivity of current conditions from location to location, the validated model is applied at three other locations in the northern North Sea. In general, the simulated maximum current speeds are smaller than the measured, suggesting that wind-generated inertial oscillations are not as prominent at these locations and that other current conditions may be governing. Further analysis of the simulated current speed and joint distribution of wind, waves, and currents for design of offshore structures will be presented in a separate paper.
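
    The physics behind such a simple model can be sketched with a slab-type inertial oscillation: after a wind impulse, the mixed-layer current rotates clockwise (in the Northern Hemisphere) at the local inertial frequency and decays exponentially. This is a generic illustration with invented parameters, not the authors' model:

```python
import numpy as np

OMEGA = 7.2921e-5                               # Earth rotation, rad/s
lat = 61.0                                      # roughly northern North Sea
f = 2.0 * OMEGA * np.sin(np.radians(lat))       # Coriolis frequency
T_inertial = 2.0 * np.pi / f / 3600.0           # inertial period, hours

u0, tau = 0.4, 3.0 * 86400.0   # initial speed (m/s), decay time (s)
t = np.linspace(0.0, 5.0 * 86400.0, 2000)       # five days
u = u0 * np.exp(-t / tau) * np.cos(f * t)       # eastward component
v = -u0 * np.exp(-t / tau) * np.sin(f * t)      # clockwise rotation (NH)
speed = np.hypot(u, v)
print(T_inertial)    # ≈ 13.7 h at this latitude
print(speed.max())   # 0.4 m/s at the initial impulse, decaying thereafter
```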

  20. Proposals for enhanced health risk assessment and stratification in an integrated care scenario

    PubMed Central

    Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep

    2016-01-01

    Objectives: Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. Settings: The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Participants: Responsible teams for regional data management in the five ACT regions. Primary and secondary outcome measures: We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. Results: There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. Conclusions: The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. PMID:27084274

  1. Integrating modeling and surveys for more effective assessments

    EPA Science Inventory

    A false dichotomy currently exists in monitoring that pits sample surveys based on probability designs against targeted monitoring of hand-picked sites. We maintain that judicious use of both, when designed to be integrated, produces assessments of greater value than either inde...

  2. A screening approach using zebrafish for the detection and characterization of developmental neurotoxicity.

    EPA Science Inventory

    Thousands of chemicals have little or no data to support developmental neurotoxicity risk assessments. Current developmental neurotoxicity guideline studies mandating mammalian model systems are expensive and time consuming. Therefore a rapid, cost-effective method to assess de...

  3. Dispersal and extrapolation on the accuracy of temporal predictions from distribution models for the Darwin's frog.

    PubMed

    Uribe-Rivera, David E; Soto-Azat, Claudio; Valenzuela-Sánchez, Andrés; Bizama, Gustavo; Simonetti, Javier A; Pliscoff, Patricio

    2017-07-01

    Climate change is a major threat to biodiversity; the development of models that reliably predict its effects on species distributions is a priority for conservation biogeography. Two of the main issues for accurate temporal predictions from Species Distribution Models (SDMs) are model extrapolation and unrealistic dispersal scenarios. We assessed the consequences of these issues for the accuracy of climate-driven SDM predictions for the dispersal-limited Darwin's frog Rhinoderma darwinii in South America. We calibrated models using historical data (1950-1975) and projected them across 40 yr to predict distribution under current climatic conditions, assessing predictive accuracy through the area under the ROC curve (AUC) and the True Skill Statistic (TSS), contrasting binary model predictions against a temporally independent validation data set (i.e., current presences/absences). To assess the effects of incorporating dispersal processes, we compared the predictive accuracy of dispersal-constrained models with SDMs without dispersal limitation; to assess the effects of model extrapolation, we compared predictive accuracy between extrapolated and non-extrapolated areas. Incorporating dispersal processes enhanced predictive accuracy, mainly through a decrease in the false presence rate of model predictions, which is consistent with discrimination of suitable but inaccessible habitat. This also had consequences for range size changes over time, the most widely used proxy for extinction risk from climate change. The area of current climatic conditions that was absent in the baseline conditions (i.e., extrapolated areas) represents 39% of the study area, leading to a significant decrease in the predictive accuracy of model predictions for those areas.
    Our results highlight that (1) incorporating dispersal processes can improve the predictive accuracy of temporal transference of SDMs and reduce the uncertainties of extinction risk assessments from global change; and (2) as geographical areas subjected to novel climates are expected to arise, they must be reported, as they show less accurate predictions under future climate scenarios. Consequently, dispersal processes and environmental extrapolation should be explicitly incorporated to reduce and report, respectively, uncertainties in temporal predictions of SDMs. In doing so, we expect to improve the reliability of the information we provide to conservation decision makers under future climate change scenarios. © 2017 by the Ecological Society of America.
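
    The TSS used in such evaluations is computed from the confusion matrix of binary predictions as sensitivity + specificity − 1. A minimal implementation on toy validation data (not the study's data):

```python
# True Skill Statistic for binary presence/absence predictions.
def tss(obs, pred):
    tp = sum(o == 1 and p == 1 for o, p in zip(obs, pred))
    tn = sum(o == 0 and p == 0 for o, p in zip(obs, pred))
    fp = sum(o == 0 and p == 1 for o, p in zip(obs, pred))
    fn = sum(o == 1 and p == 0 for o, p in zip(obs, pred))
    sens = tp / (tp + fn)   # true positive rate
    spec = tn / (tn + fp)   # true negative rate
    return sens + spec - 1

obs  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # toy validation presences/absences
pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]   # toy model predictions
print(tss(obs, pred))   # sensitivity 0.75, specificity 0.667
```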

  4. Fractal Risk Assessment of ISS Propulsion Module in Meteoroid and Orbital Debris Environments

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    2001-01-01

    A unique and innovative risk assessment of the International Space Station (ISS) Propulsion Module is conducted using fractal modeling of the Module's response to the meteoroid and orbital debris environments. Both the environment models and structural failure modes due to the resultant hypervelocity impact phenomenology, as well as Module geometry, are investigated for fractal applicability. The fractal risk assessment methodology could produce a greatly simplified alternative to current methodologies, such as BUMPER analyses, while maintaining or increasing the number of complex scenarios that can be assessed. As a minimum, this innovative fractal approach will provide an independent assessment of existing methodologies in a unique way.

  5. Animal models for dengue vaccine development and testing

    PubMed Central

    2017-01-01

    Dengue fever is a tropical endemic disease; however, because of climate change, it may become a problem in South Korea in the near future. Research on vaccines for dengue fever and on outbreak preparedness is currently insufficient. In addition, because there are no appropriate animal models, controversial results from vaccine efficacy assessments and clinical trials have been reported. Therefore, to study the mechanism of dengue fever and test the immunogenicity of vaccines, an appropriate animal model is urgently needed. In addition to mouse models, more suitable models using animals that can be humanized will need to be constructed. In this report, we look at the current status of animal model construction and discuss which models require further development. PMID:28775974

  6. Animal models for dengue vaccine development and testing.

    PubMed

    Na, Woonsung; Yeom, Minjoo; Choi, Il-Kyu; Yook, Heejun; Song, Daesub

    2017-07-01

    Dengue fever is a tropical endemic disease; however, because of climate change, it may become a problem in South Korea in the near future. Research on vaccines for dengue fever and on outbreak preparedness is currently insufficient. In addition, because there are no appropriate animal models, controversial results from vaccine efficacy assessments and clinical trials have been reported. Therefore, to study the mechanism of dengue fever and test the immunogenicity of vaccines, an appropriate animal model is urgently needed. In addition to mouse models, more suitable models using animals that can be humanized will need to be constructed. In this report, we look at the current status of animal model construction and discuss which models require further development.

  7. An Assessment of Iterative Reconstruction Methods for Sparse Ultrasound Imaging

    PubMed Central

    Valente, Solivan A.; Zibetti, Marcelo V. W.; Pipa, Daniel R.; Maia, Joaquim M.; Schneider, Fabio K.

    2017-01-01

    Ultrasonic image reconstruction using inverse problems has recently appeared as an alternative to enhance ultrasound imaging over beamforming methods. This approach depends on the accuracy of the acquisition model used to represent transducers, reflectivity, and medium physics. Iterative methods, well known in general sparse signal reconstruction, are also suited for imaging. In this paper, a discrete acquisition model is assessed by solving a linear system of equations by an ℓ1-regularized least-squares minimization, where the solution sparsity may be adjusted as desired. The paper surveys 11 variants of four well-known algorithms for sparse reconstruction, and assesses their optimization parameters with the goal of finding the best approach for iterative ultrasound imaging. The strategy for the model evaluation consists of using two distinct datasets. We first generate data from a synthetic phantom that mimics real targets inside a professional ultrasound phantom device. This dataset is contaminated with Gaussian noise with an estimated SNR, and all methods are assessed by their resulting images and performances. The model and methods are then assessed with real data collected by a research ultrasound platform when scanning the same phantom device, and results are compared with beamforming. A distinct real dataset is finally used to further validate the proposed modeling. Although high computational effort is required by iterative methods, results show that the discrete model may lead to images closer to ground truth than traditional beamforming. However, the computing capabilities of current platforms need to evolve before the frame rates currently delivered by ultrasound equipment are achievable. PMID:28282862
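
    A representative iterative method of the kind surveyed here is ISTA for the ℓ1-regularized least-squares problem min_x ½‖Ax − y‖² + λ‖x‖₁. The sketch below uses a synthetic Gaussian acquisition matrix and a sparse reflectivity vector as stand-ins for the paper's acquisition model and targets; all dimensions and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sparse-recovery problem: y = A @ x_true + noise.
m, n, k = 80, 200, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)     # acquisition model stand-in
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 2.0, k)
y = A @ x_true + 0.01 * rng.normal(size=m)

# ISTA: gradient step on the quadratic term, then soft-thresholding.
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant
x = np.zeros(n)
for _ in range(500):
    g = x - step * A.T @ (A @ x - y)                          # gradient
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # shrink

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(err, np.count_nonzero(np.abs(x) > 1e-3))
```

The soft-threshold level `step * lam` is what lets the solution sparsity "be adjusted as desired" via λ, as the abstract notes.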

  8. Modelling and Simulation for Requirements Engineering and Options Analysis

    DTIC Science & Technology

    2010-05-01

    should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. … Can the current technique for developing simulation models for assessments (DRDC Toronto CR 2010-049)

  9. Predictive models for Escherichia coli concentrations at inland lake beaches and relationship of model variables to pathogen detection

    EPA Science Inventory

    Methods are needed improve the timeliness and accuracy of recreational water‐quality assessments. Traditional culture methods require 18–24 h to obtain results and may not reflect current conditions. Predictive models, based on environmental and water quality variables, have been...

  10. The development of a model of culturally responsive science and mathematics teaching

    NASA Astrophysics Data System (ADS)

    Hernandez, Cecilia M.; Morales, Amanda R.; Shroyer, M. Gail

    2013-12-01

    This qualitative theoretical study was conducted in response to the current need for an inclusive and comprehensive model to guide the preparation and assessment of teacher candidates for culturally responsive teaching. The process of developing a model of culturally responsive teaching involved three steps: a comprehensive review of the literature; a synthesis of the literature into thematic categories capturing the dispositions and behaviors of culturally responsive teaching; and the piloting of these thematic categories with teacher candidates to validate their usefulness and to generate specific exemplars of behavior representing each category. The model of culturally responsive teaching contains five thematic categories: (1) content integration, (2) facilitating knowledge construction, (3) prejudice reduction, (4) social justice, and (5) academic development. The current model is a promising tool for comprehensively defining culturally responsive teaching in the context of teacher education, as well as for guiding curriculum and assessment changes aimed at increasing candidates' culturally responsive knowledge and skills in science and mathematics teaching.

  11. Intrinsic ethics regarding integrated assessment models for climate management.

    PubMed

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research than the approach typically adopted in current RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely the evaluation of climate change integrated assessment models, this paper develops a method and a case for including intrinsic ethics within research ethics training, to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  12. A fast, calibrated model for pyroclastic density currents kinematics and hazard

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, Tomaso; Orsucci, Simone; Cornolti, Fulvio

    2016-11-01

    Multiphase flow models represent valuable tools for the study of the complex, non-equilibrium dynamics of pyroclastic density currents. Particle sedimentation, flow stratification and rheological changes, depending on the flow regime, interaction with topographic obstacles, turbulent air entrainment, buoyancy reversal, and other complex features of pyroclastic currents can be simulated in two and three dimensions, by exploiting efficient numerical solvers and the improved computational capability of modern supercomputers. However, numerical simulations of polydisperse gas-particle mixtures are quite computationally expensive, so that their use in hazard assessment studies (where there is the need of evaluating the probability of hazardous actions over hundreds of possible scenarios) is still challenging. To this aim, a simplified integral (box) model can be used, under the appropriate hypotheses, to describe the kinematics of pyroclastic density currents over a flat topography, their scaling properties and their depositional features. In this work, multiphase flow simulations are used to evaluate integral model approximations, to calibrate its free parameters and to assess the influence of the input data on the results. Two-dimensional numerical simulations describe the generation and decoupling of a dense, basal layer (formed by progressive particle sedimentation) from the dilute transport system. In the Boussinesq regime (i.e., for solid mass fractions below about 0.1), the current Froude number (i.e., the ratio between the current inertia and buoyancy) does not strongly depend on initial conditions and it is consistent to that measured in laboratory experiments (i.e., between 1.05 and 1.2). 
For higher density ratios (solid mass fraction in the range 0.1-0.9) but still in a relatively dilute regime (particle volume fraction lower than 0.01), numerical simulations demonstrate that the box model is still applicable, but the Froude number depends on the reduced gravity. When the box model is appropriately calibrated with the numerical simulation results, the prediction of the flow runout is fairly accurate and the model predicts a rapid, non-linear decay of the flow kinetic energy (or dynamic pressure) with distance from the source. The capability of pyroclastic density currents to overcome topographic obstacles can thus be analysed in the framework of the energy-conoid approach, in which the predicted kinetic energy of the flow front is compared with the potential energy jump associated with the elevated topography to derive a condition for blocking. Model results show that, although preferable to the energy-cone, the energy-conoid approach still has some serious limitations, mostly associated with the behaviour of the flow head. Implications of these outcomes are discussed in the context of probabilistic hazard assessment studies, in which a calibrated box model can be used as a fast pyroclastic density current emulator for Monte Carlo simulations.
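    The box-model kinematics described above (a front advancing at a Froude-number-limited speed while sedimentation depletes buoyancy) can be sketched numerically. All parameter values below, and the explicit Euler scheme, are illustrative assumptions rather than the calibrated setup of the paper:

```python
import math

def box_model_front(fr=1.1, vol=1.0e4, gprime0=1.0, ws=0.5, dt=0.1, t_max=2000.0):
    """Integrate a 1-D box model of a particle-driven gravity current
    (per unit width): the front advances at u = Fr * sqrt(g' h), volume
    is conserved (h = vol / L), and sedimentation at settling speed ws
    depletes the reduced gravity g'. Explicit Euler stepping."""
    L, gprime, t = 10.0, gprime0, 0.0     # initial length (m), reduced gravity (m/s^2)
    history = []
    while t < t_max and gprime > 1e-6 * gprime0:
        h = vol / L                       # current thickness from volume conservation
        u = fr * math.sqrt(gprime * h)    # front velocity (Froude condition)
        L += u * dt                       # advance the front
        gprime -= ws * gprime / h * dt    # buoyancy lost to sedimentation
        t += dt
        history.append((t, L, u))
    return history

traj = box_model_front()
runout = traj[-1][1]                      # front position at the end of the integration
```

    The front velocity, and hence the dynamic pressure (proportional to u squared), decays non-linearly with distance, which is the behaviour the energy-conoid blocking criterion builds on.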

  13. Predicting the Mobility and Burial of Underwater Unexploded Ordnance (UXO) Using the UXO Mobility Model (ESTCP) 200417

    DTIC Science & Technology

    2009-11-01

    Abbreviations and acronyms (partial list; abstract text not recovered): ADCP, Acoustic Doppler Current Profiler; AGD, Applications Guidance Document; ARAMS, Army Risk Assessment Modeling…; NESDI, Navy Environmental Sustainability Development to Integration; NOS, National Ocean Service; NS, Naval Station; NWS, Naval Weapons…; QAS, Quality Assurance Specialist; RAC, Risk Assessment Code; REF/DIF, Refraction/Diffraction; ROI, Return on Investment; SAJ, Dr. Scott A. Jenkins

  14. Impact of geophysical model error for recovering temporal gravity field model

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Luo, Zhicai; Wu, Yihao; Li, Qiong; Xu, Chuang

    2016-07-01

    The impact of geophysical model error on recovered temporal gravity field models is assessed in this paper with both real and simulated GRACE observations. With real GRACE observations, we build four temporal gravity field models, i.e., HUST08a, HUST11a, HUST04 and HUST05. HUST08a and HUST11a are derived from different ocean tide models (EOT08a and EOT11a), while HUST04 and HUST05 are derived from different non-tidal models (AOD RL04 and AOD RL05). The statistical results show that the discrepancies in annual mass variability amplitudes in six river basins between the HUST08a and HUST11a models, and between the HUST04 and HUST05 models, are all smaller than 1 cm, which demonstrates that geophysical model error only slightly affects current GRACE solutions. The impact of geophysical model error for future missions with more accurate satellite ranging is also assessed by simulation. The simulation results indicate that for the current mission, with a range-rate accuracy of 2.5 × 10^-7 m/s, observation error is the main source of stripe error. However, when the range-rate accuracy improves to 5.0 × 10^-8 m/s in a future mission, geophysical model error will become the main source of stripe error, which will limit the accuracy and spatial resolution of temporal gravity field models. Therefore, observation error should be the primary error source taken into account at the current range-rate accuracy level, while more attention should be paid to improving the accuracy of background geophysical models for future missions.

  15. Inter-model analysis of tsunami-induced coastal currents

    NASA Astrophysics Data System (ADS)

    Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.

    2017-06-01

    To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.

  16. Workshop on Current Issues in Predictive Approaches to Intelligence and Security Analytics: Fostering the Creation of Decision Advantage through Model Integration and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.

    2010-05-23

    The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.

  17. A novel trauma leadership model reflective of changing times.

    PubMed

    DʼHuyvetter, Cecile; Cogbill, Thomas H

    2014-01-01

    As a result of generational changes in the health care workforce, we sought to evaluate our current Trauma Medical Director Leadership model. We assessed the responsibilities, accountability, time requirements, cost, and provider satisfaction with the current leadership model. Three new providers who had recently completed fellowship training were hired, each with unique professional desires, skill sets, and experience. Our goal was to establish a comprehensive, cost-effective, accountable leadership model that enabled provider satisfaction and equalized leadership responsibilities. A 3-pronged team model was established with a Medical Director title and responsibilities rotating per the American College of Surgeons verification cycle to develop leadership skills and lessen hierarchical differences.

  18. An Australasian model license reassessment procedure for identifying potentially unsafe drivers.

    PubMed

    Fildes, Brian N; Charlton, Judith; Pronk, Nicola; Langford, Jim; Oxley, Jennie; Koppel, Sjaanie

    2008-08-01

    Most licensing jurisdictions in Australia currently employ age-based assessment programs as a means to manage older driver safety, yet available evidence suggests that these programs have no safety benefits. This paper describes a community referral-based model license re-assessment procedure for identifying and assessing potentially unsafe drivers. While the model was primarily developed for assessing older drivers' fitness to drive, it could be applicable to other forms of driver impairment associated with increased crash risk. It includes a three-tier process of assessment, involving the use of validated and relevant assessment instruments. A case is made that this is a more systematic, transparent and effective process for managing older driver safety, and thus more likely to be widely acceptable to the target community and licensing authorities, than age-based practices.

  19. The atmospheric effects of stratospheric aircraft: A topical review

    NASA Technical Reports Server (NTRS)

    Johnston, Harold S.; Prather, M. J.; Watson, R. T.

    1991-01-01

    In the late 1960s the aircraft industry became interested in developing a fleet of supersonic transports (SSTs). Between 1972 and 1975, the Climatic Impact Assessment Program (CIAP) studied the possible environmental impact of SSTs. For environmental and economic reasons, the fleet of SSTs was not developed. The Upper Atmosphere Research Program (UARP) has recently undertaken the responsibility of directing scientific research needed to assess the atmospheric impact of supersonic transports. The UARP and the High-Speed Research Program asked Harold Johnston to review the current understanding of aircraft emissions and their effect on the stratosphere. Johnston and his colleagues have recently re-examined the SST problem using current models for stratospheric ozone chemistry. A unique view is given here of the current scientific issues and the lessons learned since the beginning of CIAP, and it links the current research program with the assessment process that began two years ago.

  20. Modeling cumulative dose and exposure duration provided insights regarding the associations between benzodiazepines and injuries.

    PubMed

    Abrahamowicz, Michal; Bartlett, Gillian; Tamblyn, Robyn; du Berger, Roxane

    2006-04-01

    Accurate assessment of medication impact requires modeling the cumulative effects of exposure duration and dose; however, postmarketing studies usually represent medication exposure by baseline or current use only. We propose new methods for modeling various aspects of medication use history and employ them to assess the adverse effects of selected benzodiazepines. Time-dependent measures of cumulative dose or duration of use, with past exposures weighted by recency, were proposed. These measures were then included in alternative versions of the multivariable Cox model to analyze the risk of fall-related injuries among elderly new users of three benzodiazepines (nitrazepam, temazepam, and flurazepam) in Quebec. Akaike's information criterion (AIC) was used to select the most predictive model for a given benzodiazepine. The best-fitting model included a combination of cumulative duration and current dose for temazepam, and cumulative dose for flurazepam and nitrazepam, with different weighting functions. The window of clinically relevant exposure was shorter for flurazepam than for the two other products. Careful modeling of medication exposure history may enhance our understanding of the mechanisms underlying adverse effects.
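    The weighted-cumulative-exposure idea can be sketched as follows. The exponential recency weight and the dosing history here are illustrative assumptions (the paper estimates product-specific weighting functions from the data), but the structure — past doses down-weighted by recency and summed into one time-dependent covariate for the Cox model — is the one described:

```python
import math

def wce(doses, weight, t):
    """Weighted cumulative exposure at time t: every past dose is
    down-weighted by its recency (age = t - day) and summed."""
    return sum(weight(t - day) * dose for day, dose in doses if day <= t)

def exp_weight(half_life):
    """Exponential recency weight with a given half-life (days)."""
    lam = math.log(2.0) / half_life
    return lambda age: math.exp(-lam * age)

# hypothetical dosing history: (day, dose in mg)
history = [(0, 15.0), (1, 15.0), (2, 30.0), (10, 15.0)]
w = exp_weight(half_life=7.0)
x_day3 = wce(history, w, t=3)    # time-dependent covariate shortly after dosing
x_day30 = wce(history, w, t=30)  # same subject, long after the last dose
```

    At each event time, such a covariate is re-evaluated for every subject at risk, so recent heavy use contributes far more to the hazard than distant use.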

  1. An Assessment of the Department of Education's Approach and Model for Analyzing Lender Profitability.

    ERIC Educational Resources Information Center

    Jenkins, Sarah; And Others

    An assessment was done of the Department of Education's (ED) approach to determining lender profitability for Guaranteed Student Loans. The assessment described the current net present value (NPV) method as well as discussing its strengths and weaknesses. The NPV method has been widely accepted for determining the profitability of different…

  2. Do Australian Football players have sensitive groins? Players with current groin pain exhibit mechanical hyperalgesia of the adductor tendon.

    PubMed

    Drew, Michael K; Lovell, Gregory; Palsson, Thorvaldur S; Chiarelli, Pauline E; Osmotherly, Peter G

    2016-10-01

    This is the first study to evaluate the mechanical sensitivity, clinical classifications and prevalence of groin pain in Australian football players. Case-control. Professional (n=66) and semi-professional (n=9) Australian football players with and without current or previous groin injuries were recruited. Diagnoses were mapped to the Doha Agreement taxonomy. Point and career prevalence of groin pain was calculated. Pressure pain thresholds (PPTs) were assessed at regional and distant sites using handheld pressure algometry across four sites bilaterally (adductor longus tendon, pubic bone, rectus femoris, tibialis anterior muscle). To assess the relationship between current groin pain and the fixed effects of hyperalgesia at each site and a history of groin pain, a mixed-effect logistic regression model was utilised. A Receiver Operating Characteristic (ROC) curve was determined for the model. Point prevalence of groin pain in the preseason was 21.9%, with a career prevalence of 44.8%. Adductor-related groin pain was the most prevalent classification in the pre-season period. Hyperalgesia was observed at the adductor longus tendon site in athletes with current groin pain (OR=16.27, 95% CI 1.86 to 142.02). The ROC area under the curve of the regression model was fair (AUC=0.76, 95% CI 0.54 to 0.83). Prevalence data indicate that groin pain is a larger issue than published incidence rates imply. Adductor-related groin pain is the most common diagnosis in pre-season in this population. This study has shown that hyperalgesia exists in Australian football players experiencing groin pain, indicating the value of assessing mechanical pain sensitivity as a component of the clinical assessment. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
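    The reported ROC area under the curve has a simple probabilistic reading: it is the probability that a randomly chosen case scores higher under the model than a randomly chosen control. A minimal sketch of that Mann-Whitney form of the AUC, with hypothetical scores rather than the study's data:

```python
def roc_auc(case_scores, control_scores):
    """AUC in its Mann-Whitney form: the fraction of case/control pairs
    in which the case outscores the control (ties count one half)."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# hypothetical predicted probabilities of current groin pain
cases = [0.9, 0.8, 0.75, 0.4]
controls = [0.7, 0.5, 0.3, 0.2, 0.1]
auc = roc_auc(cases, controls)   # 18 of 20 pairs are ordered correctly
```

    On this reading, the study's AUC of 0.76 means the model ranks a random athlete with current groin pain above a random pain-free athlete about three times out of four.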

  3. A Comparison of Career-Related Assessment Tools/Models. Final [Report].

    ERIC Educational Resources Information Center

    WestEd, San Francisco, CA.

    This document contains charts that evaluate career related assessment items. Chart categories include: Purpose/Current Uses/Format; Intended Population; Oregon Career Related Learning Standards Addressed; Relationship to the Standards; Relationship to Endorsement Area Frameworks; Evidence of Validity; Evidence of Reliability; Evidence of Fairness…

  4. Risk Assessment in Child Sexual Abuse Cases

    ERIC Educational Resources Information Center

    Levenson, Jill S.; Morin, John W.

    2006-01-01

    Despite continuing improvements in risk assessment for child protective services (CPS) and movement toward actuarial prediction of child maltreatment, current models have not adequately addressed child sexual abuse. Sexual abuse cases present unique and ambiguous indicators to the investigating professional, and risk factors differ from those…

  5. Comparison of Aerodynamic Resistance Parameterizations and Implications for Dry Deposition Modeling

    EPA Science Inventory

    Nitrogen deposition data used to support the secondary National Ambient Air Quality Standards and critical loads research derives from both measurements and modeling. Data sets with spatial coverage sufficient for regional scale deposition assessments are currently generated fro...

  6. Uncertainties in Emissions In Emissions Inputs for Near-Road Assessments

    EPA Science Inventory

    Emissions, travel demand, and dispersion models are all needed to obtain temporally and spatially resolved pollutant concentrations. Current methodology combines these three models in a bottom-up approach based on hourly traffic and emissions estimates, and hourly dispersion conc...

  7. MODEL HARMONIZATION POTENTIAL AND BENEFITS

    EPA Science Inventory

    The IPCS Harmonization Project, which is currently ongoing under the auspices of the WHO, in the context of chemical risk assessment or exposure modeling, does not imply global standardization. Instead, harmonization is thought of as an effort to strive for consistency among appr...

  8. Current status and future needs of the BehavePlus Fire Modeling System

    Treesearch

    Patricia L. Andrews

    2014-01-01

    The BehavePlus Fire Modeling System is among the most widely used systems for wildland fire prediction. It is designed for use in a range of tasks including wildfire behaviour prediction, prescribed fire planning, fire investigation, fuel hazard assessment, fire model understanding, communication and research. BehavePlus is based on mathematical models for fire...

  9. Design and realization of assessment software for DC-bias of transformers

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Liu, Lian-guang; Yuan, Zhong-chen

    2013-03-01

    A transformer operating under DC bias will become partially saturated even at its rated state, and its magnetizing current will be distorted, containing various harmonics, increasing reactive power demand and causing other associated phenomena that threaten the safe operation of the power grid. This paper establishes a transformer saturation circuit model of DC bias, based on the duality principle and on Jiles-Atherton (J-A) theory, which can reflect the hysteresis characteristics of the iron core, and develops software that can assess the effects of transformer DC bias, using hybrid C#.NET and MATLAB programming on the Microsoft .NET platform. The software is able to simulate the magnetizing current of different transformer structures and to assess the saturation level of transformers and the influence of the associated phenomena according to the transformer parameters and the DC equivalent voltage. It provides an effective method for assessing the influence on transformers of magnetic storm disasters and of the earthing current of HVDC projects.
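    The Jiles-Atherton approach mentioned above describes hysteresis through an anhysteretic magnetization curve plus an irreversible relaxation toward it. A much-simplified sketch (reversible component dropped, illustrative parameter values, explicit Euler stepping) of how such a model traces M(H) under a DC-biased field:

```python
import math

def langevin(x):
    """Anhysteretic Langevin function; series form avoids 0/0 near zero."""
    return x / 3.0 if abs(x) < 1e-4 else 1.0 / math.tanh(x) - 1.0 / x

def ja_curve(h_seq, Ms=1.6e6, a=1100.0, alpha=1e-4, k=400.0):
    """Trace M(H) along an applied-field sequence with a simplified
    Jiles-Atherton model (reversible component dropped)."""
    M, out = 0.0, []
    for i in range(1, len(h_seq)):
        H, dH = h_seq[i], h_seq[i] - h_seq[i - 1]
        delta = 1.0 if dH >= 0 else -1.0     # branch sign from the field direction
        He = H + alpha * M                   # effective field
        Man = Ms * langevin(He / a)          # anhysteretic magnetization
        if (Man - M) * dH >= 0:              # irreversible change moves toward Man only
            M += (Man - M) / (delta * k - alpha * (Man - M)) * dH
        out.append((H, M))
    return out

# DC-biased excitation: one cycle of H(t) = H_dc + H_ac * sin(2*pi*t/n)
n, H_dc, H_ac = 2000, 300.0, 1500.0
h_seq = [H_dc + H_ac * math.sin(2.0 * math.pi * t / n) for t in range(n + 1)]
loop = ja_curve(h_seq)
magnetizations = [m for _, m in loop]
```

    The DC offset shifts the operating loop toward one saturation knee, which is the asymmetry that distorts the magnetizing current and generates the even harmonics the assessment software quantifies.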

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeler, D.; Edwards, T.

    High-level waste (HLW) throughput (i.e., the amount of waste processed per unit of time) is primarily a function of two critical parameters: waste loading (WL) and melt rate. For the Defense Waste Processing Facility (DWPF), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). Significant increases in waste throughput have been achieved at DWPF since initial radioactive operations began in 1996. Key technical and operational initiatives that supported increased waste throughput included improvements in facility attainment, the Chemical Processing Cell (CPC) flowsheet, process control models and frit formulations. As a result of these key initiatives, DWPF increased WLs from a nominal 28% for Sludge Batch 2 (SB2) to approximately 34 to 38% for SB3 through SB6 while maintaining or slightly improving canister fill times. Although considerable improvements in waste throughput have been obtained, future contractual waste loading targets are nominally 40%, while canister production rates are also expected to increase (to a rate of 325 to 400 canisters per year). Although the implementation of bubblers has made a positive impact on increasing melt rate for recent sludge batches targeting WLs in the mid-30s, higher WLs will ultimately make the feeds to DWPF more challenging to process. Savannah River Remediation (SRR) recently requested the Savannah River National Laboratory (SRNL) to perform a paper study assessment using future sludge projections to evaluate whether the current Process Composition Control System (PCCS) algorithms would provide projected operating windows that allow future contractual WL targets to be met. More specifically, the objective of this study was to evaluate future sludge batch projections (based on Revision 16 of the HLW Systems Plan) with respect to projected operating windows using current PCCS models and associated constraints.
Based on these assessments, the waste loading interval over which a glass system (i.e., a projected sludge composition with a candidate frit) is predicted to be acceptable can be defined (the projected operating window), providing insight into the ability to meet future contractual WL obligations. In this study, future contractual WL obligations are assumed to be 40%, which is the goal after all flowsheet enhancements have been implemented to support DWPF operations. For a system to be considered acceptable, candidate frits must be identified that provide access to at least 40% WL while accounting for potential variation in the sludge resulting from differences in batch-to-batch transfers into the Sludge Receipt and Adjustment Tank (SRAT) and/or analytical uncertainties. In more general terms, this study will assess whether or not the current glass formulation strategy (based on the use of the Nominal and Variation Stage assessments) and current PCCS models will allow access to the compositional regions required to target higher WLs for future operations. Some of the key questions to be considered in this study include: (1) If higher WLs are attainable with current process control models, are the models valid in these compositional regions? If the higher-WL glass regions are outside current model development or validation ranges, is there existing data that could be used to demonstrate model applicability (or lack thereof)? If not, experimental data may be required to revise current models or serve as validation data for the existing models. (2) Are there compositional trends in frit space that are required by the PCCS models to obtain access to these higher WLs? If so, are there potential issues with the compositions of the associated frits (e.g., limitations on the B2O3 and/or Li2O concentrations) as they are compared to model development/validation ranges or to the term 'borosilicate' glass?
If limitations on the frit compositional range are realized, what is the impact of these restrictions on other glass properties, such as the ability to suppress nepheline formation or influence melt rate? The model-based assessments being performed assume that the process control models are applicable over the glass compositional regions being evaluated. Although the glass compositional region of interest is ultimately defined by the specific frit, sludge, and WL interval used, there is no prescreening of these compositional regions with respect to the model development or validation ranges, which is consistent with current DWPF operations.

  11. Current Trends in Distance Education: An Administrative Model

    ERIC Educational Resources Information Center

    Compora, Daniel P.

    2003-01-01

    Current practices and procedures of distance education programs at selected institutions in higher education in Ohio were studied. Relevant data was found in the areas of: (1) content of the distance education program's mission statement; (2) needs assessment procedures; (3) student demographics; (4) course acquisition, development, and evaluation…

  12. Reflective Practice in Healthcare Education: An Umbrella Review

    ERIC Educational Resources Information Center

    Fragkos, Konstantinos C.

    2016-01-01

    Reflection in healthcare education is an emerging topic with many recently published studies and reviews. This current systematic review of reviews (umbrella review) of this field explores the following aspects: which definitions and models are currently in use; how reflection impacts design, evaluation, and assessment; and what future challenges…

  13. Development of a Multi-Hazard Landscape for Exposure and Risk Interpretation

    EPA Science Inventory

    A complete accounting of potential hazard exposures is critical in the development of any model meant to depict the resilience of a system. This allows for a clear ledger to both assess current risk status along with potential ways to improve resilience. The US EPA is currently...

  14. Thinking outside the boxes: Using current reading models to assess and treat developmental surface dyslexia.

    PubMed

    Law, Caroline; Cupples, Linda

    2017-03-01

    Improving the reading performance of children with developmental surface dyslexia has proved challenging, with limited generalisation of reading skills typically reported after intervention. The aim of this study was to provide tailored, theoretically motivated intervention to two children with developmental surface dyslexia. Our objectives were to improve their reading performance, and to evaluate the utility of current reading models in therapeutic practice. Detailed reading and cognitive profiles for two male children with developmental surface dyslexia were compared to the results obtained by age-matched control groups. The specific area of single-word reading difficulty for each child was identified within the dual route model (DRM) of reading, following which a theoretically motivated intervention programme was devised. Both children showed significant improvements in single-word reading ability after training, with generalisation effects observed for untrained words. However, the assessment and intervention results also differed for each child, reinforcing the view that the causes and consequences of developmental dyslexia, even within subtypes, are not homogeneous. Overall, the results of the interventions corresponded more closely with the DRM than other current reading models, in that real word reading improved in the absence of enhanced nonword reading for both children.

  15. Service quality assessment of workers compensation health care delivery programs in New York using SERVQUAL.

    PubMed

    Arunasalam, Mark; Paulson, Albert; Wallace, William

    2003-01-01

    Preferred provider organizations (PPOs) provide healthcare services to an expanding proportion of the U.S. population. This paper presents a programmatic assessment of service quality in the workers' compensation environment using two different models: the PPO program model and the fee-for-service (FFS) payor model. The methodology used here augments currently available research in workers' compensation, which has been lacking in measuring service quality determinants and in assessing the programmatic success or failure of managed-care-type programs. Results indicated that SERVQUAL provided a reliable and valid service quality assessment tool, and suggested that PPO marketers should focus on promoting physician outreach (to show empathy) and accessibility (to show reliability) for injured workers.
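    SERVQUAL scores service quality as the gap between perceptions and expectations across five dimensions (tangibles, reliability, responsiveness, assurance, empathy). A minimal sketch of the gap-score computation, with hypothetical Likert-scale item scores rather than the study's data:

```python
DIMENSIONS = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]

def servqual_gaps(expectations, perceptions):
    """Per-dimension gap scores, mean(perception - expectation) over items;
    negative gaps flag service-quality shortfalls."""
    gaps = {}
    for dim in DIMENSIONS:
        e_items, p_items = expectations[dim], perceptions[dim]
        gaps[dim] = sum(p - e for p, e in zip(p_items, e_items)) / len(e_items)
    return gaps

# hypothetical 7-point Likert item scores for one respondent group
E = {d: [6.5, 6.0, 6.2] for d in DIMENSIONS}
P = {"tangibles": [6.0, 5.8, 6.1], "reliability": [5.0, 5.2, 5.1],
     "responsiveness": [5.5, 5.6, 5.4], "assurance": [6.0, 6.1, 6.2],
     "empathy": [4.8, 5.0, 4.9]}
gaps = servqual_gaps(E, P)
worst = min(gaps, key=gaps.get)   # dimension with the largest shortfall
```

    Ranking the dimensions by gap is what lets a program identify where to focus, as the study does in singling out empathy and reliability for PPO marketers.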

  16. Assessment in health care education - modelling and implementation of a computer supported scoring process.

    PubMed

    Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil

    2012-01-01

    Research on computer-supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports steps taken to develop a model for a computer-supported scoring process that focuses on optimizing a task previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on grader reliability, and a more time-efficient process are examples of the observed benefits. Computer-supported scoring will increase the quality of assessment results.

  17. The Value of Information and Geospatial Technologies for the analysis of tidal current patterns in the Guanabara Bay (Rio de Janeiro)

    NASA Astrophysics Data System (ADS)

    Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime

    2016-04-01

    The study and validation of tidal current patterns relies on the combination of several data sources, such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. Assessing the accuracy and reliability of the produced patterns and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide diffusion of geospatial equipment such as GPS, current drifters and aerial photogrammetry allows data to be collected in the field using mobile and portable devices with relatively limited effort in terms of time and economic resources. These real-time measurements are essential in order to validate the models, and specifically to assess the skill of the models during critical environmental conditions. Moreover, considerable developments in remote sensing technologies, cartographic services and GPS applications have enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This contribution of information and geospatial technologies can benefit many decision-makers, including high-level athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Therefore the integration of data collected in the field into a GIS allows an immediate visualization of the performed analysis in geographic maps. This visualization is a particularly effective way to communicate current pattern assessment results and the uncertainty in the information, increasing the level of confidence in the forecast.
The aim of this paper is to present the methodology, set up in collaboration with the Austrian Sailing Federation, for the study of the tidal current patterns of Guanabara Bay, venue for the sailing competitions of the Rio 2016 Olympic Games. The methodology relies on the integration into a GIS of a substantial amount of data collected in the field, hydrodynamic model output, cartography and "key signs" visible on the water, proving particularly useful for simplifying the final information, supporting the learning process and improving decision making.

  18. Forecasting of wet snow avalanche activity: Proof of concept and operational implementation

    NASA Astrophysics Data System (ADS)

    Gobiet, Andreas; Jöbstl, Lisa; Rieder, Hannes; Bellaire, Sascha; Mitterer, Christoph

    2017-04-01

    State-of-the-art tools for the operational assessment of avalanche danger include field observations, recordings from automatic weather stations, meteorological analyses and forecasts, and recently also indices derived from snowpack models. In particular, an index for identifying the onset of wet-snow avalanche cycles (the LWCindex) has been demonstrated to be useful. However, its value for operational avalanche forecasting is currently limited, since detailed, physically based snowpack models are usually driven by meteorological data from automatic weather stations only and therefore have no prognostic ability. Since avalanche risk management relies heavily on timely information and early warnings, many avalanche services in Europe nowadays issue forecasts for the following days instead of the traditional assessment of the current avalanche danger. In this context, the prognostic operation of detailed snowpack models has recently been the objective of extensive research. In this study a new, observationally constrained setup for forecasting the onset of wet-snow avalanche cycles with the detailed snow cover model SNOWPACK is presented and evaluated. Based on data from weather stations and different numerical weather prediction models, we demonstrate that forecasts of the LWCindex as an indicator of wet-snow avalanche cycles can be useful for operational warning services, but are not yet reliable enough to be used as a single warning tool without considering other factors. Therefore, further development currently focuses on improving the forecasts by applying ensemble techniques and suitable post-processing approaches to the output of numerical weather prediction models. In parallel, the prognostic meteo-snow model chain has been used operationally, for the first time, by two regional avalanche warning services in Austria since winter 2016/2017. Experiences from the first operational season and first results from current model developments will be reported.

  19. Kuiper Prize Lecture - Present and past climates of the terrestrial planets

    NASA Technical Reports Server (NTRS)

    Pollack, James B.

    1991-01-01

    An evaluation is undertaken of the current understanding of the factors shaping the present climates of Venus, Mars, and the Earth, in conjunction with the ways in which these planetary climates may have differed in the past. Attention is given to modeling approaches of various levels of sophistication which both characterize current climates and elucidate prior climatic epochs; these are assessed in light of observational data in order to judge degrees of success thus far and to formulate the major remaining questions for future investigations. Venus is noted to offer excellent opportunities for modeling the greenhouse effect.

  20. Operational oil spill trajectory modelling using HF radar currents: A northwest European continental shelf case study.

    PubMed

    Abascal, Ana J; Sanchez, Jorge; Chiri, Helios; Ferrer, María I; Cárdenas, Mar; Gallego, Alejandro; Castanedo, Sonia; Medina, Raúl; Alonso-Martirena, Andrés; Berx, Barbara; Turrell, William R; Hughes, Sarah L

    2017-06-15

    This paper presents a novel operational oil spill modelling system based on HF radar currents, implemented in a northwest European shelf sea. The system integrates Open-boundary Modal Analysis (OMA), a Short-Term Prediction System (STPS) and an oil spill model to simulate oil spill trajectories. A set of 18 buoys was used to assess the accuracy of the system for trajectory forecasting and to evaluate the benefits of HF radar data compared to the use of currents from a hydrodynamic model (HDM). The results showed that simulated trajectories using OMA currents were more accurate than those obtained using an HDM. After 48 h, the mean error was reduced by 40%. The forecast skill of the STPS method was valid up to 6 h ahead. The analysis performed shows the benefits of HF radar data for operational oil spill modelling, which could easily be implemented in other regions with HF radar coverage. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
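    Trajectory skill of the kind reported above is typically quantified as the mean separation between simulated and observed (buoy) positions. A sketch under that assumption; the three tracks below are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def mean_separation_km(forecast, observed):
    """Mean separation between a simulated and an observed (buoy) track."""
    dists = [haversine_km(f[0], f[1], o[0], o[1]) for f, o in zip(forecast, observed)]
    return sum(dists) / len(dists)

# Hypothetical tracks (lat, lon): buoy vs. radar-driven and HDM-driven runs
buoy = [(58.0, -3.0), (58.1, -2.9), (58.2, -2.8)]
oma  = [(58.0, -3.0), (58.12, -2.88), (58.23, -2.79)]
hdm  = [(58.0, -3.0), (58.2, -2.7), (58.4, -2.5)]
err_oma, err_hdm = mean_separation_km(oma, buoy), mean_separation_km(hdm, buoy)
print(f"error reduction: {100 * (1 - err_oma / err_hdm):.0f}%")
```

    The same separation metric, averaged over all 18 buoys, would yield the kind of error-reduction figure quoted in the abstract.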

  1. Coloring of the past via respondent's current psychological state, mediation, and the association between childhood disadvantage and morbidity in adulthood.

    PubMed

    Sheikh, Mashhood Ahmed

    2018-05-31

    Many researchers view retrospective reports with skepticism. Indeed, the observed association between retrospectively reported childhood disadvantage (CD) and morbidity in adulthood has been criticized as an artefactual correlation driven by the psychological state of the respondent at the time of reporting (current psychological state). The aim of this study was to assess the role of current psychological state in the association between childhood disadvantage and morbidity in adulthood. The present analysis used cross-sectional data collected in 2007-2008 within the framework of the Tromsø Study (N = 10,765), a representative study of adult men and women in Norway. The association between CD and the physical health outcomes heart attack, angina pectoris, chronic bronchitis/emphysema/COPD, diabetes mellitus, hypothyroid/low metabolism, migraine, hypertension, and comorbidity (i.e., the sum of these physical health outcomes) was assessed with Poisson regression models. Relative risks (RR) and 95% confidence intervals (CI) were estimated. A wide range of indicators of respondents' current psychological state was included in the models to assess the percentage attenuation in estimates. CD was associated with an increased risk of heart attack, angina pectoris, chronic bronchitis/emphysema/COPD, diabetes mellitus, hypothyroid/low metabolism, migraine, hypertension, and comorbidity (p < 0.05), independent of respondents' current psychological state. A sizeable proportion (23-42%) of the association between CD and physical health outcomes was driven by recall bias or mediation via respondents' current psychological state. Controlling for indicators of current psychological state reduced the strength of associations between CD and physical health outcomes; however, the independent associations remained in the same direction. The association between retrospectively reported CD and physical health outcomes in adulthood is therefore not driven entirely by the respondent's current psychological state. Copyright © 2018 Elsevier Ltd. All rights reserved.
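    In a Poisson model with a log link, exp(coefficient) gives the rate ratio, and attenuation after adjustment is commonly computed from the excess risk. A sketch with illustrative coefficients (not values from the study):

```python
import math

def rr_from_poisson_coef(beta):
    """exp(beta) is the rate ratio (interpreted as relative risk, RR)
    for a binary exposure in a Poisson regression with log link."""
    return math.exp(beta)

def pct_attenuation(rr_crude, rr_adjusted):
    """Percent of the excess risk removed by adjustment:
    100 * (RR_crude - RR_adjusted) / (RR_crude - 1)."""
    return 100 * (rr_crude - rr_adjusted) / (rr_crude - 1)

rr_crude = rr_from_poisson_coef(0.47)   # RR before adjustment, ~1.60
rr_adj = rr_from_poisson_coef(0.34)     # RR after adjustment, ~1.40
print(round(pct_attenuation(rr_crude, rr_adj), 1))  # → 32.5
```

    A value in this range would fall inside the 23-42% attenuation band reported in the abstract; the coefficients themselves are hypothetical.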

  2. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    PubMed

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  3. Locally adaptive, spatially explicit projection of US population for 2030 and 2050.

    PubMed

    McKee, Jacob J; Rose, Amy N; Bright, Edward A; Huynh, Timmy; Bhaduri, Budhendra L

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Building on the spatial interpolation technique previously developed for high-resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically informed spatial distribution of the projected population of the contiguous United States for 2030 and 2050, depicting one of many possible population futures. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modeled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood of future population change. County-level population projections were developed using a modified version of the US Census Bureau's projection methodology, with the Bureau's official projection as the benchmark. Applications of our model include incorporating various scenario-driven events to produce a range of spatially explicit population futures for suitability modeling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
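    The locally weighted surface described above can be sketched as a per-cell combination of normalized driver layers. All layers, weights, and values here are invented for illustration and are not the LandScan model:

```python
def normalize(layer):
    """Min-max scale a list of grid-cell values to [0, 1]."""
    lo, hi = min(layer), max(layer)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in layer]

def growth_likelihood(layers, weights):
    """Weighted sum of normalized layers, one score per grid cell."""
    norm = [normalize(l) for l in layers]
    return [sum(w * n[i] for w, n in zip(weights, norm)) for i in range(len(layers[0]))]

# Illustrative 4-cell layers: proximity to cities, flatness, current population
layers = [[0.9, 0.4, 0.1, 0.7], [1.0, 0.8, 0.2, 0.5], [500, 200, 10, 300]]
weights = [0.4, 0.2, 0.4]
print(growth_likelihood(layers, weights))
```

    In the study itself the weights are locally adaptive and geographically varying; a fixed weight vector is the simplest possible stand-in.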

  4. Assessment of Rip-Current Hazards Using Alongshore Topographic Anisotropy at Bondi Beach, Australia

    NASA Astrophysics Data System (ADS)

    Hartman, K.; Trimble, S. M.; Bishop, M. P.; Houser, C.

    2016-12-01

    Rip currents are relatively high-velocity flows of water away from the beach that are common in coastal environments. As beach morphology adapts to sediment fluxes and wave climate, it is essential to be able to assess rip-current hazard conditions and to characterize the scale-dependent bathymetric morphology that governs the extent and magnitude of a rip current. Consequently, our primary objective is to assess the alongshore distribution of topographic anisotropy in order to identify rip-current hazard locations. Specifically, we utilized multi-band satellite imagery to generate a bathymetric digital elevation model (DEM) for Bondi Beach, Australia, and collected field data to support our analysis. Scale-dependent spatial analysis of the DEM was conducted to assess the directional dependence of topographic relief, the magnitude of topographic anisotropy, and the degree of anisotropic symmetry. We displayed anisotropy parameters as images and false-color composites to visualize morphological conditions associated with rip channels. Our preliminary results indicate that rip channels generally have a higher anisotropy index and a more orthogonal orientation than dissipative or reflective sections of the beach. Scale-dependent variations in anisotropy can be used to assess the spatial extent of rip currents. Furthermore, well-defined rip channels exhibit positive symmetry, while variations in the distribution of symmetry reflect alongshore variations in sediment flux. These results clearly show that a well-developed rip channel can be identified and assessed using topographic anisotropy, as its scale-dependent anisotropy patterns are distinct from those of the surrounding bathymetry and terrain. In this way, it is possible to evaluate the alongshore distribution of rip currents. Alongshore topographic anisotropy data will be extremely important as input to hazard assessment studies and the development of hazard decision support systems.

  5. An occurrence model for the national assessment of volcanogenic beryllium deposits

    USGS Publications Warehouse

    Foley, Nora K.; Seal, Robert R.; Piatak, Nadine M.; Hetland, Brianna

    2010-01-01

    The general occurrence model summarized here is intended to provide a descriptive basis for the identification and assessment of undiscovered beryllium deposits of a type and style similar to those found at Spor Mountain, Juab County, Utah. The assessment model is restricted in its application in order to provide a coherent basis for assessing the probability of the occurrence of similar economic deposits using the current U.S. Geological Survey methodology. The model is intended to be used to identify tracts of land where volcanogenic epithermal replacement-type beryllium deposits hosted by metaluminous to peraluminous rhyolite are most likely to occur. Only a limited number of deposits or districts of this type are known, and only the ores of the Spor Mountain district have been studied in detail. The model highlights those distinctive aspects and features of volcanogenic epithermal beryllium deposits that pertain to the development of assessment criteria and puts forward a baseline analysis of the geoenvironmental consequences of mining deposits of this type.

  6. Current Advances and Future Directions in Behavior Assessment

    ERIC Educational Resources Information Center

    Riley-Tillman, T. Chris; Johnson, Austin H.

    2017-01-01

    Multi-tiered problem-solving models that focus on promoting positive outcomes for student behavior continue to be emphasized within educational research. Although substantial work has been conducted to support systems-level implementation and intervention for behavior, concomitant advances in behavior assessment have been limited. This is despite…

  7. The Development of a Model of Culturally Responsive Science and Mathematics Teaching

    ERIC Educational Resources Information Center

    Hernandez, Cecilia M.; Morales, Amanda R.; Shroyer, M. Gail

    2013-01-01

    This qualitative theoretical study was conducted in response to the current need for an inclusive and comprehensive model to guide the preparation and assessment of teacher candidates for culturally responsive teaching. The process of developing a model of culturally responsive teaching involved three steps: a comprehensive review of the…

  8. A Markovian model for assessment of personnel hiring plans

    NASA Technical Reports Server (NTRS)

    Katz, L. G.

    1974-01-01

    As a result of the current economic environment, many organizations are having to operate with fewer resources. In the manpower area, these constraints have forced organizations to operate within well-defined hiring plans. Exceeding personnel ceilings is in most cases an intolerable situation. A mathematical model, based on the theory of Markov processes, is presented which can be used to assess the chances of success of personnel hiring plans. The model considers a plan to be successful if the final population size, at the end of the planning period, lies within a range specified by management. Although this model was developed to assess personnel hiring plans at the Goddard Space Flight Center, it is directly applicable wherever personnel hiring plans are used.
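    The Markov-process view can be sketched as a headcount chain: each period every employee independently stays or leaves, planned hires are added, and the plan succeeds if the final headcount falls within a management-specified range. A Monte Carlo sketch (all rates and targets are illustrative, not from the Goddard application):

```python
import random

def simulate_plan(n0, hires_per_period, attrition, periods, lo, hi,
                  trials=5_000, seed=42):
    """Estimate the probability that final headcount lies in [lo, hi].
    Attrition is an independent per-person leave probability, which
    makes headcount a Markov chain (binomial survival plus hires)."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        n = n0
        for _ in range(periods):
            survivors = sum(1 for _ in range(n) if rng.random() > attrition)
            n = survivors + hires_per_period
        if lo <= n <= hi:
            successes += 1
    return successes / trials

p = simulate_plan(n0=100, hires_per_period=8, attrition=0.08,
                  periods=4, lo=95, hi=110)
print(f"P(plan succeeds) ~ {p:.2f}")
```

    The original model computes this probability analytically from the Markov transition matrix; simulation is used here only to keep the sketch short.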

  9. Maternal hypothyroidism: An overview of current experimental models.

    PubMed

    Ghanbari, Mahboubeh; Ghasemi, Asghar

    2017-10-15

    Maternal hypothyroidism (MH) is the most common cause of transient congenital hypothyroidism. Different animal models are used for assessing the developmental effects of MH in offspring. The severity and status of hypothyroidism in animal models must reflect the actual conditions in humans. To obtain results comparable with the different clinical conditions that lead to MH in humans, several factors should be considered before designing the experimental models. Because fetal body systems develop at different times during pregnancy, interference at different times produces different results, and the appropriate time for induction of hypothyroidism should be selected based on the actual developmental timing of the system under assessment. Other factors that should be taken into consideration include physiological and biochemical differences between humans and other species, thyroid hormone-independent effects of anti-thyroid drugs, circadian rhythms in TSH secretion, sex differences, and physical and psychological stress. This review addresses essential guidelines for selecting and managing the optimal animal model for MH and discusses the pros and cons of currently used models. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Improved Mental Acuity Forecasting with an Individualized Quantitative Sleep Model.

    PubMed

    Winslow, Brent D; Nguyen, Nam; Venta, Kimberly E

    2017-01-01

    Sleep impairment significantly alters human brain structure and cognitive function, but available evidence suggests that adults in developed nations are sleeping less. A growing body of research has sought to use sleep to forecast cognitive performance by modeling the relationship between the two, but has generally focused on vigilance rather than other cognitive constructs affected by sleep, such as reaction time, executive function, and working memory. Previous modeling efforts have also utilized subjective, self-reported sleep durations and were restricted to laboratory environments. In the current effort, we addressed these limitations by employing wearable systems and mobile applications to gather objective sleep information, assess multi-construct cognitive performance, and model/predict changes to mental acuity. Thirty participants were recruited for participation in the study, which lasted 1 week. Using the Fitbit Charge HR and a mobile version of the automated neuropsychological assessment metric called CogGauge, we gathered a series of features and utilized the unified model of performance to predict mental acuity based on sleep records. Our results suggest that individuals poorly rate their sleep duration, supporting the need for objective sleep metrics to model circadian changes to mental acuity. Participant compliance in using the wearable throughout the week and responding to the CogGauge assessments was 80%. Specific biases were identified in temporal metrics across mobile devices and operating systems and were excluded from the mental acuity metric development. Individualized prediction of mental acuity consistently outperformed group modeling. This effort indicates the feasibility of creating an individualized, mobile assessment and prediction of mental acuity, compatible with the majority of current mobile devices.

  11. How to use mechanistic effect models in environmental risk assessment of pesticides: Case studies and recommendations from the SETAC workshop MODELINK.

    PubMed

    Hommen, Udo; Forbes, Valery; Grimm, Volker; Preuss, Thomas G; Thorbek, Pernille; Ducrot, Virginie

    2016-01-01

    Mechanistic effect models (MEMs) are useful tools for ecological risk assessment of chemicals to complement experimentation. However, currently no recommendations exist for how to use them in risk assessments. Therefore, the Society of Environmental Toxicology and Chemistry (SETAC) MODELINK workshop aimed at providing guidance for when and how to apply MEMs in regulatory risk assessments. The workshop focused on risk assessment of plant protection products under Regulation (EC) No 1107/2009 using MEMs at the organism and population levels. Realistic applications of MEMs were demonstrated in 6 case studies covering assessments for plants, invertebrates, and vertebrates in aquatic and terrestrial habitats. From the case studies and their evaluation, 12 recommendations on the future use of MEMs were formulated, addressing the issues of how to translate specific protection goals into workable questions, how to select species and scenarios to be modeled, and where and how to fit MEMs into current and future risk assessment schemes. The most important recommendations are that protection goals should be made more quantitative; the species to be modeled must be vulnerable not only regarding toxic effects but also regarding their life history and dispersal traits; the models should be as realistic as possible for a specific risk assessment question, and the level of conservatism required for a specific risk assessment should be reached by designing appropriately conservative environmental and exposure scenarios; scenarios should include different regions of the European Union (EU) and different crops; in the long run, generic MEMs covering relevant species based on representative scenarios should be developed, which will require EU-level joint initiatives of all stakeholders involved. 
The main conclusion from the MODELINK workshop is that the considerable effort required for making MEMs an integral part of environmental risk assessment of pesticides is worthwhile, because it will make risk assessments not only more ecologically relevant and less uncertain but also more comprehensive, coherent, and cost effective. © 2015 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.

  12. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  13. Anaerobic co-digestion of municipal food waste and sewage sludge: A comparative life cycle assessment in the context of a waste service provision.

    PubMed

    Edwards, Joel; Othman, Maazuza; Crossin, Enda; Burn, Stewart

    2017-01-01

    This study used life cycle assessment to evaluate the environmental impact of anaerobic co-digestion (AcoD) and compared it against the current waste management system in two case study areas. Results indicated AcoD to have less environmental impact for all categories modelled excluding human toxicity, despite the need to collect and pre-treat food waste separately. Uncertainty modelling confirmed that AcoD has a 100% likelihood of a smaller global warming potential, and for acidification, eutrophication and fossil fuel depletion AcoD carried a greater than 85% confidence of inducing a lesser impact than the current waste service. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  14. Review and comparison between the Wells-Riley and dose-response approaches to risk assessment of infectious respiratory diseases.

    PubMed

    Sze To, G N; Chao, C Y H

    2010-02-01

    Infection risk assessment is very useful in understanding the transmission dynamics of infectious diseases and in predicting the risk of these diseases to the public. Quantitative infection risk assessment can provide quantitative analysis of disease transmission and the effectiveness of infection control measures. The Wells-Riley model has been extensively used for quantitative infection risk assessment of respiratory infectious diseases in indoor premises. Some newer studies have also proposed the use of dose-response models for this purpose. This study reviews and compares these two approaches to infection risk assessment of respiratory infectious diseases. The Wells-Riley model allows quick assessment and does not require interspecies extrapolation of infectivity. Dose-response models can consider other disease transmission routes in addition to the airborne route and can calculate the infectious source strength of an outbreak in terms of the quantity of the pathogen rather than a hypothetical unit. The spatial distribution of airborne pathogens is one of the most important factors in infection risk assessment of respiratory disease. Respiratory deposition of aerosol induces heterogeneous infectivity of intake pathogens and randomness in the intake dose, neither of which is well accounted for in current risk models. Some suggestions for further development of the risk assessment models are proposed. This review article summarizes the strengths and limitations of the Wells-Riley and the dose-response models for risk assessment of respiratory diseases. Even with many efforts by various investigators to develop and modify the risk assessment models, some limitations still persist. This review serves as a reference for further development of infection risk assessment models of respiratory diseases. The Wells-Riley model and the dose-response model each offer specific advantages, and risk assessors can select the approach suited to their particular conditions.
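    The Wells-Riley model referenced throughout this comparison is the classic well-mixed-room equation P = 1 - exp(-Iqpt/Q). A minimal sketch with illustrative parameter values:

```python
import math

def wells_riley_infection_prob(infectors, quanta_rate, pulmonary_rate,
                               time_h, ventilation):
    """Wells-Riley: P = 1 - exp(-I*q*p*t/Q), with I infectors,
    q quanta generation rate (quanta/h), p breathing rate (m^3/h),
    t exposure time (h), and Q room ventilation rate (m^3/h)."""
    return 1 - math.exp(-infectors * quanta_rate * pulmonary_rate
                        * time_h / ventilation)

# Illustrative values: 1 infector, 10 quanta/h, 0.6 m^3/h breathing rate,
# 2 h exposure, 300 m^3/h outdoor-air ventilation
print(round(wells_riley_infection_prob(1, 10, 0.6, 2, 300), 3))  # → 0.039
```

    The "quantum" here is exactly the hypothetical infectious unit the review contrasts with the pathogen-quantity doses used in dose-response models.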

  15. Testing and Analysis of NEXT Ion Engine Discharge Cathode Assembly Wear

    NASA Technical Reports Server (NTRS)

    Domonkos, Matthew T.; Foster, John E.; Soulas, George C.; Nakles, Michael

    2003-01-01

    Experimental and analytical investigations were conducted to predict the wear of the discharge cathode keeper in the NASA Evolutionary Xenon Thruster. The ion current to the keeper was found to be highly dependent upon the beam current, and the average current density to the keeper was nearly identical to that of the NSTAR thruster at comparable beam current densities. The ion current distribution was highly peaked toward the keeper orifice. A deterministic wear assessment predicted keeper-orifice erosion to the same diameter as the cathode tube after processing 375 kg of xenon. A rough estimate of the discharge cathode assembly life limit due to sputtering indicated that the current design exceeds the qualification goal of 405 kg. Probabilistic wear analysis showed that the plasma potential and the sputter yield contributed most to the uncertainty in the wear assessment. It was recommended that fundamental experimental and modeling efforts focus on accurately describing the plasma potential and the sputtering yield.

  16. Validation of Bioreactor and Human-on-a-Chip Devices for Chemical Safety Assessment.

    PubMed

    Rebelo, Sofia P; Dehne, Eva-Maria; Brito, Catarina; Horland, Reyk; Alves, Paula M; Marx, Uwe

    2016-01-01

    Equipment and device qualification and test assay validation in the field of tissue-engineered human organs for substance assessment remain formidable tasks, with only a few successful examples so far. The hurdles seem to increase with the growing complexity of the biological systems emulated by the respective models. Controlled single tissue or organ culture in bioreactors improves organ-specific functions and maintains phenotypic stability for longer periods of time. The reproducibility attained with bioreactor operations is, per se, an advantage for the validation of safety assessment. Regulatory agencies have gradually altered the validation concept from exhaustive "product" characterization to rigorous and detailed process characterization, valuing reproducibility as a standard for validation. "Human-on-a-chip" technologies, which apply micro-physiological systems to combine miniaturized human organ equivalents into functional human micro-organisms in vitro, are nowadays thought to be the most elaborate solution created to date. They target the replacement of the current most complex models: laboratory animals. Therefore, we provide here a road map towards the validation of such "human-on-a-chip" models and the qualification of their respective bioreactor and microchip equipment, along a path currently used for the respective animal models.

  17. An Assessment of Feedback Procedures and Information Provided to Instructors within Computer Managed Learning Environments--Implications for Instruction and Software Redesign.

    ERIC Educational Resources Information Center

    Kotesky, Arturo A.

    Feedback procedures and information provided to instructors within computer managed learning environments were assessed to determine current usefulness and meaningfulness to users, and to present the design of a different instructor feedback instrument. Kaufman's system model was applied to accomplish the needs assessment phase of the study; and…

  18. [Genetically modified food and allergies - an update].

    PubMed

    Niemann, Birgit; Pöting, Annette; Braeuning, Albert; Lampen, Alfonso

    2016-07-01

    Approval by the European Commission is mandatory for placing genetically modified plants on the market as food or feed in member states of the European Union (EU). The approval is preceded by a safety assessment based on the guidance of the European Food Safety Authority (EFSA). The assessment of the allergenicity of genetically modified plants and their newly expressed proteins is an integral part of this assessment process. Guidance documents for the assessment of allergenicity are currently under revision. For this purpose, an expert workshop was held in Brussels on June 17, 2015, where methodological improvements for assessing the coeliac disease-causing properties of proteins, as well as the use of complex models for the in vitro digestion of proteins, were discussed. Using such techniques, a refinement of the current, proven system for assessing the allergenicity of genetically modified plants can be achieved.

  19. Stem cell-derived models to improve mechanistic understanding and prediction of human drug-induced liver injury.

    PubMed

    Goldring, Christopher; Antoine, Daniel J; Bonner, Frank; Crozier, Jonathan; Denning, Chris; Fontana, Robert J; Hanley, Neil A; Hay, David C; Ingelman-Sundberg, Magnus; Juhila, Satu; Kitteringham, Neil; Silva-Lima, Beatriz; Norris, Alan; Pridgeon, Chris; Ross, James A; Young, Rowena Sison; Tagle, Danilo; Tornesi, Belen; van de Water, Bob; Weaver, Richard J; Zhang, Fang; Park, B Kevin

    2017-02-01

    Current preclinical drug testing does not predict some forms of adverse drug reactions in humans. Efforts at improving predictability of drug-induced tissue injury in humans include using stem cell technology to generate human cells for screening for adverse effects of drugs in humans. The advent of induced pluripotent stem cells means that it may ultimately be possible to develop personalized toxicology to determine interindividual susceptibility to adverse drug reactions. However, the complexity of idiosyncratic drug-induced liver injury means that no current single-cell model, whether of primary liver tissue origin, from liver cell lines, or derived from stem cells, adequately emulates what is believed to occur during human drug-induced liver injury. Nevertheless, a single-cell model of a human hepatocyte which emulates key features of a hepatocyte is likely to be valuable in assessing potential chemical risk; furthermore, understanding how to generate a relevant hepatocyte will also be critical to efforts to build complex multicellular models of the liver. Currently, hepatocyte-like cells differentiated from stem cells still fall short of recapitulating the full mature hepatocellular phenotype. Therefore, we convened a number of experts from the areas of preclinical and clinical hepatotoxicity and safety assessment, from industry, academia, and regulatory bodies, to specifically explore the application of stem cells in hepatotoxicity safety assessment and to make recommendations for the way forward. In this short review, we particularly discuss the importance of benchmarking stem cell-derived hepatocyte-like cells to their terminally differentiated human counterparts using defined phenotyping, to make sure the cells are relevant and comparable between labs, and outline why this process is essential before the cells are introduced into chemical safety assessment. (Hepatology 2017;65:710-721). © 2016 by the American Association for the Study of Liver Diseases.

  20. Mirror neurons and imitation: a computationally guided review.

    PubMed

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  1. Modeling of two-phase flow instabilities during startup transients utilizing RAMONA-4B methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paniagua, J.; Rohatgi, U.S.; Prasad, V.

    1996-10-01

The RAMONA-4B code is currently under development for simulating thermal hydraulic instabilities that can occur in Boiling Water Reactors (BWRs) and the Simplified Boiling Water Reactor (SBWR). Because one mission of RAMONA-4B is to simulate SBWR startup transients, where geysering or condensation-induced instability may be encountered, the code needs to be assessed for this application. This paper outlines the results of the assessments of the current version of RAMONA-4B and the modifications necessary for simulating geysering or condensation-induced instability. The tests selected for assessment are the geysering tests performed by Prof. Aritomi (1993).

  2. A fuzzy model for assessing risk of occupational safety in the processing industry.

    PubMed

    Tadic, Danijela; Djapan, Marko; Misita, Mirjana; Stefanovic, Miladin; Milanovic, Dragan D

    2012-01-01

    Managing occupational safety in any kind of industry, especially in processing, is very important and complex. This paper develops a new method for occupational risk assessment in the presence of uncertainties. Uncertain values of hazardous factors and consequence frequencies are described with linguistic expressions defined by a safety management team. They are modeled with fuzzy sets. Consequence severities depend on current hazardous factors, and their values are calculated with the proposed procedure. The proposed model is tested with real-life data from fruit processing firms in Central Serbia.
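    As a rough illustration of the kind of fuzzy arithmetic such a model relies on, the sketch below scores risk from linguistic ratings via triangular fuzzy numbers. The rating scale, triangle parameters, and scoring rule are hypothetical placeholders, not the values defined by the paper's safety management team.

```python
# Illustrative fuzzy risk scoring with triangular fuzzy numbers (TFNs).
# The linguistic scale and triangle vertices below are hypothetical.
RATINGS = {
    "low":    (0.0, 1.5, 3.0),
    "medium": (3.0, 5.0, 7.0),
    "high":   (7.0, 8.5, 10.0),
}

def tfn_multiply(a, b):
    """Component-wise product, a common approximation for multiplying TFNs."""
    return tuple(x * y for x, y in zip(a, b))

def defuzzify(tfn):
    """Centroid (mean of the three vertices) of a triangular fuzzy number."""
    return sum(tfn) / 3.0

def risk_score(severity, frequency):
    """Crisp risk score from linguistic severity and frequency ratings."""
    return defuzzify(tfn_multiply(RATINGS[severity], RATINGS[frequency]))
```

    Defuzzifying the product yields a single crisp number that can be ranked or thresholded, which is what makes the linguistic assessments usable for prioritizing hazards.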

  3. Assessment of Current Process Modeling Approaches to Determine Their Limitations, Applicability and Developments Needed for Long-Fiber Thermoplastic Injection Molded Composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Holbery, Jim; Smith, Mark T.

    2006-11-30

This report describes the status of the current process modeling approaches to predict the behavior and flow of fiber-filled thermoplastics under injection molding conditions. Previously, models have been developed to simulate the injection molding of short-fiber thermoplastics, and an as-formed composite part or component can then be predicted that contains a microstructure resulting from the constituents’ material properties and characteristics as well as the processing parameters. Our objective is to assess these models in order to determine their capabilities and limitations, and the developments needed for long-fiber injection-molded thermoplastics (LFTs). First, the concentration regimes are summarized to facilitate the understanding of different types of fiber-fiber interaction that can occur for a given fiber volume fraction. After the formulation of the fiber suspension flow problem and the simplification leading to the Hele-Shaw approach, the interaction mechanisms are discussed. Next, the establishment of the rheological constitutive equation is presented that reflects the coupled flow/orientation nature. The decoupled flow/orientation approach is also discussed which constitutes a good simplification for many applications involving flows in thin cavities. Finally, before outlining the necessary developments for LFTs, some applications of the current orientation model and the so-called modified Folgar-Tucker model are illustrated through the fiber orientation predictions for selected LFT samples.

  4. Review and assessment of turbulence models for hypersonic flows

    NASA Astrophysics Data System (ADS)

    Roy, Christopher J.; Blottner, Frederick G.

    2006-10-01

    Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. 
A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.

  5. Neural network model for growth of Salmonella serotypes in ground chicken subjected to temperature abuse during cold storage for application in HACCP and risk assessment

    USDA-ARS?s Scientific Manuscript database

    With the advent of commercial software applications, it is now easy to develop neural network models for predictive microbiology applications. However, different versions of the model may be required to meet the divergent needs of model users. In the current study, the commercial software applicat...

  6. Real-time assessments of water quality: expanding nowcasting throughout the Great Lakes

    USGS Publications Warehouse

    2013-01-01

    Nowcasts are systems that inform the public of current bacterial water-quality conditions at beaches on the basis of predictive models. During 2010–12, the U.S. Geological Survey (USGS) worked with 23 local and State agencies to improve existing operational beach nowcast systems at 4 beaches and expand the use of predictive models in nowcasts at an additional 45 beaches throughout the Great Lakes. The predictive models were specific to each beach, and the best model for each beach was based on a unique combination of environmental and water-quality explanatory variables. The variables used most often in models to predict Escherichia coli (E. coli) concentrations or the probability of exceeding a State recreational water-quality standard included turbidity, day of the year, wave height, wind direction and speed, antecedent rainfall for various time periods, and change in lake level over 24 hours. During validation of 42 beach models during 2012, the models performed better than the current method to assess recreational water quality (previous day's E. coli concentration). The USGS will continue to work with local agencies to improve nowcast predictions, enable technology transfer of predictive model development procedures, and implement more operational systems during 2013 and beyond.
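    Operationally, a nowcast of this kind reduces to evaluating a beach-specific statistical model against the morning's observations. The sketch below shows the general shape using a logistic model for the probability of exceeding the standard; the variable names and coefficients are invented placeholders, not fitted values from any USGS beach model.

```python
import math

# Hypothetical beach-specific coefficients for a logistic exceedance model.
COEF = {
    "intercept": -3.0,
    "turbidity": 0.04,       # per turbidity unit
    "wave_height_m": 1.2,
    "rain_48h_mm": 0.05,     # antecedent rainfall over 48 hours
}

def exceedance_probability(turbidity, wave_height_m, rain_48h_mm):
    """Probability that E. coli exceeds the recreational standard today."""
    z = (COEF["intercept"]
         + COEF["turbidity"] * turbidity
         + COEF["wave_height_m"] * wave_height_m
         + COEF["rain_48h_mm"] * rain_48h_mm)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link maps z to (0, 1)
```

    The appeal over the persistence method (yesterday's E. coli concentration) is that today's drivers of contamination, such as turbidity and waves, enter the prediction directly.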

  7. A REGIONAL APPROACH TO ECOLOGICAL RISK ASSESSMENTS FOR PESTICIDE REGISTRATION

    EPA Science Inventory

    Currently, most ecological risk assessments for EPA pesticide registration are evaluated at the national scale using a predetermined list of test species (OPPTS 850.4225 and 8504250) as a model system with little regard to where and how the product will ultimately be used. The a...

  8. NEW TECHNOLOGIES TO SOLVE OLD PROBLEMS AND ADDRESS ISSUES IN RISK ASSESSMENT

    EPA Science Inventory

    Appropriate utilization of data is an ongoing concern of the regulated industries and the agencies charged with assessing safety or risk. An area of current interest is the possibility that toxicogenomics will enhance our ability to develop higher or high-throughput models for pr...

  9. Environmental Health and Aging: Activity, Exposure and Biological Models to Improve Risk Assessment and Health Promotion

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) and other public health agencies are concerned that the environmental health of America’s growing population of older adults has not been taken into consideration in current approaches to risk assessment. The reduced capacity to respo...

  10. Assessing the benefits and economic values of trees

    Treesearch

    David J. Nowak

    2017-01-01

    Understanding the environmental, economic, and social/community benefits of nature, in particular trees and forests, can lead to better vegetation management and designs to optimize environmental quality and human health for current and future generations. Computer models have been developed to assess forest composition and its associated effects on environmental...

  11. Taiwanese Model of Early Intervention Needs Assessment System

    ERIC Educational Resources Information Center

    Ho, Hua-Kuo

    2008-01-01

    The purpose of this study was intended to investigate the current system of early intervention needs assessment in Taiwan in order to understand the problems encountered and provide the coping strategies for improving the system. Documentary analysis, phone interview and participant observation were employed in the study to collect the research…

  12. Building equity in: strategies for integrating equity into modelling for a 1.5°C world.

    PubMed

    Sonja, Klinsky; Harald, Winkler

    2018-05-13

Emission pathways consistent with limiting temperature increase to 1.5°C raise pressing questions from an equity perspective. These pathways would limit impacts and benefit vulnerable communities but also present trade-offs that could increase inequality. Meanwhile, rapid mitigation could exacerbate political debates in which equity has played a central role. In this paper, we first develop a set of elements we suggest are essential for evaluating the equity implications of policy actions consistent with 1.5°C. These elements include (i) assess climate impacts, adaptation, loss and damage; (ii) be sensitive to context; (iii) compare costs of mitigation and adaptation policy action; (iv) incorporate human development and poverty; (v) integrate inequality dynamics; and (vi) be clear about normative assumptions and responsive to users. We then assess the ability of current modelling practices to address each element, focusing on global integrated assessment models augmented by national modelling and scenarios. We find current practices face serious limitations across all six dimensions although the severity of these varies. Finally, based on our assessment we identify strategies that may be best suited for enabling us to generate insights into each of the six elements in the context of assessing pathways for a 1.5°C world. This article is part of the theme issue 'The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'. © 2018 The Author(s).

  13. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

Recently, the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) into the decision-making process for significant issues. Future PRAs will have major impacts on ISS and on future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision-making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, and setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs), and robotic operations. This paper describes some applications of the ISS PRA model and how they influenced the final decisions, and discusses future analysis topics such as life extension and the requirements of new commercial vehicles visiting ISS.

  14. Predicting Grizzly Bear Density in Western North America

    PubMed Central

    Mowat, Garth; Heard, Douglas C.; Schwarz, Carl J.

    2013-01-01

Conservation of grizzly bears (Ursus arctos) is often controversial and the disagreement often is focused on the estimates of density used to calculate allowable kill. Many recent estimates of grizzly bear density are now available but field-based estimates will never be available for more than a small portion of hunted populations. Current methods of predicting density in areas of management interest are subjective and untested. Objective methods have been proposed, but these statistical models are so dependent on results from individual study areas that the models do not generalize well. We built regression models to relate grizzly bear density to ultimate measures of ecosystem productivity and mortality for interior and coastal ecosystems in North America. We used 90 measures of grizzly bear density in interior ecosystems, of which 14 were currently known to be unoccupied by grizzly bears. In coastal areas, we used 17 measures of density including 2 unoccupied areas. Our best model for coastal areas included a negative relationship with tree cover and positive relationships with the proportion of salmon in the diet and topographic ruggedness, which was correlated with precipitation. Our best interior model included 3 variables that indexed terrestrial productivity, 1 describing vegetation cover, 2 indices of human use of the landscape, and an index of topographic ruggedness. We used our models to predict current population sizes across Canada and present these as alternatives to current population estimates. Our models predict fewer grizzly bears in British Columbia but more bears in Canada than in the latest status review. These predictions can be used to assess population status, set limits for total human-caused mortality, and for conservation planning, but because our predictions are static, they cannot be used to assess population trend. PMID:24367552

  15. Predicting grizzly bear density in western North America.

    PubMed

    Mowat, Garth; Heard, Douglas C; Schwarz, Carl J

    2013-01-01

Conservation of grizzly bears (Ursus arctos) is often controversial and the disagreement often is focused on the estimates of density used to calculate allowable kill. Many recent estimates of grizzly bear density are now available but field-based estimates will never be available for more than a small portion of hunted populations. Current methods of predicting density in areas of management interest are subjective and untested. Objective methods have been proposed, but these statistical models are so dependent on results from individual study areas that the models do not generalize well. We built regression models to relate grizzly bear density to ultimate measures of ecosystem productivity and mortality for interior and coastal ecosystems in North America. We used 90 measures of grizzly bear density in interior ecosystems, of which 14 were currently known to be unoccupied by grizzly bears. In coastal areas, we used 17 measures of density including 2 unoccupied areas. Our best model for coastal areas included a negative relationship with tree cover and positive relationships with the proportion of salmon in the diet and topographic ruggedness, which was correlated with precipitation. Our best interior model included 3 variables that indexed terrestrial productivity, 1 describing vegetation cover, 2 indices of human use of the landscape, and an index of topographic ruggedness. We used our models to predict current population sizes across Canada and present these as alternatives to current population estimates. Our models predict fewer grizzly bears in British Columbia but more bears in Canada than in the latest status review. These predictions can be used to assess population status, set limits for total human-caused mortality, and for conservation planning, but because our predictions are static, they cannot be used to assess population trend.
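    The coastal model's structure, as described, can be caricatured as a log-link regression; the sketch below uses made-up coefficients that echo only the reported signs (density falls with tree cover, rises with salmon in the diet and with ruggedness). The units and magnitudes are illustrative, not the fitted values.

```python
import math

# Hypothetical coefficients mirroring only the reported signs of the
# coastal grizzly density model; magnitudes are invented.
COEF = {"intercept": 2.0, "tree_cover": -1.5,
        "salmon_diet": 2.2, "ruggedness": 0.8}

def predict_density(tree_cover, salmon_diet, ruggedness):
    """Predicted bears per 1000 km^2 (illustrative units)."""
    eta = (COEF["intercept"]
           + COEF["tree_cover"] * tree_cover
           + COEF["salmon_diet"] * salmon_diet
           + COEF["ruggedness"] * ruggedness)
    return math.exp(eta)  # log link keeps predicted density positive
```

    A log link is a natural choice here because density must be non-negative and covariate effects then act multiplicatively, which is how habitat quality is usually interpreted.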

  16. Risk Estimation Modeling and Feasibility Testing for a Mobile eHealth Intervention for Binge Drinking Among Young People: The D-ARIANNA (Digital-Alcohol RIsk Alertness Notifying Network for Adolescents and young adults) Project.

    PubMed

    Carrà, Giuseppe; Crocamo, Cristina; Schivalocchi, Alessandro; Bartoli, Francesco; Carretta, Daniele; Brambilla, Giulia; Clerici, Massimo

    2015-01-01

Binge drinking is common among young people, but relevant risk factors often go unrecognized. eHealth apps, attractive for young people, may be useful to enhance awareness of this problem. We aimed to develop a current risk estimation model for binge drinking, incorporated into an eHealth app--D-ARIANNA (Digital-Alcohol RIsk Alertness Notifying Network for Adolescents and young adults)--for young people. A longitudinal approach with phase 1 (risk estimation), phase 2 (design), and phase 3 (feasibility) was followed. Risk/protective factors identified from the literature were used to develop a current risk estimation model for binge drinking. Relevant odds ratios were subsequently pooled through meta-analytic techniques with a random-effects model, deriving weighted estimates to be introduced in a final model. A set of questions, matching identified risk factors, were nested in a questionnaire and assessed for wording, content, and acceptability in focus groups involving 110 adolescents and young adults. Ten risk factors (5 modifiable) and 2 protective factors showed significant associations with binge drinking and were included in the model. Their weighted coefficients ranged between -0.71 (school proficiency) and 1.90 (cannabis use). The model, nested in an eHealth app questionnaire, provides an overall current risk score expressed as a percentage, accompanied by appropriate images. The factors that contribute most are shown in summary messages. Minor changes were made after the focus group reviews. Most of the subjects (74%) regarded the eHealth app as helpful for assessing binge drinking risk. We produced an evidence-based eHealth app for young people that evaluates current risk for binge drinking. Its effectiveness will be tested in a large trial.
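    The scoring idea described above, summing weighted risk and protective factors and mapping the total to a percentage, can be sketched as follows. Only the -0.71 (school proficiency) and 1.90 (cannabis use) weights come from the abstract; the other weights, the factor names, and the logistic mapping to a percentage are assumptions for illustration.

```python
import math

# Weights for present risk/protective factors. Only school_proficiency
# (-0.71, protective) and cannabis_use (1.90) are reported values; the
# others are placeholders.
WEIGHTS = {
    "cannabis_use": 1.90,
    "school_proficiency": -0.71,
    "peer_drinking": 1.2,     # placeholder
    "early_onset": 0.9,       # placeholder
}

def current_risk_percent(present_factors):
    """Overall current risk as a percentage, via an assumed logistic map."""
    z = sum(WEIGHTS[f] for f in present_factors if f in WEIGHTS)
    return round(100.0 / (1.0 + math.exp(-z)), 1)
```

    Protective factors carry negative weights, so reporting school proficiency lowers the displayed percentage, which matches the app's intent of highlighting modifiable factors.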

  17. MODIS imagery improves pest risk assessment: A case study of wheat stem sawfly (Cephus cinctus, Hymenoptera: Cephidae) in Colorado, USA

    USGS Publications Warehouse

    Lestina, Jordan; Cook, Maxwell; Kumar, Sunil; Morisette, Jeffrey T.; Ode, Paul J.; Peirs, Frank

    2016-01-01

Wheat stem sawfly (Cephus cinctus Norton, Hymenoptera: Cephidae) has long been a significant insect pest of spring, and more recently, winter wheat in the northern Great Plains. Wheat stem sawfly was first observed infesting winter wheat in Colorado in 2010 and, subsequently, has spread rapidly throughout wheat production regions of the state. Here, we used maximum entropy modeling (MaxEnt) to generate habitat suitability maps in order to predict the risk of crop damage as this species spreads throughout the winter wheat-growing regions of Colorado. We identified environmental variables that influence the current distribution of wheat stem sawfly in the state and evaluated whether remotely sensed variables improved model performance. We used presence localities of C. cinctus and climatic, topographic, soils, and normalized difference vegetation index and enhanced vegetation index data derived from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery as environmental variables. All models had high performance in that they were successful in predicting suitable habitat for C. cinctus in its current distribution in eastern Colorado. The enhanced vegetation index for the month of April improved model performance and was identified as a top contributor to the MaxEnt model. Soil clay percent at 0–5 cm, temperature seasonality, and precipitation seasonality were also associated with C. cinctus distribution in Colorado. The improved model performance resulting from integrating vegetation indices in our study demonstrates the ability of remote sensing technologies to enhance species distribution modeling. The risk maps generated can assist managers in planning control measures for current infestations and in assessing the future risk of C. cinctus establishment in currently uninfested regions.

  18. Lake Michigan nearshore: How modeling scenarios can improve dialog between modelers and ecologists

    EPA Science Inventory

The nearshore of Lake Michigan, like the other Great Lakes, experiences environmental concerns due to excessive eutrophication. Assessing the nearshore is challenging because fluctuating nutrient loads and ever-changing currents cause this area to exhibit large spatial a...

  19. Parent-Adolescent Relationship Qualities, Internal Working Models, and Styles as Predictors of Adolescents’ Observed Interactions with Friends

    PubMed Central

    Shomaker, Lauren B.; Furman, Wyndol

    2010-01-01

    This study examined how current parent-adolescent relationship qualities and adolescents’ representations of relationships with parents were related to friendship interactions in 200 adolescent-close friend dyads. Adolescents and friends were observed discussing problems during a series of structured tasks. Negative interactions with mothers were significantly related to adolescents’ greater conflict with friends, poorer focus on tasks, and poorer communication skills. Security of working models (as assessed by interview) was significantly associated with qualities of friendship interactions, whereas security of attachment styles (as assessed by questionnaire) was not. More dismissing (vs. secure) working models were associated with poorer focus on problem discussions and weaker communication skills with friends, even after accounting for gender differences and current parent-adolescent relationship qualities. We discuss possible mechanisms for the observed links between dimensions of parent-adolescent relationships and friendships. We also consider methodological and conceptual differences between working model and style measures of attachment representations. PMID:20174459

  20. Current and Future Effects of Mexican Immigration in California. Executive Summary. R-3365/1-CR.

    ERIC Educational Resources Information Center

    McCarthy, Kevin F.; Valdez, R. Burciaga

    This study to assess the current situation of Mexican immigrants in California and project future possibilities constructs a demographic profile of the immigrants, examines their economic effects on the state, and describes their socioeconomic integration into California society. Models of immigration/integration processes are developed and used…

  1. Performance Assessment of Model-Based Optimal Feedforward and Feedback Current Profile Control in NSTX-U using the TRANSP Code

    NASA Astrophysics Data System (ADS)

    Ilhan, Z.; Wehner, W. P.; Schuster, E.; Boyer, M. D.; Gates, D. A.; Gerhardt, S.; Menard, J.

    2015-11-01

Active control of the toroidal current density profile is crucial to achieve and maintain high-performance, MHD-stable plasma operation in NSTX-U. A first-principles-driven, control-oriented model describing the temporal evolution of the current profile has been proposed earlier by combining the magnetic diffusion equation with empirical correlations obtained at NSTX-U for the electron density, electron temperature, and non-inductive current drives. A feedforward + feedback control scheme for the regulation of the current profile is constructed by embedding the proposed nonlinear, physics-based model into the control design process. First, nonlinear optimization techniques are used to design feedforward actuator trajectories that steer the plasma to a desired operating state, with the objective of supporting the traditional trial-and-error experimental process of advanced scenario planning. Second, a feedback control algorithm to track a desired current profile evolution is developed with the goal of adding robustness to the overall control scheme. The effectiveness of the combined feedforward + feedback control algorithm for current profile regulation is tested in predictive simulations carried out in TRANSP. Supported by PPPL.
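    The division of labor described here, feedforward trajectories to reach a target state plus feedback for robustness, can be illustrated on a toy first-order plant. This is a generic control sketch under assumed plant parameters, not the physics-based NSTX-U current-profile model.

```python
# Toy feedforward + feedback tracking on a first-order plant x' = -a*x + b*u.
# All parameter values are assumptions chosen for a stable illustration.

def simulate(target, steps=200, dt=0.05, a=1.0, b=2.0, kp=4.0):
    x = 0.0
    for _ in range(steps):
        u_ff = a * target / b        # feedforward: holds x at target in steady state
        u_fb = kp * (target - x)     # feedback: drives tracking error to zero
        x += dt * (-a * x + b * (u_ff + u_fb))  # explicit Euler step
    return x
```

    With a perfect plant model the feedforward term alone reaches the target, but slowly and with no disturbance rejection; the feedback term speeds convergence and adds robustness, which mirrors the roles the two components play in the control scheme above.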

  2. Current and future assessments of soil erosion by water on the Tibetan Plateau based on RUSLE and CMIP5 climate models.

    PubMed

    Teng, Hongfen; Liang, Zongzheng; Chen, Songchao; Liu, Yong; Viscarra Rossel, Raphael A; Chappell, Adrian; Yu, Wu; Shi, Zhou

    2018-04-18

Soil erosion by water is accelerated by a warming climate and negatively impacts water security and ecological conservation. The Tibetan Plateau (TP) has experienced warming at a rate approximately twice that observed globally, and heavy precipitation events lead to an increased risk of erosion. In this study, we assessed current erosion on the TP and predicted potential soil erosion by water in 2050. The study was conducted in three steps. During the first step, we used the Revised Universal Soil Loss Equation (RUSLE), publicly available data, and the most recent earth observations to derive estimates of annual erosion from 2002 to 2016 on the TP at 1-km resolution. During the second step, we used a multiple linear regression (MLR) model and a set of climatic covariates to predict rainfall erosivity on the TP in 2050. The MLR was used to establish the relationship between current rainfall erosivity data and a set of current climatic and other covariates. The coefficients of the MLR were generalised with climate covariates for 2050 derived from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) models to estimate rainfall erosivity in 2050. During the third step, soil erosion by water in 2050 was predicted using rainfall erosivity in 2050 and other erosion factors. The results show that the mean annual soil erosion rate on the TP under current conditions is 2.76 t ha⁻¹ y⁻¹, which is equivalent to an annual soil loss of 559.59 × 10⁶ t. Our 2050 projections suggested that erosion on the TP will increase to 3.17 t ha⁻¹ y⁻¹ and 3.91 t ha⁻¹ y⁻¹ under conditions represented by RCP2.6 and RCP8.5, respectively. The current assessment and future prediction of soil erosion by water on the TP should be valuable for environment protection and soil conservation in this unique region and elsewhere. Copyright © 2018 Elsevier B.V. All rights reserved.
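    RUSLE itself is a simple product of factors, A = R·K·LS·C·P, which is why the workflow above reduces to projecting rainfall erosivity R and holding or updating the other factors. The helper below computes the product; the example factor values are illustrative, not TP-calibrated inputs.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Mean annual soil loss A (t ha^-1 y^-1) from the RUSLE factors:
    R rainfall erosivity, K soil erodibility, LS slope length/steepness,
    C cover management, P support practice."""
    return R * K * LS * C * P

# Illustrative factor values only (not calibrated for the Tibetan Plateau):
A = rusle_soil_loss(R=100.0, K=0.03, LS=1.2, C=0.5, P=1.0)  # 1.8 t/ha/y
```

    Because the factors multiply, scaling projected R for 2050 while keeping the other factors fixed scales the erosion estimate by the same ratio, which is the mechanism behind the step-three prediction.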

  3. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    PubMed

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.

  4. Clinical application of 3D imaging for assessment of treatment outcomes

    PubMed Central

    Cevidanes, Lucia H.C.; Oliveira, Ana Emilia Figueiredo; Grauer, Dan; Styner, Martin; Proffit, William R.

    2011-01-01

This paper outlines the clinical application of CBCT for assessment of treatment outcomes and discusses current work to superimpose digital dental models and 3D photographs. Superimposition of CBCTs on stable structures of reference now allows assessment of 3D dental, skeletal, and soft tissue changes for both growing and non-growing patients. Additionally, we describe clinical findings from CBCT superimpositions in the assessment of surgery and skeletal anchorage treatment. PMID:21516170

  5. Improving Hall Thruster Plume Simulation through Refined Characterization of Near-field Plasma Properties

    NASA Astrophysics Data System (ADS)

    Huismann, Tyler D.

Due to the rapidly expanding role of electric propulsion (EP) devices, it is important to evaluate their integration with other spacecraft systems. Specifically, EP device plumes can play a major role in spacecraft integration, and as such, accurate characterization of plume structure bears on mission success. This dissertation addresses issues related to accurate prediction of plume structure in a particular type of EP device, a Hall thruster. This is done in two ways: first, by coupling current plume simulation models with current models that simulate a Hall thruster's internal plasma behavior; second, by improving plume simulation models and thereby increasing physical fidelity. These methods are assessed by comparing simulated results to experimental measurements. Assessment indicates the two methods improve plume modeling capabilities significantly: using far-field ion current density as a metric, these approaches used in conjunction improve agreement with measurements by a factor of 2.5, as compared to previous methods. Based on comparison to experimental measurements, recent computational work on discharge chamber modeling has been largely successful in predicting properties of internal thruster plasmas. This model can provide detailed information on plasma properties at a variety of locations. Frequently, experimental data is not available at many locations that are of interest regarding computational models. In the absence of experimental data, there are few alternatives for rigorously determining the plasma properties that are necessary as inputs into plume simulations. Therefore, this dissertation focuses on coupling current models that simulate internal thruster plasma behavior with plume simulation models. Further, recent experimental work on atom-ion interactions has provided a better understanding of particle collisions within plasmas. This experimental work is used to update collision models in a current plume simulation code. 
Previous versions of the code left the dependence between particles' pre-collision velocities and post-collision scattering angles uncharacterized. This dissertation updates several of these collision types by fitting curves to the measurements of atom-ion interactions, so that the previously unknown angular dependences are well characterized.

  6. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand.

    PubMed

    Gupta, Saurabh; Black-Schaffer, W Stephen; Crawford, James M; Gross, David; Karcher, Donald S; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B; Wheeler, Thomas M; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B; Robboy, Stanley J

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  7. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    PubMed Central

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  8. AN ASSESSMENT OF THE ABILITY OF 3-D AIR QUALITY MODELS WITH CURRENT THERMODYNAMIC EQUILIBRIUM MODELS TO PREDICT AEROSOL NO3

    EPA Science Inventory

    The partitioning of total nitrate (TNO3) and total ammonium (TNH4) between gas and aerosol phases is studied with two thermodynamic equilibrium models, ISORROPIA and AIM, and three datasets: high time-resolution measurement data from the 1999 Atlanta SuperSite Experiment and from...

  9. Assessing Psychological Symptoms and Well-Being: Application of a Dual-Factor Mental Health Model to Understand College Student Performance

    ERIC Educational Resources Information Center

    Antaramian, Susan

    2015-01-01

    A dual-factor mental health model includes measures of positive psychological well-being in addition to traditional indicators of psychopathology to comprehensively determine mental health status. The current study examined the utility of this model in understanding the psychological adjustment and educational functioning of college students. A…

  10. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  11. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The codes considered are APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published versions and validations of the codes; however, the codes have since been further developed to extend their capabilities.

  12. An Overview of the Effectiveness of Adolescent Substance Abuse Treatment Models.

    ERIC Educational Resources Information Center

    Muck, Randolph; Zempolich, Kristin A.; Titus, Janet C.; Fishman, Marc; Godley, Mark D.; Schwebel, Robert

    2001-01-01

    Describes current approaches to adolescent substance abuse treatment, including the 12-step treatment approach, behavioral treatment approach, family-based treatment approach, and therapeutic community approach. Summarizes research that assesses the effectiveness of these models, offering findings from the Center for Substance Abuse Treatment's…

  13. Assessment of Important SPECIATE Profiles in EPA’s Emissions Modeling Platform and Current Data Gaps

    EPA Science Inventory

    The US Environmental Protection Agency (EPA)’s SPECIATE database contains speciation profiles for both particulate matter (PM) and volatile organic compounds (VOCs) that are key inputs for creating speciated emission inventories for air quality modeling. The objective of th...

  14. Transdisciplinary application of the cross-scale resilience model

    USGS Publications Warehouse

    Sundstrom, Shana M.; Angeler, David G.; Garmestani, Ahjond S.; Garcia, Jorge H.; Allen, Craig R.

    2014-01-01

    The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems.

  15. Comparison of dark energy models after Planck 2015

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Yao; Zhang, Xin

    2016-11-01

    We compare ten typical, popular dark energy models according to their ability to fit the current observational data. The observational data we use in this work include the JLA sample of the type Ia supernovae observation, the Planck 2015 distance priors of the cosmic microwave background observation, the baryon acoustic oscillation measurements, and the direct measurement of the Hubble constant. Since the models have different numbers of parameters, in order to make a fair comparison, we employ the Akaike and Bayesian information criteria to assess the worth of the models. The analysis results show that, according to the capability of explaining observations, the cosmological constant model is still the best one among all the dark energy models. The generalized Chaplygin gas model, the constant w model, and the α dark energy model are worse than the cosmological constant model, but still are good models compared to the others. The holographic dark energy model, the new generalized Chaplygin gas model, and the Chevallier-Polarski-Linder model can still fit the current observations well, but from the perspective of model economy they are not so good. The new agegraphic dark energy model, the Dvali-Gabadadze-Porrati model, and the Ricci dark energy model are excluded by the current observations.
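    The information-criterion bookkeeping behind such comparisons is simple to sketch. In the sketch below, the χ²_min values, parameter counts, and sample size are illustrative placeholders, not the fits reported in this work:

```python
import math

def aic(chi2_min, k):
    # Akaike information criterion for a fit with minimum chi-square
    # chi2_min and k free parameters: AIC = chi2_min + 2k.
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    # Bayesian information criterion; n is the number of data points:
    # BIC = chi2_min + k * ln(n).  BIC penalizes extra parameters
    # more heavily than AIC for typical sample sizes.
    return chi2_min + k * math.log(n)

# Hypothetical fit results: (model name, chi2_min, free parameters).
fits = [("LCDM", 699.4, 1), ("wCDM", 698.9, 2), ("CPL", 698.6, 3)]
n_data = 740  # illustrative sample size (e.g., an SN Ia compilation)

scores = {name: (aic(c, k), bic(c, k, n_data)) for name, c, k in fits}
best_aic = min(scores, key=lambda m: scores[m][0])
best_bic = min(scores, key=lambda m: scores[m][1])
```

    Even when a more flexible model attains a slightly lower χ²_min, both criteria can still prefer the simpler model, which is how a one-parameter ΛCDM can win against multi-parameter competitors.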

  16. Groundwater flow simulation of the Savannah River Site general separations area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G.; Bagwell, L.; Bennett, P.

    The most recent groundwater flow model of the General Separations Area, Savannah River Site, is referred to as the “GSA/PORFLOW” model. GSA/PORFLOW was developed in 2004 by porting an existing General Separations Area groundwater flow model from the FACT code to the PORFLOW code. The preceding “GSA/FACT” model was developed in 1997 using characterization and monitoring data through the mid-1990’s. Both models were manually calibrated to field data. Significantly more field data have been acquired since the 1990’s and model calibration using mathematical optimization software has become routine and recommended practice. The current task involved updating the GSA/PORFLOW model using selected field data current through at least 2015, and use of the PEST code to calibrate the model and quantify parameter uncertainty. This new GSA groundwater flow model is named “GSA2016” in reference to the year in which most development occurred. The GSA2016 model update is intended to address issues raised by the DOE Low-Level Waste (LLW) Disposal Facility Federal Review Group (LFRG) in a 2008 review of the E-Area Performance Assessment, and by the Nuclear Regulatory Commission in reviews of tank closure and Saltstone Disposal Facility Performance Assessments.

  17. USGS River Ecosystem Modeling: Where Are We, How Did We Get Here, and Where Are We Going?

    USGS Publications Warehouse

    Hanson, Leanne; Schrock, Robin; Waddle, Terry; Duda, Jeffrey J.; Lellis, Bill

    2009-01-01

    This report developed as an outcome of the USGS River Ecosystem Modeling Work Group, convened on February 11, 2008 as a preconference session to the second USGS Modeling Conference in Orange Beach, Ala. Work Group participants gained an understanding of the types of models currently being applied to river ecosystem studies within the USGS, learned how model outputs are being used by a Federal land management agency, and developed recommendations for advancing the state of the art in river ecosystem modeling within the USGS. During a break-out session, participants restated many of the recommendations developed at the first USGS Modeling Conference in 2006 and in previous USGS needs assessments. All Work Group recommendations require organization and coordination across USGS disciplines and regions, and include (1) enhancing communications, (2) increasing efficiency through better use of current human and technologic resources, and (3) providing a national infrastructure for river ecosystem modeling resources, making it easier to integrate modeling efforts. By implementing these recommendations, the USGS will benefit from enhanced multi-disciplinary, integrated models for river ecosystems that provide valuable risk assessment and decision support tools for adaptive management of natural and managed riverine ecosystems. These tools generate key information that resource managers need and can use in making decisions about river ecosystem resources.

  18. Discussion of examination of a cored hydraulic fracture in a deep gas well

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolte, K.G.

    Warpinski et al. document information found from a core through a formation after a hydraulic fracture treatment. As they indicate, the core provides the first detailed evaluation of an actual propped hydraulic fracture away from the well and at a significant depth, and this evaluation leads to findings that deviate substantially from the assumptions incorporated into current fracturing models. In this discussion, a defense of current fracture design assumptions is developed. The affirmation of current assumptions, for general industry applications, is based on an assessment of the global impact of the local complexity found in the core. The assessment leads to recommendations for the evolution of fracture design practice.

  19. Multi-model Ensemble of Ocean Data Assimilation Products in The Northwestern Pacific and Their Quality Assessment

    NASA Astrophysics Data System (ADS)

    Isoguchi, O.; Matsui, K.; Kamachi, M.; Usui, N.; Miyazawa, Y.; Ishikawa, Y.; Hirose, N.

    2017-12-01

    Several operational ocean assimilation models are currently available for the Northwestern Pacific and surrounding marginal seas. One of the main targets is predicting the Kuroshio/Kuroshio Extension, which have an impact not only on social activities, such as fishery and ship routing, but also on local weather. There is a demand to assess their quality comprehensively and make the best use of the available products. In the present study, several ocean data assimilation products and their multi-model ensemble product were assessed by comparison with satellite-derived sea surface temperature (SST), sea surface height (SSH), and in-situ hydrographic sections. The Kuroshio axes were also computed from the surface currents of these products and were compared with the Kuroshio Axis data produced by the Marine Information Research Center (MIRC) from analysis of satellite SST, SSH, and in-situ observations. The multi-model ensemble product generally showed the best accuracy in the comparisons with the satellite-derived SST and SSH. On the other hand, the ensemble product was not the best in the comparison with the hydrographic sections. It is thus suggested that the multi-model ensemble works efficiently for horizontally 2D parameters, for which each assimilation product tends to have random errors, but not for the vertical 2D comparisons, for which products tend to have bias errors with respect to in-situ data. In the assessment with the Kuroshio Axis data, some products showed more energetic behavior than the Kuroshio Axis data, resulting in large path errors, defined as the area enclosed between the reference and model-derived axes divided by the path length. It is, however, not yet determined which behavior is real, because in-situ observations are still too sparse to resolve energetic Kuroshio behavior, even though the Kuroshio is one of the strongest currents.
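    The path-error metric described above (enclosed area divided by path length) can be sketched as follows. This is a simplified planar illustration that assumes the two axes do not cross and treats coordinates as Cartesian; an operational implementation would handle map projection and self-intersecting polygons:

```python
def shoelace_area(pts):
    # Absolute area of a closed polygon given as [(x, y), ...] via the
    # shoelace formula (valid for a simple, non-self-intersecting ring).
    s = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def path_length(pts):
    # Total length of an open polyline.
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

def path_error(reference, model):
    # Area enclosed between the two axes (the polygon formed by the
    # reference axis followed by the reversed model axis), normalized
    # by the reference path length.
    polygon = reference + model[::-1]
    return shoelace_area(polygon) / path_length(reference)

# Two parallel straight axes offset by 0.5 units: the enclosed area is
# 10 * 0.5 = 5, the reference length is 10, so the path error is 0.5.
ref = [(130.0, 30.0), (135.0, 30.0), (140.0, 30.0)]
mod = [(130.0, 30.5), (135.0, 30.5), (140.0, 30.5)]
err = path_error(ref, mod)
```

    For parallel axes the metric reduces to the mean separation, which is what makes it a natural "average axis displacement" score.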

  20. Biodiversity, ecosystem functions and services in environmental risk assessment: introduction to the special issue.

    PubMed

    Schäfer, Ralf B

    2012-01-15

    This Special Issue focuses on the questions of whether and how biodiversity, ecosystem functions, and the resulting services could be incorporated into Ecological Risk Assessment (ERA). To this end, three articles provide a framework for the integration of ecosystem services into ERA of soils, sediments, and pesticides. Further articles demonstrate how stakeholders can be involved in an ecosystem service-based ERA for soils and describe how current monitoring could be adapted to new assessment endpoints that are directly linked to ecosystem services. Case studies show that the current ERA may not be protective of biodiversity, ecosystem functions, and resulting services, and that both pesticides and salinity currently adversely affect ecosystem functions in the field. Moreover, ecological models can be used for prediction of new protection goals and could ultimately support their implementation in the ERA. Overall, the Special Issue stresses the urgent need to enhance current ERA procedures if biodiversity, ecosystem functions, and resulting services are to be protected.

  1. Water Stress on U.S. Power Production at Decadal Time Horizons

    NASA Astrophysics Data System (ADS)

    Ganguli, P.; Kumar, D.; Yun, J.; Short, G.; Klausner, J.; Ganguly, A. R.

    2014-12-01

    Thermoelectric power production at risk, owing to current and projected water scarcity and rising stream temperatures, is assessed for the continental United States (US) at decadal scales. Regional water scarcity is driven by climate variability and change, as well as by multi-sector water demand. While a planning horizon of zero to about thirty years is occasionally prescribed by stakeholders, the challenges to risk assessment at these scales include the difficulty of delineating decadal climate trends from intrinsic natural or multi-model variability. Current-generation global climate and earth system models are not credible at the spatial resolutions of power plants, especially for surface water quantity and stream temperatures, which further exacerbates the assessment challenge. Population changes, which are in any case difficult to project, cannot serve as adequate proxies for changes in water demand across sectors. The hypothesis that robust assessments of power production at risk are possible, despite the uncertainties, has been examined as a proof of concept. An approach is presented for delineating water scarcity and temperature from climate models, observations, and population storylines, as well as for assessing power production at risk by examining geospatial correlations of power plant locations within regions where the usable water supply for energy production happens to be scarcer and warmer. Acknowledgment: Funding provided by US DOE's ARPA-E through Award DE-AR0000374.

  2. A developmental, biopsychosocial model for the treatment of children with gender identity disorder.

    PubMed

    Zucker, Kenneth J; Wood, Hayley; Singh, Devita; Bradley, Susan J

    2012-01-01

    This article provides a summary of the therapeutic model and approach used in the Gender Identity Service at the Centre for Addiction and Mental Health in Toronto. The authors describe their assessment protocol, describe their current multifactorial case formulation model, including a strong emphasis on developmental factors, and provide clinical examples of how the model is used in the treatment.

  3. Potential Technologies for Assessing Risk Associated with a Mesoscale Forecast

    DTIC Science & Technology

    2015-10-01

    American GFS models, and informally applied on the Weather Research and Forecasting (WRF) model. The current CI equation is as follows...Reen B, Penc R. Investigating surface bias errors in the Weather Research and Forecasting (WRF) model using a Geographic Information System (GIS). J...Forecast model (WRF-ARW) with extensions that might include finer terrain resolutions and more detailed representations of the underlying atmospheric

  4. Determining the impacts of climate change and catchment development on future water availability in Tasmania, Australia

    NASA Astrophysics Data System (ADS)

    Post, David

    2010-05-01

    In a water-scarce country such as Australia, detailed, accurate and reliable assessments of current and future water availability are essential in order to adequately manage the limited water resource. This presentation describes a recently completed study which provided an assessment of current water availability in Tasmania, Australia, and also determined how this water availability would be impacted by climate change and proposed catchment development by the year 2030. The Tasmania Sustainable Yields Project (http://www.csiro.au/partnerships/TasSY.html) assessed current water availability through the application of rainfall-runoff models, river models, and recharge and groundwater models. These were calibrated to streamflow records and parameterised using estimates of current groundwater and surface water extractions and use. Having derived a credible estimate of current water availability, the impacts of future climate change on water availability were determined through deriving changes in rainfall and potential evapotranspiration from 15 IPCC AR4 global climate models. These changes in rainfall were then dynamically downscaled using the CSIRO-CCAM model over the relatively small study area (50,000 square km). A future climate sequence was derived by modifying the historical 84-year climate sequence based on these changes in rainfall and potential evapotranspiration. This future climate sequence was then run through the rainfall-runoff, river, recharge and groundwater models to give an estimate of water availability under future climate. To estimate the impacts of future catchment development on water availability, the models were modified and re-run to reflect projected increases in development. Specifically, outputs from the rainfall-runoff and recharge models were reduced over areas of projected future plantation forestry. 
Conversely, groundwater recharge was increased over areas of new irrigated agriculture and new extractions of water for irrigation were implemented in the groundwater and river models. Results indicate that historical average water availability across the project area was 21,815 GL/year. Of this, 636 GL/year of surface water and 38 GL/year of groundwater are currently extracted for use. By 2030, rainfall is projected to decrease by an average of 3% over the project area. This decrease in rainfall and concurrent increase in potential evapotranspiration leads to a decrease in water availability of 5% by 2030. As a result of lower streamflows, under current cease-to-take rules, currently licensed extractions are projected to decrease by 3% (19 GL/year). This however is offset by an additional 120 GL/year of extractions for proposed new irrigated agriculture. These new extractions, along with the increase in commercial forest plantations lead to a reduction in total surface water of 1% in addition to the 5% reduction due to climate change. Results from this study are being used by the Tasmanian and Australian governments to guide the development of a sustainable irrigated agriculture industry in Tasmania. In part, this is necessary to offset the loss of irrigated agriculture from the southern Murray-Darling Basin where climate change induced reductions in rainfall are projected to be far worse.

  5. Electric field mill network products to improve detection of the lightning hazard

    NASA Technical Reports Server (NTRS)

    Maier, Launa M.

    1987-01-01

    An electric field mill network has been used at Kennedy Space Center for over 10 years as part of the thunderstorm detection system. Several algorithms are currently available to improve the informational output of the electric field mill data. The charge distributions of roughly 50 percent of all lightning can be modeled as if the flash reduced the cloud charge by a point charge or a point dipole. Using these models, the spatial differences in the lightning-induced electric field changes, and a least-squares algorithm to obtain an optimum solution, the three-dimensional locations of the lightning charge centers can be determined. During the lifetime of a thunderstorm, dynamically induced charging, modeled as a current source, can be located spatially with measurements of Maxwell current density. The electric field mills can be used to calculate the Maxwell current density at times when it is equal to the displacement current density. These improvements will produce more accurate assessments of potential electrical activity, identify active cells, and forecast thunderstorm termination.
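    The point-charge retrieval can be sketched as a least-squares fit. Because the field change at each mill is linear in the removed charge Q, every candidate location admits a closed-form best-fit charge, and a search over locations then minimizes the residual. The sensor layout, grid, and flash parameters below are synthetic illustrations under an idealized image-charge model over conducting ground, not the operational KSC algorithm:

```python
K = 8.9875517923e9  # Coulomb constant, 1 / (4 * pi * eps0)

def geometry(sensor, loc):
    # Field change per unit charge at a ground-level sensor from removing
    # a point charge at loc = (x, y, z), including the image charge in a
    # perfectly conducting ground plane: g = 2*K*z / (d^2 + z^2)^(3/2).
    (sx, sy), (x, y, z) = sensor, loc
    r2 = (sx - x) ** 2 + (sy - y) ** 2 + z ** 2
    return 2.0 * K * z / r2 ** 1.5

def locate_charge(sensors, delta_e, candidates):
    # Coarse grid search over candidate locations; at each location the
    # optimal Q is a one-line linear least-squares solve.
    best = None
    for loc in candidates:
        g = [geometry(s, loc) for s in sensors]
        q = sum(gi * di for gi, di in zip(g, delta_e)) / sum(gi * gi for gi in g)
        resid = sum((di - q * gi) ** 2 for di, gi in zip(delta_e, g))
        if best is None or resid < best[0]:
            best = (resid, loc, q)
    return best[1], best[2]

# Synthetic flash: 20 C removed from 7 km altitude above (2 km, -1 km),
# observed by six hypothetical field mills (coordinates in meters).
sensors = [(0, 0), (5000, 0), (0, 5000), (-5000, 0), (0, -5000), (5000, 5000)]
true_loc, true_q = (2000.0, -1000.0, 7000.0), 20.0
de = [geometry(s, true_loc) * true_q for s in sensors]

grid = [(x, y, z)
        for x in range(-4000, 6001, 1000)
        for y in range(-4000, 6001, 1000)
        for z in range(4000, 10001, 1000)]
loc, q = locate_charge(sensors, de, grid)
```

    An operational solver would refine the coarse grid with a gradient-based nonlinear least-squares step and weight the residuals by sensor noise.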

  6. Assessment of Dependency, Agreeableness, and Their Relationship

    ERIC Educational Resources Information Center

    Lowe, Jennifer Ruth; Edmundson, Maryanne; Widiger, Thomas A.

    2009-01-01

    Agreeableness is central to the 5-factor model conceptualization of dependency. However, 4 meta-analyses of the relationship of agreeableness with dependency have failed to identify a consistent relationship. It was the hypothesis of the current study that these findings might be due in part to an emphasis on the assessment of adaptive, rather…

  7. Improving Science Assessments by Situating Them in a Virtual Environment

    ERIC Educational Resources Information Center

    Ketelhut, Diane Jass; Nelson, Brian; Schifter, Catherine; Kim, Younsu

    2013-01-01

    Current science assessments typically present a series of isolated fact-based questions, poorly representing the complexity of how real-world science is constructed. The National Research Council asserts that this needs to change to reflect a more authentic model of science practice. We strongly concur and suggest that good science assessments…

  8. Improving clinical models based on knowledge extracted from current datasets: a new approach.

    PubMed

    Mendes, D; Paredes, S; Rocha, T; Carvalho, P; Henriques, J; Morais, J

    2016-08-01

    Cardiovascular diseases (CVD) are the leading cause of death in the world, with prevention recognized as a key intervention to counter this reality. In this context, although several models and scores are currently used in clinical practice to assess the risk of a new cardiovascular event, they present some limitations. The goal of this paper is to improve CVD risk prediction by taking into account the current models as well as information extracted from real and recent datasets. The approach is based on a decision-tree scheme in order to ensure the clinical interpretability of the model. An innovative optimization strategy is developed to adjust the decision-tree thresholds (the rule structure is fixed) based on recent clinical datasets. A real dataset collected within the National Registry on Acute Coronary Syndromes of the Portuguese Society of Cardiology is applied to validate this work. The performance of the new approach is assessed with the metrics sensitivity, specificity, and accuracy. The new approach achieves sensitivity, specificity, and accuracy values of 80.52%, 74.19%, and 77.27%, respectively, which represents an improvement of about 26% relative to the accuracy of the original score.
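    The three reported metrics follow directly from confusion-matrix counts. As a minimal sketch (the counts below are illustrative, not the registry data used in the paper):

```python
def confusion_metrics(tp, fn, tn, fp):
    # Standard binary-classification metrics from confusion-matrix counts:
    # tp/fn/tn/fp = true positives, false negatives, true negatives,
    # false positives.
    sensitivity = tp / (tp + fn)              # true-positive rate
    specificity = tn / (tn + fp)              # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Illustrative counts only: 77 actual events, 31 actual non-events.
sens, spec, acc = confusion_metrics(tp=62, fn=15, tn=23, fp=8)
```

    Reporting sensitivity and specificity alongside accuracy matters here because clinical datasets are typically imbalanced, and accuracy alone can hide poor detection of the minority (event) class.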

  9. Simulation Modeling of Resilience Assessment in Indonesian Fertiliser Industry Supply Networks

    NASA Astrophysics Data System (ADS)

    Utami, I. D.; Holt, R. J.; McKay, A.

    2018-01-01

    Supply network resilience is a significant aspect of the performance of the Indonesian fertiliser industry. Decision makers use risk assessment and port management reports to evaluate the availability of infrastructure. An opportunity was identified to incorporate both types of data into an approach for the measurement of resilience. A framework, based on a synthesis of literature and interviews with industry practitioners and covering both social and technical factors, is introduced. A simulation model was then built to allow managers to explore implications for resilience and predict levels of risk in different scenarios. Results of interviews with respondents from the Indonesian fertiliser industry indicated that the simulation model could be valuable in the assessment. This paper provides details of the simulation model for decision makers to explore levels of risk in supply networks. For practitioners, the model could be used by government to assess the current condition of supply networks in Indonesian industries. For academia, the approach provides a new application of agent-based models in research on supply network resilience and presents a real example of how agent-based modeling could be used to support the assessment approach.

  10. Proposals for enhanced health risk assessment and stratification in an integrated care scenario.

    PubMed

    Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep

    2016-04-15

    Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Responsible teams for regional data management in the five ACT regions. We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. 
    Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. Published by the BMJ Publishing Group Limited.

  11. Quantum chemistry in environmental pesticide risk assessment.

    PubMed

    Villaverde, Juan J; López-Goti, Carmen; Alcamí, Manuel; Lamsabhi, Al Mokhtar; Alonso-Prados, José L; Sandín-España, Pilar

    2017-11-01

    The scientific community and regulatory bodies worldwide currently promote the development of non-experimental tests that produce reliable data for pesticide risk assessment. The use of standard quantum chemistry methods could allow the development of tools to perform a first screening of compounds to be considered for experimental studies, improving the risk assessment. This results in a better distribution of resources and in better planning, allowing a more exhaustive study of pesticides and their metabolic products. The current paper explores the potential of quantum chemistry in modelling the toxicity and environmental behaviour of pesticides and their by-products by using electronic descriptors obtained computationally. Quantum chemistry has the potential to estimate the physico-chemical properties of pesticides, including certain chemical reaction mechanisms and their degradation pathways, allowing modelling of the environmental behaviour of both pesticides and their by-products. In this sense, theoretical methods can contribute to a more focused risk assessment of pesticides on the market, and may lead to higher quality and safer agricultural products. © 2017 Society of Chemical Industry.

  12. Space-Based Sensorweb Monitoring of Wildfires in Thailand

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Doubleday, Joshua; Mclaren, David; Davies, Ashley; Tran, Daniel; Tanpipat, Veerachai; Akaakara, Siri; Ratanasuwan, Anuchit; Mandl, Daniel

    2011-01-01

    We describe efforts to apply sensorweb technologies to the monitoring of forest fires in Thailand. In this approach, satellite data and ground reports are assimilated to assess the current state of the forest system in terms of forest fire risk, active fires, and likely progression of fires and smoke plumes. This current and projected assessment can then be used to actively direct sensors and assets to best acquire further information. This process operates continually, with new data updating models of fire activity and leading to further sensing and updating of models. As the fire activity is tracked, products such as active fire maps, burn scar severity maps, and alerts are automatically delivered to relevant parties. We describe the current state of the Thailand Fire Sensorweb, which utilizes the MODIS-based FIRMS system to track active fires and trigger Earth Observing One / Advanced Land Imager to acquire imagery and produce active fire maps, burn scar severity maps, and alerts. We describe ongoing work to integrate additional sensor sources and generate additional products.

  13. Predicted aircraft effects on stratospheric ozone

    NASA Technical Reports Server (NTRS)

    Ko, Malcolm K. W.; Wofsy, Steve; Kley, Dieter; Zhadin, Evgeny A.; Johnson, Colin; Weisenstein, Debra; Prather, Michael J.; Wuebbles, Donald J.

    1991-01-01

    The possibility that the current fleet of subsonic aircraft may already have caused detectable changes in both the troposphere and stratosphere has raised concerns about the impact of such operations on stratospheric ozone and climate. Recent interest in the operation of supersonic aircraft in the lower stratosphere has heightened such concerns. Previous assessments of impacts from proposed supersonic aircraft were based mostly on one-dimensional model results although a limited number of multidimensional models were used. In the past 15 years, our understanding of the processes that control the atmospheric concentrations of trace gases has changed dramatically. This better understanding was achieved through accumulation of kinetic data and field observations as well as development of new models. It would be beneficial to start examining the impact of subsonic aircraft to identify opportunities to study and validate the mechanisms that were proposed to explain the ozone responses. The two major concerns are the potential for a decrease in the column abundance of ozone leading to an increase in ultraviolet radiation at the ground, and redistribution of ozone in the lower stratosphere and upper troposphere leading to changes in the Earth's climate. Two-dimensional models were used extensively for ozone assessment studies, with a focus on responses to chlorine perturbations. There are problems specific to the aircraft issues that are not adequately addressed by the current models. This chapter reviews the current status of the research on aircraft impact on ozone with emphasis on immediate model improvements necessary for extending our understanding. The discussion will be limited to current and projected commercial aircraft that are equipped with air-breathing engines using conventional jet fuel. The impacts are discussed in terms of the anticipated fuel use at cruise altitude.

  14. Decision analysis and risk models for land development affecting infrastructure systems.

    PubMed

    Thekdi, Shital A; Lambert, James H

    2012-07-01

    Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates to a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development to the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.

  15. Machine Learning for Social Services: A Study of Prenatal Case Management in Illinois.

    PubMed

    Pan, Ian; Nolan, Laura B; Brown, Rashida R; Khan, Romana; van der Boor, Paul; Harris, Daniel G; Ghani, Rayid

    2017-06-01

    To evaluate the positive predictive value of machine learning algorithms for early assessment of adverse birth risk among pregnant women as a means of improving the allocation of social services. We used administrative data for 6457 women collected by the Illinois Department of Human Services from July 2014 to May 2015 to develop a machine learning model for adverse birth prediction and improve upon the existing paper-based risk assessment. We compared different models and determined the strongest predictors of adverse birth outcomes using positive predictive value as the metric for selection. Machine learning algorithms performed similarly, outperforming the current paper-based risk assessment by up to 36%; a refined paper-based assessment outperformed the current assessment by up to 22%. We estimate that these improvements will allow an additional 100 to 170 high-risk pregnant women screened for program eligibility each year to receive services that would otherwise have been unobtainable. Our analysis demonstrates the potential for machine learning to move government agencies toward a more data-informed approach to evaluating risk and providing social services. Overall, such efforts will improve the efficiency of allocating resource-intensive interventions.
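
    The positive-predictive-value criterion used above for model selection can be sketched in a few lines. The following is a minimal illustration on synthetic data; the cohort generator, score distributions, and the `ppv_at_k` helper are all hypothetical, not drawn from the Illinois dataset:

```python
import random

random.seed(0)

def simulate(n=1000):
    """Hypothetical cohort: one (paper_score, ml_score, outcome) tuple per woman."""
    cohort = []
    for _ in range(n):
        risk = random.random()
        outcome = 1 if random.random() < 0.1 + 0.4 * risk else 0
        paper_score = risk + random.gauss(0, 0.5)   # noisier checklist-style score
        ml_score = risk + random.gauss(0, 0.15)     # tighter model-based estimate
        cohort.append((paper_score, ml_score, outcome))
    return cohort

def ppv_at_k(cohort, key, k=100):
    """Positive predictive value among the k highest-scoring cases."""
    top = sorted(cohort, key=key, reverse=True)[:k]
    return sum(t[2] for t in top) / k

cohort = simulate()
print("paper PPV@100:", ppv_at_k(cohort, key=lambda t: t[0]))
print("ML PPV@100:   ", ppv_at_k(cohort, key=lambda t: t[1]))
```

    Ranking models by PPV among the top-k cases matches the resource-allocation framing of the study: only a limited number of women can be enrolled, so precision among those selected is what matters.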

  16. SHALLOW SUBSURFACE MAPPING BY ELECTROMAGNETIC SOUNDING IN THE 300 KHZ TO 30 MHZ RANGE: MODEL STUDIES AND PROTOTYPE SYSTEM ASSESSMENT

    EPA Science Inventory

    A new instrument designed for frequency-domain sounding in the depth range 0-10 m uses short coil spacings of 5 m or less and a frequency range of 300 kHz to 30 MHz. In this frequency range, both conduction currents (controlled by electrical conductivity) and displacement currents...

  17. Dyadic OPTION: Measuring perceptions of shared decision-making in practice.

    PubMed

    Melbourne, Emma; Roberts, Stephen; Durand, Marie-Anne; Newcombe, Robert; Légaré, France; Elwyn, Glyn

    2011-04-01

    Current models of the medical consultation emphasize shared decision-making (SDM), whereby the expertise of both the doctor and the patient is recognised and seen to contribute equally to the consultation. The evidence regarding the desirability and effectiveness of the SDM approach is often conflicting. It is proposed that the conflicts are due to the nature of assessment, with current assessments taken from the perspective of an outside observer. To empirically assess perceived involvement in the medical consultation using the dyadic OPTION instrument, 36 simulated medical consultations were organised between general practitioners and standardized patients, using the observer OPTION and the newly developed dyadic OPTION instruments. SDM behaviours observed in the consultations were seen to depend on both members of the doctor-patient dyad, rather than on each in isolation; thus a dyadic approach to measurement is supported. The current study highlights the necessity of a dyadic approach to assessment and introduces a novel research instrument: the dyadic OPTION instrument. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. GUIDANCE FOR THE PERFORMANCE EVALUATION OF THREE-DIMENSIONAL AIR QUALITY MODELING SYSTEMS FOR PARTICULATE MATTER AND VISIBILITY

    EPA Science Inventory

    The National Ambient Air Quality Standards for particulate matter (PM) and the federal regional haze regulations place some emphasis on the assessment of fine particle (PM2.5) concentrations. Current air quality models need to be improved and evaluated against observations to a...

  19. Teaching Using Computer Games

    ERIC Educational Resources Information Center

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  20. CONCEPTUAL BASIS FOR MULTI-ROUTE INTAKE DOSE MODELING USING AN ENERGY EXPENDITURE APPROACH

    EPA Science Inventory

    This paper provides the conceptual basis for a modeling logic that is currently being developed in the National Exposure Research Laboratory (NERL) of the U.S. Environmental Protection Agency ( EPA) for use in intake dose assessments involving substances that can enter the body...

  1. Multisite evaluation of APEX for water quality: 1. Best professional judgement parameterization

    USDA-ARS?s Scientific Manuscript database

    The Agricultural and Policy Environmental Extender (APEX) model is capable of estimating edge-of-field water, nutrient, and sediment transport and is used to assess the environmental impacts of management practices. The current practice is to fully calibrate the model for each site simulation, a tas...

  2. Predicting the regeneration of Appalachian hardwoods: adapting the REGEN model for the Appalachian Plateau

    Treesearch

    Lance A. Vickers; Thomas R. Fox; David L. Loftis; David A. Boucugnani

    2013-01-01

    The difficulty of achieving reliable oak (Quercus spp.) regeneration is well documented. Application of silvicultural techniques to facilitate oak regeneration largely depends on current regeneration potential. A computer model to assess regeneration potential based on existing advanced reproduction in Appalachian hardwoods was developed by David...

  3. Using Social Media to Assess Conceptualizations of Sexuality

    ERIC Educational Resources Information Center

    Zeglin, Robert J.; Mitchell, Julie

    2014-01-01

    There is currently no validated model explaining the variability of sexual expression. This has created a scenario where sexuality, as a construct, is purely intuitive. Sexuality educators have frequently presented the Circles of Sexuality, a model that contends that sexuality is a combination of intimacy, sensuality, sexual health/behaviors,…

  4. The AgMIP Coordinated Global and Regional Assessments (CGRA) of Climate Change Impacts on Agriculture and Food Security

    NASA Technical Reports Server (NTRS)

    Ruane, Alex; Rosenzweig, Cynthia; Elliott, Joshua; Antle, John

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has been working since 2010 to construct a protocol-based framework enabling regional assessments (led by regional experts and modelers) that can provide consistent inputs to global economic and integrated assessment models. These global models can then relay important global-level information that drive regional decision-making and outcomes throughout an interconnected agricultural system. AgMIP's community of nearly 800 climate, crop, livestock, economics, and IT experts has improved the state-of-the-art through model intercomparisons, validation exercises, regional integrated assessments, and the launch of AgMIP programs on all six arable continents. AgMIP is now launching Coordinated Global and Regional Assessments (CGRA) of climate change impacts on agriculture and food security to link global and regional crop and economic models using a protocol-based framework. The CGRA protocols are being developed to utilize historical observations, climate projections, and RCPs/SSPs from CMIP5 (and potentially CMIP6), and will examine stakeholder-driven agricultural development and adaptation scenarios to provide cutting-edge assessments of climate change's impact on agriculture and food security. These protocols will build on the foundation of established protocols from AgMIP's 30+ activities, and will emphasize the use of multiple models, scenarios, and scales to enable an accurate assessment of related uncertainties. The CGRA is also designed to provide the outputs necessary to feed into integrated assessment models (IAMs), nutrition and food security assessments, nitrogen and carbon cycle models, and additional impact-sector assessments (e.g., water resources, land-use, biomes, urban areas). 
This presentation will describe the current status of CGRA planning and initial prototype experiments to demonstrate key aspects of the protocols before wider implementation ahead of the IPCC Sixth Assessment Report.

  5. The AgMIP Coordinated Global and Regional Assessments (CGRA) of Climate Change Impacts on Agriculture and Food Security

    NASA Astrophysics Data System (ADS)

    Ruane, A. C.; Rosenzweig, C.; Antle, J. M.; Elliott, J. W.

    2015-12-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has been working since 2010 to construct a protocol-based framework enabling regional assessments (led by regional experts and modelers) that can provide consistent inputs to global economic and integrated assessment models. These global models can then relay important global-level information that drive regional decision-making and outcomes throughout an interconnected agricultural system. AgMIP's community of nearly 800 climate, crop, livestock, economics, and IT experts has improved the state-of-the-art through model intercomparisons, validation exercises, regional integrated assessments, and the launch of AgMIP programs on all six arable continents. AgMIP is now launching Coordinated Global and Regional Assessments (CGRA) of climate change impacts on agriculture and food security to link global and regional crop and economic models using a protocol-based framework. The CGRA protocols are being developed to utilize historical observations, climate projections, and RCPs/SSPs from CMIP5 (and potentially CMIP6), and will examine stakeholder-driven agricultural development and adaptation scenarios to provide cutting-edge assessments of climate change's impact on agriculture and food security. These protocols will build on the foundation of established protocols from AgMIP's 30+ activities, and will emphasize the use of multiple models, scenarios, and scales to enable an accurate assessment of related uncertainties. The CGRA is also designed to provide the outputs necessary to feed into integrated assessment models (IAMs), nutrition and food security assessments, nitrogen and carbon cycle models, and additional impact-sector assessments (e.g., water resources, land-use, biomes, urban areas). 
This presentation will describe the current status of CGRA planning and initial prototype experiments to demonstrate key aspects of the protocols before wider implementation ahead of the IPCC Sixth Assessment Report.

  6. Incorporating Nonchemical Stressors Into Cumulative Risk Assessments

    PubMed Central

    Rider, Cynthia V.; Dourson, Michael L.; Hertzberg, Richard C.; Mumtaz, Moiz M.; Price, Paul S.; Simmons, Jane Ellen

    2012-01-01

    The role of nonchemical stressors in modulating the human health risk associated with chemical exposures is an area of increasing attention. On 9 March 2011, a workshop titled “Approaches for Incorporating Nonchemical Stressors into Cumulative Risk Assessment” took place during the 50th Anniversary Annual Society of Toxicology Meeting in Washington D.C. Objectives of the workshop included describing the current state of the science from various perspectives (i.e., regulatory, exposure, modeling, and risk assessment) and presenting expert opinions on currently available methods for incorporating nonchemical stressors into cumulative risk assessments. Herein, distinct frameworks for characterizing exposure to, joint effects of, and risk associated with chemical and nonchemical stressors are discussed. PMID:22345310

  7. The environmental analysis of helicopter operations by Federal agencies: Current procedures and research needs

    NASA Technical Reports Server (NTRS)

    Smith, C. C.; Warner, D. B.; Dajani, J. S.

    1977-01-01

    The technical, economic, and environmental problems restricting commercial helicopter passenger operations are reviewed. The key considerations for effective assessment procedures are outlined and a preliminary model for the environmental analysis of helicopters is developed. It is recommended that this model, or some similar approach, be used as a common base for the development of comprehensive environmental assessment methods for each of the federal agencies concerned with helicopters. A description of the critical environmental research issues applicable to helicopters is also presented.

  8. Human induced pluripotent stem cell‐derived versus adult cardiomyocytes: an in silico electrophysiological study on effects of ionic current block

    PubMed Central

    Paci, M; Hyttinen, J; Rodriguez, B

    2015-01-01

    Background and Purpose Two new technologies are likely to revolutionize cardiac safety and drug development: in vitro experiments on human‐induced pluripotent stem cell‐derived cardiomyocytes (hiPSC‐CMs) and in silico human adult ventricular cardiomyocyte (hAdultV‐CM) models. Their combination was recently proposed as a potential replacement for the present hERG‐based QT study for pharmacological safety assessments. Here, we systematically compared in silico the effects of selective ionic current block on hiPSC‐CM and hAdultV‐CM action potentials (APs), to identify similarities/differences and to illustrate the potential of computational models as supportive tools for evaluating new in vitro technologies. Experimental Approach In silico AP models of ventricular‐like and atrial‐like hiPSC‐CMs and hAdultV‐CM were used to simulate the main effects of four degrees of block of the main cardiac transmembrane currents. Key Results Qualitatively, hiPSC‐CM and hAdultV‐CM APs showed similar responses to current block, consistent with results from experiments. However, quantitatively, hiPSC‐CMs were more sensitive to block of (i) L‐type Ca2+ currents due to the overexpression of the Na+/Ca2+ exchanger (leading to shorter APs) and (ii) the inward rectifier K+ current due to reduced repolarization reserve (inducing diastolic potential depolarization and repolarization failure). Conclusions and Implications In silico hiPSC‐CMs and hAdultV‐CMs exhibit a similar response to selective current blocks. However, overall hiPSC‐CMs show greater sensitivity to block, which may facilitate in vitro identification of drug‐induced effects. Extrapolation of drug effects from hiPSC‐CM to hAdultV‐CM and pro‐arrhythmic risk assessment can be facilitated by in silico predictions using biophysically‐based computational models. PMID:26276951

  9. Competency in health care management: a training model in epidemiologic methods for assessing and improving the quality of clinical practice through evidence-based decision making.

    PubMed

    Hudak, R P; Jacoby, I; Meyer, G S; Potter, A L; Hooper, T I; Krakauer, H

    1997-01-01

    This article describes a training model that focuses on health care management by applying epidemiologic methods to assess and improve the quality of clinical practice. The model's uniqueness is its focus on integrating clinical evidence-based decision making with fundamental principles of resource management to achieve attainable, cost-effective, high-quality health outcomes. The target students are current and prospective clinical and administrative executives who must optimize decision making at the clinical and managerial levels of health care organizations.

  10. Technosocial Modeling of IED Threat Scenarios and Attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Brothers, Alan J.; Coles, Garill A.

    2009-03-23

    This paper describes an approach for integrating sociological and technical models to develop a more complete threat assessment. Current approaches to analyzing and addressing threats tend to focus on the technical factors. This paper addresses the development of predictive models that encompass behavioral as well as these technical factors. Using improvised explosive device (IED) attacks as motivation, this model supports identification of intervention activities 'left of boom' as well as prioritization of attack modalities. We show how Bayes nets integrate social factors associated with IED attacks into a general threat model containing technical and organizational steps from planning through obtaining the IED to initiation of the attack. The social models are computationally-based representations of relevant social science literature that describes human decision making and physical factors. When combined with technical models, the resulting model provides improved knowledge integration into threat assessment for monitoring. This paper discusses the construction of IED threat scenarios, integration of diverse factors into an analytical framework for threat assessment, indicator identification for future threats, and future research directions.
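
    The kind of Bayes-net integration of social and technical factors described above can be illustrated with a toy two-parent network. All node names and probability values below are invented placeholders for illustration, not figures from the paper:

```python
# Toy discrete Bayes net: a social factor (group intent) and a technical
# factor (device-building capability) jointly condition attack likelihood.
P_intent = {True: 0.3, False: 0.7}
P_capability = {True: 0.2, False: 0.8}

# Conditional probability table P(attack | intent, capability)
P_attack = {
    (True, True): 0.6,
    (True, False): 0.1,
    (False, True): 0.05,
    (False, False): 0.01,
}

def marginal_attack():
    """Predictive query: P(attack), marginalizing over both parent nodes."""
    return sum(P_intent[i] * P_capability[c] * P_attack[(i, c)]
               for i in (True, False) for c in (True, False))

def posterior_intent_given_attack():
    """Diagnostic query by Bayes' rule: P(intent | attack observed)."""
    joint = sum(P_intent[True] * P_capability[c] * P_attack[(True, c)]
                for c in (True, False))
    return joint / marginal_attack()

print(round(marginal_attack(), 4))
print(round(posterior_intent_given_attack(), 4))
```

    The same two query types (predictive "how likely is an attack?" and diagnostic "given an attack indicator, which upstream factor is implicated?") are what make the integrated net useful for both monitoring and 'left of boom' intervention prioritization.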

  11. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314

  12. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.
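
    A fault-tree collision model of the general kind described above chains conditional event probabilities through an AND gate and scales the result to the population level. This minimal sketch uses invented placeholder probabilities and event names, not values from the study:

```python
# Generic fault-tree sketch of a collision chain: a collision requires the
# co-occurrence of several conditional events. All numbers are illustrative.
p_encounter = 0.05      # fish passes through the turbine's swept area
p_no_avoidance = 0.30   # fails to detect and avoid the rotor (low visibility)
p_strike = 0.40         # a blade actually strikes the fish, given passage

def collision_probability():
    # AND gate: every event in the chain must occur for a collision
    return p_encounter * p_no_avoidance * p_strike

def expected_collisions(n_passages):
    """Population-level expectation over independent passage events."""
    return n_passages * collision_probability()

print(collision_probability())      # per-passage risk
print(expected_collisions(10_000))  # expected strikes over many passages
```

    Keeping each branch probability as a separate, named node is what makes the fault-tree structure easy to validate and refine one event at a time, as the abstract notes.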

  13. The impact of composite AUC estimates on the prediction of systemic exposure in toxicology experiments.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2015-06-01

    Current toxicity protocols relate measures of systemic exposure (i.e. AUC, Cmax) as obtained by non-compartmental analysis to observed toxicity. A complicating factor in this practice is the potential bias in the estimates defining safe drug exposure. Moreover, it prevents the assessment of variability. The objective of the current investigation was therefore (a) to demonstrate the feasibility of applying nonlinear mixed effects modelling for the evaluation of toxicokinetics and (b) to assess the bias and accuracy in summary measures of systemic exposure for each method. Here, simulation scenarios were evaluated, which mimic toxicology protocols in rodents. To ensure differences in pharmacokinetic properties are accounted for, hypothetical drugs with varying disposition properties were considered. Data analysis was performed using non-compartmental methods and nonlinear mixed effects modelling. Exposure levels were expressed as area under the concentration versus time curve (AUC), peak concentrations (Cmax) and time above a predefined threshold (TAT). Results were then compared with the reference values to assess the bias and precision of parameter estimates. Higher accuracy and precision were observed for model-based estimates (i.e. AUC, Cmax and TAT), irrespective of group or treatment duration, as compared with non-compartmental analysis. Despite the focus of guidelines on establishing safety thresholds for the evaluation of new molecules in humans, current methods neglect uncertainty, lack of precision and bias in parameter estimates. The use of nonlinear mixed effects modelling for the analysis of toxicokinetics provides insight into variability and should be considered for predicting safe exposure in humans.
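
    The non-compartmental exposure measures named above (AUC, Cmax, TAT) can be computed directly from sampled concentration-time data. A minimal sketch, with illustrative sampling times and concentrations:

```python
# Non-compartmental summary measures of systemic exposure: AUC by the
# linear trapezoidal rule, Cmax, and time above a threshold (TAT).
# The data values below are illustrative, not from the study.
times = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]   # sampling times, h
conc = [0.0, 4.0, 6.0, 5.0, 2.5, 0.5]    # concentrations, mg/L

def auc_trapezoid(t, c):
    """Linear trapezoidal AUC over the sampled interval."""
    return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2.0
               for i in range(len(t) - 1))

def time_above_threshold(t, c, threshold):
    """TAT by linear interpolation between sampling points."""
    total = 0.0
    for i in range(len(t) - 1):
        t0, t1, c0, c1 = t[i], t[i + 1], c[i], c[i + 1]
        if c0 >= threshold and c1 >= threshold:
            total += t1 - t0
        elif c0 != c1 and (c0 >= threshold) != (c1 >= threshold):
            # interpolate the time at which the profile crosses the threshold
            tx = t0 + (threshold - c0) * (t1 - t0) / (c1 - c0)
            total += (t1 - tx) if c1 >= threshold else (tx - t0)
    return total

print("AUC:", auc_trapezoid(times, conc))
print("Cmax:", max(conc))
print("TAT>2:", time_above_threshold(times, conc, 2.0))
```

    These point estimates are exactly what the abstract argues is insufficient on its own: a mixed-effects model fitted across animals additionally yields the variability and uncertainty around them.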

  14. The main pillar: Assessment of space weather observational asset performance supporting nowcasting, forecasting, and research to operations.

    PubMed

    Posner, A; Hesse, M; St Cyr, O C

    2014-04-01

    Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations. Key points: the manuscript assesses current and near-future space weather assets; current assets are unreliable for forecasting severe geomagnetic storms; near-future assets will not improve the situation.

  15. The main pillar: Assessment of space weather observational asset performance supporting nowcasting, forecasting, and research to operations

    PubMed Central

    Posner, A; Hesse, M; St Cyr, O C

    2014-01-01

    Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations. Key points: the manuscript assesses current and near-future space weather assets; current assets are unreliable for forecasting severe geomagnetic storms; near-future assets will not improve the situation. PMID:26213516

  16. Improving an Assessment of Tidal Stream Energy Resource for Anchorage, Alaska

    NASA Astrophysics Data System (ADS)

    Xu, T.; Haas, K. A.

    2016-12-01

    Increasing global energy demand is driving the pursuit of new and innovative energy sources, leading to the need to assess and utilize alternative, productive and reliable energy resources. Tidal currents, characterized by periodicity and predictability, have long been explored and studied as a potential energy source, with studies focusing on many different locations with significant tidal ranges. However, a proper resource assessment cannot be accomplished without accurate knowledge of the spatial-temporal distribution and availability of tidal currents. Known for possessing one of the top tidal energy sources along the U.S. coastline, Cook Inlet, Alaska is the area of interest for this project. A previous regional-scale resource assessment has been completed; the present study focuses the assessment on the available power specifically near Anchorage while significantly improving its accuracy following IEC guidelines. The Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST) modeling system is configured to simulate the tidal flows with grid refinement techniques for a minimum of 32 days, encompassing an entire lunar cycle. Simulation results are validated by extracting tidal constituents with harmonic analysis and comparing tidal components with National Oceanic and Atmospheric Administration (NOAA) observations and predictions. Model calibration includes adjustments to bottom friction coefficients and the use of different tidal databases. After applying grid refinement, differences between NOAA observations and COAWST simulations decrease compared with results from a former study without grid refinement. Also, energy extraction is simulated at potential sites to study the impact on the tidal resources. This study demonstrates the enhancement of the resource assessment achieved by grid refinement in evaluating tidal energy near Anchorage within Cook Inlet, Alaska, the productivity that energy extraction can achieve, and the changes in tidal currents caused by energy extraction.
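The constituent-extraction step described in this record (harmonic analysis of a simulated record, then comparison of constituent amplitudes) reduces to ordinary least squares. The sketch below is a minimal illustration, not the IEC procedure: only two constituents (M2, S2) are fit, and the 32-day hourly record and its amplitudes are synthetic.

```python
import numpy as np

# Tidal constituent angular frequencies (rad/hour); the periods are physical constants.
M2 = 2 * np.pi / 12.4206012   # principal lunar semidiurnal
S2 = 2 * np.pi / 12.0         # principal solar semidiurnal

def harmonic_fit(t, u, freqs):
    """Fit u(t) = u0 + sum_k A_k cos(w_k t - phi_k) by linear least squares."""
    cols = [np.ones_like(t)]
    for w in freqs:
        cols += [np.cos(w * t), np.sin(w * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), u, rcond=None)
    amps = [float(np.hypot(coef[1 + 2 * k], coef[2 + 2 * k])) for k in range(len(freqs))]
    phases = [float(np.arctan2(coef[2 + 2 * k], coef[1 + 2 * k])) for k in range(len(freqs))]
    return coef[0], amps, phases

# 32 days of hourly "observations" (matching the study's minimum record length),
# synthesized from known amplitudes so recovery can be checked.
t = np.arange(0.0, 32 * 24.0, 1.0)
u = 0.5 + 1.8 * np.cos(M2 * t - 0.7) + 0.6 * np.cos(S2 * t - 1.2)
mean, amps, phases = harmonic_fit(t, u, [M2, S2])
print(f"M2 amplitude {amps[0]:.3f} m/s, S2 amplitude {amps[1]:.3f} m/s")
```

A real assessment would fit a much larger constituent set with nodal corrections, and vector (u, v) currents rather than a scalar series; 32 days is enough to separate M2 from S2 under the Rayleigh criterion.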

  17. A Web-Based System for Bayesian Benchmark Dose Estimation.

    PubMed

    Shao, Kan; Shapiro, Andrew J

    2018-01-11

    Benchmark dose (BMD) modeling is an important step in human health risk assessment and is used as the default approach to identify the point of departure for risk assessment. A probabilistic framework for dose-response assessment has been proposed and advocated by various institutions and organizations; therefore, a reliable tool is needed to provide distributional estimates for BMD and other important quantities in dose-response assessment. We developed an online system for Bayesian BMD (BBMD) estimation and compared results from this software with the U.S. Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). The system is built on a Bayesian framework featuring the application of Markov chain Monte Carlo (MCMC) sampling for model parameter estimation and BMD calculation, which makes the BBMD system fundamentally different from the currently prevailing BMD software packages. In addition to estimating the traditional BMDs for dichotomous and continuous data, the developed system is also capable of computing model-averaged BMD estimates. A total of 518 dichotomous and 108 continuous data sets extracted from the U.S. EPA's Integrated Risk Information System (IRIS) database (and similar databases) were used as testing data to compare the estimates from the BBMD and BMDS programs. The results suggest that the BBMD system may outperform the BMDS program in a number of aspects, including fewer failed BMD and BMDL calculations. The BBMD system is a useful alternative tool for estimating BMD, with additional functionalities for BMD analysis based on the most recent research. Most importantly, the BBMD has the potential to incorporate prior information to make dose-response modeling more reliable and can provide distributional estimates for important quantities in dose-response assessment, which greatly facilitates the current trend toward probabilistic risk assessment. https://doi.org/10.1289/EHP1289.
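A minimal sketch of the Bayesian BMD idea this record describes: sample dose-response parameters by MCMC, compute the BMD for each posterior draw, and read the BMDL off as a lower quantile of the resulting distribution. The data, the log-logistic model choice, the flat priors, and the tuning constants below are illustrative assumptions, not BBMD's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dichotomous data: dose, number tested, number responding.
doses = np.array([0.0, 10.0, 30.0, 100.0])
n = np.array([50, 50, 50, 50])
y = np.array([1, 5, 15, 40])

def probs(g, a, b):
    """Log-logistic model with background g: p(0) = g,
    p(d) = g + (1 - g) / (1 + exp(-(a + b*ln d))) for d > 0."""
    p = np.full(doses.shape, g)
    pos = doses > 0
    p[pos] = g + (1 - g) / (1 + np.exp(-(a + b * np.log(doses[pos]))))
    return p

def log_post(theta):
    """Binomial log-likelihood with flat priors on (logit g, a, log b)."""
    lg, a, lb = theta
    g, b = 1 / (1 + np.exp(-lg)), np.exp(lb)
    p = np.clip(probs(g, a, b), 1e-9, 1 - 1e-9)
    return np.sum(y * np.log(p) + (n - y) * np.log(1 - p))

# Random-walk Metropolis: 30,000 iterations, first 10,000 discarded as burn-in.
theta = np.array([-3.0, -6.0, 0.5])   # rough starting point near the fit
lp = log_post(theta)
samples = []
for i in range(30000):
    prop = theta + rng.normal(0.0, 0.15, 3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 10000:
        samples.append(theta.copy())
samples = np.array(samples)

# BMD for 10% extra risk: logistic(a + b*ln BMD) = BMR, so BMD = exp((logit(BMR) - a) / b).
bmr = 0.1
a_s, b_s = samples[:, 1], np.exp(samples[:, 2])
bmd = np.exp((np.log(bmr / (1 - bmr)) - a_s) / b_s)
bmd_median, bmdl = np.median(bmd), np.quantile(bmd, 0.05)
print(f"posterior median BMD ~ {bmd_median:.1f}, BMDL (5th percentile) ~ {bmdl:.1f}")
```

BBMD itself supports many dichotomous and continuous models plus model averaging; the sketch only shows why MCMC yields distributional BMD estimates rather than a single point value.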

  18. Towards a meaningful assessment of marine ecological impacts in life cycle assessment (LCA).

    PubMed

    Woods, John S; Veltman, Karin; Huijbregts, Mark A J; Verones, Francesca; Hertwich, Edgar G

    2016-01-01

    Human demands on marine resources and space are currently unprecedented and concerns are rising over observed declines in marine biodiversity. A quantitative understanding of the impact of industrial activities on the marine environment is thus essential. Life cycle assessment (LCA) is a widely applied method for quantifying the environmental impact of products and processes. LCA was originally developed to assess the impacts of land-based industries on mainly terrestrial and freshwater ecosystems. As such, impact indicators for major drivers of marine biodiversity loss are currently lacking. We review quantitative approaches for cause-effect assessment of seven major drivers of marine biodiversity loss: climate change, ocean acidification, eutrophication-induced hypoxia, seabed damage, overexploitation of biotic resources, invasive species and marine plastic debris. Our review shows that impact indicators can be developed for all identified drivers, albeit at different levels of coverage of cause-effect pathways and variable levels of uncertainty and spatial coverage. Modeling approaches to predict the spatial distribution and intensity of human-driven interventions in the marine environment are relatively well-established and can be employed to develop spatially-explicit LCA fate factors. Modeling approaches to quantify the effects of these interventions on marine biodiversity are less well-developed. We highlight specific research challenges to facilitate a coherent incorporation of marine biodiversity loss in LCA, thereby making LCA a more comprehensive and robust environmental impact assessment tool. Research challenges of particular importance include i) incorporation of the non-linear behavior of global circulation models (GCMs) within an LCA framework and ii) improving spatial differentiation, especially the representation of coastal regions in GCMs and ocean-carbon cycle models. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Assessment of the importance of the current-wave coupling in the shelf ocean forecasts

    NASA Astrophysics Data System (ADS)

    Jordà, G.; Bolaños, R.; Espino, M.; Sánchez-Arcilla, A.

    2006-10-01

    The effects of wave-current interactions on shelf ocean forecasts are investigated in the framework of the MFSTEP (Mediterranean Forecasting System Project Towards Environmental Predictions) project. A one-way sequential coupling approach is adopted to link the wave model (WAM) to the circulation model (SYMPHONIE). The coupling of waves and currents considers four main processes: wave refraction due to currents, surface wind drag and bottom drag modifications due to waves, and the wave-induced mass flux. The coupled modelling system is implemented in the southern Catalan shelf (NW Mediterranean), a region with characteristics similar to most of the Mediterranean shelves. The sensitivity experiments are run in a typical operational configuration. Wave refraction by currents appears to be of little relevance in a microtidal context such as the western Mediterranean. The main effect of waves on current forecasts is through the modification of the wind drag. The Stokes drift also plays a significant role due to its spatial and temporal characteristics. Finally, the enhanced bottom friction is only noticeable in the inner shelf.

  20. Exploring the role of fire, succession, climate, and weather on landscape dynamics using comparative modeling

    Treesearch

    Robert E. Keane; Geoffrey J. Cary; Mike D. Flannigan; Russell A. Parsons; Ian D. Davies; Karen J. King; Chao Li; Ross A. Bradstock; Malcolm Gill

    2013-01-01

    An assessment of the relative importance of vegetation change and disturbance as agents of landscape change under current and future climates would (1) provide insight into the controls of landscape dynamics, (2) help inform the design and development of coarse scale spatially explicit ecosystem models such as Dynamic Global Vegetation Models (DGVMs), and (3) guide...

  1. Integrated System Dynamics Modelling for water scarcity assessment: case study of the Kairouan region.

    PubMed

    Sušnik, Janez; Vamvakeridou-Lyroudia, Lydia S; Savić, Dragan A; Kapelan, Zoran

    2012-12-01

    A System Dynamics Model (SDM) assessing water scarcity and the potential impacts of socio-economic policies in a complex hydrological system is developed. The model, simulating water resources deriving from numerous catchment sources and demand from four sectors (domestic, industrial, agricultural, external pumping), contains multiple feedback loops and sub-models. The SDM is applied to the Merguellil catchment, Tunisia; this is the first time such an integrated model has been developed for the water-scarce Kairouan region, and the application represents an early step in filling a critical research gap. The focus of this paper is to (a) assess the applicability of SDM for assessing the evolution of a water-scarce catchment and (b) analyse the current and future behaviour of the catchment to evaluate water scarcity, focusing on understanding trends to inform policy. Baseline results indicate aquifer over-exploitation, agreeing with observed trends. If current policy and social behaviour continue, serious aquifer depletion is possible in the not too distant future, with implications for the economy and environment. In practice this is unlikely to occur, because policies preventing depletion would be implemented. Sensitivity tests were carried out to show which parameters most impacted aquifer behaviour. The results show non-linear model behaviour: some tests showed negligible change in behaviour, while others showed unrealistic exponential changes in demand, revenue and aquifer water volume. The policy-realistic parameters giving the greatest positive impact on model behaviour were those controlling per-capita domestic water demand and the pumped volume to coastal cities. All potentially beneficial policy options should be considered, giving the best opportunity for preservation of Kairouan aquifer water quantity/quality, ecologically important habitats and the agricultural socio-economic driver of regional development.
SDM is a useful tool for assessing the potential impacts of possible policy measures with respect to the evolution of water scarcity in critical regions. This work was undertaken for the EC FP7 project 'WASSERMed'. Copyright © 2012 Elsevier B.V. All rights reserved.
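The stock-and-flow logic of such a System Dynamics Model can be sketched in a few lines: the aquifer is a stock, recharge an inflow, and sectoral abstraction an outflow with a growth feedback. All parameters below are illustrative placeholders, not calibrated Merguellil values.

```python
def simulate_aquifer(years=30, demand_growth=0.02, demand_cut=0.0):
    """Minimal stock-and-flow sketch of an over-exploited aquifer.
    Parameters are illustrative, not calibrated to the Kairouan region."""
    stock = 500.0                       # aquifer storage (Mm^3)
    recharge = 20.0                     # mean annual recharge (Mm^3/yr)
    demand = 30.0 * (1.0 - demand_cut)  # initial total abstraction (Mm^3/yr)
    trajectory = [stock]
    for _ in range(years):
        pumped = min(demand, stock + recharge)       # cannot pump more than is available
        stock = max(stock + recharge - pumped, 0.0)  # stock update (Euler step, dt = 1 yr)
        demand *= 1.0 + demand_growth                # population/economic growth feedback
        trajectory.append(stock)
    return trajectory

baseline = simulate_aquifer()                 # business-as-usual: storage is exhausted
policy = simulate_aquifer(demand_cut=0.25)    # 25% demand-reduction policy
print(f"storage after 30 yr: baseline {baseline[-1]:.0f}, policy {policy[-1]:.0f} Mm^3")
```

Even this toy version reproduces the qualitative baseline finding: with demand above recharge and growing, the stock is depleted, while a demand-side policy preserves a substantial fraction of storage.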

  2. Integrating pixel- and polygon-based approaches to wildfire risk assessment: Application to a high-value watershed on the Pike and San Isabel National Forests, Colorado, USA

    Treesearch

    Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott

    2015-01-01

    We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...

  3. Issues and Challenges in Situation Assessment (Level 2 Fusion)

    DTIC Science & Technology

    2006-12-01

    Level 2 SAW: comprehension of the current situation; Level 3 SAW: projection of future states. Operators of dynamic systems use their SAW in determining... (1) build a model by either editing an existing template/model or creating a new one; (2) activate/de-activate existing models; or (3) view active models... and any evidence that has been associated with the model over time. Different political, military, economic, social, infrastructure, and information...

  4. Near Earth Asteroid Characteristics for Asteroid Threat Assessment

    NASA Technical Reports Server (NTRS)

    Dotson, Jessie

    2015-01-01

    Information about the physical characteristics of Near Earth Asteroids (NEAs) is needed to model behavior during atmospheric entry, to assess the risk of an impact, and to model possible mitigation techniques. The intrinsic properties of interest to entry and mitigation modelers, however, rarely are directly measureable. Instead we measure other properties and infer the intrinsic physical properties, so determining the complete set of characteristics of interest is far from straightforward. In addition, for the majority of NEAs, only the basic measurements exist so often properties must be inferred from statistics of the population of more completely characterized objects. We will provide an assessment of the current state of knowledge about the physical characteristics of importance to asteroid threat assessment. In addition, an ongoing effort to collate NEA characteristics into a readily accessible database for use by the planetary defense community will be discussed.

  5. A critical analysis of the numerical and analytical methods used in the construction of the lunar gravity potential model.

    NASA Astrophysics Data System (ADS)

    Tuckness, D. G.; Jost, B.

    1995-08-01

    Current knowledge of the lunar gravity field is presented. The various methods used in determining these gravity fields are investigated and analyzed. It is shown that weaknesses exist in the current models of the lunar gravity field. The dominant part of this weakness is caused by the lack of lunar tracking data (farside, polar areas), which makes modeling the total lunar potential difficult. Comparisons of the various lunar models reveal agreement in the low-order coefficients of the Legendre polynomial expansions. However, substantial differences between the models can exist in the higher-order harmonics. The main purpose of this study is to assess today's lunar gravity field models for use in tomorrow's lunar mission designs and operations.

  6. Neurocognitive performance and prior injury among U.S. Department of Defense military personnel.

    PubMed

    Proctor, Susan P; Nieto, Kenneth; Heaton, Kristin J; Dillon, Caitlin C; Schlegel, Robert E; Russell, Michael L; Vincent, Andrea S

    2015-06-01

    This study examined the neurocognitive performance of U.S. military personnel completing the Automated Neuropsychological Assessment Metrics (version 4) TBI Military (ANAM4 TBI-MIL) battery as part of the Department of Defense Neurocognitive Functional Assessment Program. Descriptive analyses utilizing the ANAM4TBI Military Performance Database were performed. We examined ANAM Composite Score (ACS) differences between five injury subgroups (no injury, brain injury with current symptoms, brain injury without current symptoms, nonbrain injury with current symptoms, and nonbrain injury without current symptoms) using general linear mixed modeling. Almost 11% (70,472/641,285) reported brain injury in the 4 years before assessment. The ACS differed significantly by injury group (p < 0.0001). In comparison to the no injury group, those reporting brain injury with current symptoms (d = -0.44) and nonbrain injury with current symptoms (d = -0.24) demonstrated significantly reduced ACS scores (p < 0.0001) indicative of reduced neurocognitive proficiency. In this population-based study of U.S. military personnel, neurocognitive performance was significantly associated with reported injury within the past 4 years among those experiencing current symptoms. Occupational programs focusing on prospective brain health of injured population groups are warranted. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  7. Assessment of the sustainability of bushmeat hunting based on dynamic bioeconomic models.

    PubMed

    Ling, S; Milner-Gulland, E J

    2006-08-01

    Open-access hunting is a dynamic system in which individual hunters respond to changes in system variables such as costs of hunting and prices obtained for their catch. Sustainability indices used by conservationists ignore these human processes and focus only on the biological sustainability of current offtake levels. This focus implicitly assumes that offtake is constant, says little about the actual sustainability of the system, and fails to provide any basis for predicting the impact of most feasible management interventions. A bioeconomic approach overcomes these limitations by explicitly integrating both the biological and human components of the system. We present a graphical representation of a simple bioeconomic model of bushmeat hunting and use it to demonstrate the importance of considering system dynamics when assessing sustainability. Our results show that commonly used static sustainability indices are often misleading. The best method to assess hunting sustainability is situation dependent, but characterizing supply and demand curves, even crudely, has greater potential than current approaches to provide robust predictions in the medium term.

  8. A New Trans-Disciplinary Approach to Regional Integrated Assessment of Climate Impact and Adaptation in Agricultural Systems (Invited)

    NASA Astrophysics Data System (ADS)

    Antle, J. M.; Valdivia, R. O.; Jones, J.; Rosenzweig, C.; Ruane, A. C.

    2013-12-01

    This presentation provides an overview of the new methods developed by researchers in the Agricultural Model Inter-comparison and Improvement Project (AgMIP) for regional climate impact assessment and analysis of adaptation in agricultural systems. This approach represents a departure from approaches in the literature in several dimensions. First, the approach is based on the analysis of agricultural systems (not individual crops) and is inherently trans-disciplinary: it is based on a deep collaboration among a team of climate scientists, agricultural scientists and economists to design and implement regional integrated assessments of agricultural systems. Second, in contrast to previous approaches that have imposed future climate on models based on current socio-economic conditions, this approach combines bio-physical and economic models with a new type of pathway analysis (Representative Agricultural Pathways) to parameterize models consistent with a plausible future world in which climate change would be occurring. Third, adaptation packages for the agricultural systems in a region are designed by the research team with a level of detail that is useful to decision makers, such as research administrators and donors, who are making agricultural R&D investment decisions. The approach is illustrated with examples from AgMIP's projects currently being carried out in Africa and South Asia.

  9. Assessment of the GHG Reduction Potential from Energy Crops Using a Combined LCA and Biogeochemical Process Models: A Review

    PubMed Central

    Jiang, Dong; Hao, Mengmeng; Wang, Qiao; Huang, Yaohuan; Fu, Xinyu

    2014-01-01

    The main purpose for developing biofuel is to reduce GHG (greenhouse gas) emissions, but the comprehensive environmental impact of such fuels is not clear. Life cycle analysis (LCA), as a complete comprehensive analysis method, has been widely used in bioenergy assessment studies. Great efforts have been directed toward establishing an efficient method for comprehensively estimating the greenhouse gas (GHG) emission reduction potential from the large-scale cultivation of energy plants by combining LCA with ecosystem/biogeochemical process models. LCA presents a general framework for evaluating the energy consumption and GHG emission from energy crop planting, yield acquisition, production, product use, and postprocessing. Meanwhile, ecosystem/biogeochemical process models are adopted to simulate the fluxes and storage of energy, water, carbon, and nitrogen in the soil-plant (energy crops) soil continuum. Although clear progress has been made in recent years, some problems still exist in current studies and should be addressed. This paper reviews the state-of-the-art method for estimating GHG emission reduction through developing energy crops and introduces in detail a new approach for assessing GHG emission reduction by combining LCA with biogeochemical process models. The main achievements of this study along with the problems in current studies are described and discussed. PMID:25045736

  10. Assessing the bioaccumulation potential of ionizable organic ...

    EPA Pesticide Factsheets

    The objective of the present study is to review current knowledge regarding the bioaccumulation potential of IOCs, with a focus on the availability of empirical data for fish. Aspects of the bioaccumulation potential of IOCs in fish that can be characterized relatively well include the pH-dependence of gill uptake and elimination, uptake in the gut, and sorption to phospholipids (membrane-water partitioning). Key challenges include the lack of empirical data for biotransformation and binding in plasma. Fish possess a diverse array of proteins which may transport IOCs across cell membranes. Except in a few cases, however, the significance of this transport for uptake and accumulation of environmental contaminants is unknown. Two case studies are presented. The first describes modeled effects of pH and biotransformation on bioconcentration of organic acids and bases, while the second employs an updated model to investigate factors responsible for accumulation of perfluoroalkylated acids (PFAA). The PFAA case study is notable insofar as it illustrates the likely importance of membrane transporters in the kidney and highlights the potential value of read across approaches. Recognizing the current need to perform bioaccumulation hazard assessments and ecological and exposure risk assessment for IOCs, we provide a tiered strategy that progresses (as needed) from conservative assumptions (models and associated data) to more sophisticated models requiring chemical-specific information.
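The pH-dependence of gill uptake mentioned above follows from chemical speciation: only the neutral form crosses membranes efficiently, and its fraction is given by the Henderson-Hasselbalch relation. A small sketch (the pKa and pH values are illustrative, not from the review):

```python
# Henderson-Hasselbalch speciation: fraction of an ionizable organic chemical in
# the neutral, membrane-permeable form, which drives pH-dependent gill uptake.
def neutral_fraction(pH, pKa, acid=True):
    if acid:
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))   # HA <-> A- + H+
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))       # BH+ <-> B + H+

# Illustrative values: a weak acid (pKa 4.8) at a gill-surface pH of 7.4
# is almost entirely ionized, so its membrane-permeable fraction is tiny.
print(f"{neutral_fraction(7.4, 4.8):.4f}")   # 0.0025
```

A shift of one pH unit changes this fraction roughly tenfold, which is why gill uptake of weak acids and bases is so sensitive to ambient pH.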

  11. Intergovernmental Panel on Climate Change (IPCC), Working Group 1, 1994: Modelling Results Relating Future Atmospheric CO2 Concentrations to Industrial Emissions (DB1009)

    DOE Data Explorer

    Enting, I. G.; Wigley, M. L.; Heimann, M.

    1995-01-01

    This database contains the results of various projections of the relation between future CO2 concentrations and future industrial emissions. These projections were contributed by groups from a number of countries as part of the scientific assessment for the report, "Radiative Forcing of Climate Change" (1994), issued by Working Group 1 of the Intergovernmental Panel on Climate Change. There were three types of calculations: (1) forward projections, calculating the atmospheric CO2 concentrations resulting from specified emissions scenarios; (2) inverse calculations, determining the emission rates that would be required to achieve stabilization of CO2 concentrations via specified pathways; (3) impulse response function calculations, required for determining Global Warming Potentials. The projections were extrapolations of global carbon cycle models from pre-industrial times (starting at 1765) to 2100 or 2200 A.D. There were two aspects to the exercise: (1) an assessment of the uncertainty due to uncertainties regarding the current carbon budget, and (2) an assessment of the uncertainties arising from differences between models. To separate these effects, a set of standard conditions was used to explore inter-model differences and then a series of sensitivity studies was used to explore the consequences of current uncertainties in the carbon cycle.
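The impulse response function calculations described here (calculation type 3, used together with type-1 forward projections) amount to convolving an emissions scenario with a pulse-response function. A sketch with hypothetical IRF coefficients, not the IPCC 1994 model values:

```python
import numpy as np

# Hypothetical 4-mode impulse response function (IRF): fraction of a CO2 pulse
# still airborne after t years. Coefficients are placeholders, NOT the 1994 values.
A = np.array([0.22, 0.30, 0.28, 0.20])        # mode weights (sum to 1)
tau = np.array([np.inf, 200.0, 30.0, 4.0])    # mode decay time scales, years

def irf(t):
    return float(sum(a if np.isinf(tc) else a * np.exp(-t / tc) for a, tc in zip(A, tau)))

def project(emissions):
    """Type-1 forward projection: ppm rise from annual emissions (GtC/yr); 1 ppm ~ 2.12 GtC."""
    rise = np.zeros(len(emissions))
    for i in range(len(emissions)):
        rise[i] = sum(emissions[k] * irf(i - k) for k in range(i + 1)) / 2.12
    return rise

emissions = np.full(100, 8.0)   # a century of constant 8 GtC/yr (illustrative scenario)
rise = project(emissions)
print(f"ppm rise after 1 yr: {rise[0]:.2f}, after 100 yr: {rise[-1]:.1f}")
```

The inverse calculation (type 2) runs the same convolution in reverse, solving for the emissions pathway that produces a prescribed concentration trajectory.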

  12. Indicators of Dysphagia in Aged Care Facilities.

    PubMed

    Pu, Dai; Murry, Thomas; Wong, May C M; Yiu, Edwin M L; Chan, Karen M K

    2017-09-18

    The current cross-sectional study aimed to investigate risk factors for dysphagia in elderly individuals in aged care facilities. A total of 878 individuals from 42 aged care facilities were recruited for this study. The dependent outcome was speech therapist-determined swallowing function. Independent factors were Eating Assessment Tool score, oral motor assessment score, Mini-Mental State Examination, medical history, and various functional status ratings. Binomial logistic regression was used to identify independent variables associated with dysphagia in this cohort. Two statistical models were constructed: Model 1 used variables from case files without the need for hands-on assessment, and Model 2 used variables obtained from hands-on assessment. Variables positively associated with dysphagia in Model 1 were male gender, total dependence for activities of daily living, need for feeding assistance, impaired mobility (requiring assistance to walk or using a wheelchair), and history of pneumonia. Variables positively associated with dysphagia in Model 2 were Mini-Mental State Examination score, edentulousness, and oral motor assessment score. Cognitive function, dentition, and oral motor function are significant indicators associated with the presence of swallowing difficulties in the elderly. When assessing the frail elderly, case file information can help clinicians identify individuals who may be suffering from dysphagia.
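The binomial logistic regression used to build Models 1 and 2 can be sketched on synthetic data. Here a hypothetical "Model 2" uses MMSE, edentulousness, and an oral-motor score as predictors, fit by Newton-Raphson (equivalently, IRLS); the cohort and coefficients are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort: predictors loosely mirroring "Model 2" (all values invented).
n = 800
mmse = rng.integers(5, 31, n).astype(float)        # Mini-Mental State Examination, 0-30
edentulous = rng.integers(0, 2, n).astype(float)   # 1 = no natural teeth
oral_motor = rng.uniform(0.0, 10.0, n)             # higher = poorer oral motor score
X = np.column_stack([np.ones(n), (mmse - 20.0) / 10.0, edentulous, oral_motor / 10.0])
true_beta = np.array([-1.5, -1.2, 0.8, 1.0])       # hypothetical "true" effects
p_true = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p_true).astype(float)   # 1 = dysphagia present

# Fit binomial logistic regression by Newton-Raphson.
beta = np.zeros(4)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    gradient = X.T @ (y - p)
    hessian = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hessian, gradient)

print("estimates:", np.round(beta, 2), "odds ratios:", np.round(np.exp(beta), 2))
```

The fitted coefficient signs match the study's qualitative finding: lower cognitive scores, missing dentition, and poorer oral-motor function all raise the odds of dysphagia.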

  13. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Brown, A.

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  14. Application of 3D reconstruction system in diabetic foot ulcer injury assessment

    NASA Astrophysics Data System (ADS)

    Li, Jun; Jiang, Li; Li, Tianjian; Liang, Xiaoyao

    2018-04-01

    To address the considerable measurement deviation of the transparency tracing and digital planimetry methods used in current clinical diabetic foot ulcer injury assessment, this paper proposes a 3D reconstruction system that can be used to obtain a foot model with a good-quality texture; injury assessment is then performed by measuring the reconstructed model. The system uses the Intel RealSense SR300 depth camera, which is based on infrared structured light, as the input device; the required data from different views are collected by moving the camera around the scanned object. The geometry model is reconstructed by fusing the collected data, then the mesh is subdivided to increase the number of mesh vertices and the color of each vertex is determined using a non-linear optimization; all colored vertices compose the surface texture of the reconstructed model. Experimental results indicate that the reconstructed model has millimeter-level geometric accuracy and a texture with few artifacts.

  15. Evolution of an Implementation-Ready Interprofessional Pain Assessment Reference Model

    PubMed Central

    Collins, Sarah A; Bavuso, Karen; Swenson, Mary; Suchecki, Christine; Mar, Perry; Rocha, Roberto A.

    2017-01-01

    Standards to increase consistency of comprehensive pain assessments are important for safety, quality, and analytics activities, including meeting Joint Commission requirements and learning the best management strategies and interventions for the current prescription opioid epidemic. In this study we describe the development and validation of a Pain Assessment Reference Model ready for implementation on EHR forms and flowsheets. Our process resulted in 5 successive revisions of the reference model, which more than doubled the number of data elements to 47. The organization of the model evolved during validation sessions with panels totaling 48 subject matter experts (SMEs) to include 9 sets of data elements, with one set recommended as a minimal data set. The reference model also evolved when implemented into EHR forms and flowsheets, indicating specifications such as cascading logic that are important to inform secondary use of data. PMID:29854125

  16. PACE: Probabilistic Assessment for Contributor Estimation- A machine learning-based assessment of the number of contributors in DNA mixtures.

    PubMed

    Marciano, Michael A; Adelman, Jonathan D

    2017-03-01

    The deconvolution of DNA mixtures remains one of the most critical challenges in the field of forensic DNA analysis. In addition, of all the data features required to perform such deconvolution, the number of contributors in the sample is widely considered the most important, and, if incorrectly chosen, the most likely to negatively influence the mixture interpretation of a DNA profile. Unfortunately, most current approaches to mixture deconvolution require the assumption that the number of contributors is known by the analyst, an assumption that can prove to be especially faulty when faced with increasingly complex mixtures of 3 or more contributors. In this study, we propose a probabilistic approach for estimating the number of contributors in a DNA mixture that leverages the strengths of machine learning. To assess this approach, we compare classification performances of six machine learning algorithms and evaluate the model from the top-performing algorithm against the current state of the art in the field of contributor number classification. Overall results show over 98% accuracy in identifying the number of contributors in a DNA mixture of up to 4 contributors. Comparative results showed 3-person mixtures had a classification accuracy improvement of over 6% compared to the current best-in-field methodology, and that 4-person mixtures had a classification accuracy improvement of over 20%. The Probabilistic Assessment for Contributor Estimation (PACE) also accomplishes classification of mixtures of up to 4 contributors in less than 1s using a standard laptop or desktop computer. Considering the high classification accuracy rates, as well as the significant time commitment required by the current state of the art model versus seconds required by a machine learning-derived model, the approach described herein provides a promising means of estimating the number of contributors and, subsequently, will lead to improved DNA mixture interpretation. 
Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
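A toy version of the contributor-number classification problem shows why machine learning works here: the count of distinct alleles observed per locus grows with the number of contributors, so even a nearest-centroid classifier over per-locus allele counts separates the classes reasonably well. PACE uses richer features and stronger learners; the locus count, allele model, and classifier below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

N_LOCI, N_ALLELES = 15, 10   # hypothetical panel: 15 loci, 10 equally likely alleles each

def simulate_mixture(n_contrib):
    """Feature vector: distinct-allele count at each locus for n_contrib diploid contributors."""
    counts = []
    for _ in range(N_LOCI):
        alleles = rng.integers(0, N_ALLELES, size=2 * n_contrib)
        counts.append(len(set(alleles.tolist())))
    return np.array(counts, dtype=float)

# Train a nearest-centroid classifier on simulated 1- to 4-person mixtures.
classes = [1, 2, 3, 4]
centroids = {c: np.mean([simulate_mixture(c) for _ in range(200)], axis=0) for c in classes}

def classify(x):
    return min(classes, key=lambda c: np.linalg.norm(x - centroids[c]))

# Evaluate on fresh simulated mixtures.
trials = [(c, simulate_mixture(c)) for c in classes for _ in range(100)]
acc = np.mean([classify(x) == c for c, x in trials])
print(f"accuracy over 1-4 contributors: {acc:.2f}")
```

Real casework features also include peak heights, allele sharing, and stutter, and errors concentrate on adjacent classes (3 vs. 4), mirroring the accuracy gap the paper reports for higher-order mixtures.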

  17. Assessment of Hybrid Coordinate Model Velocity Fields During Agulhas Return Current 2012 Cruise

    DTIC Science & Technology

    2013-06-01

    ...Forecasts, GDEM (Generalized Digital Environmental Model), GPS (Global Positioning System), HYCOM (HYbrid Coordinate Ocean Model), MICOM (Miami Isopycnal...) ...speed profiles was climatology from the Generalized Digital Environmental Model (GDEM; Teague et al. 1990). Made operational in 1999, the Modular... GDEM was the only tool a naval oceanographer had at his or her disposal to characterize ocean conditions where in-situ observations could not be...

  18. Current modeling practice may lead to falsely high benchmark dose estimates.

    PubMed

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point of departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model that defines a lower confidence bound of the BMD (BMDL) that, contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals, and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software use nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoidal models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point of departure are vital for health risk assessment. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
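
    As a hedged illustration of the mechanism this record describes, the sketch below simulates a four-group study from a hypothetical sigmoidal dose-effect curve and fits the non-sigmoidal exponential model Effect = a·e^(b·dose) by log-linear least squares. Every parameter (curve shape, dose placements, group size, CV, 5% benchmark response) is an invented assumption, and the confidence-bound step that yields the BMDL is omitted; the sketch shows only how the model-family choice biases the point-estimate BMD.

```python
import numpy as np

rng = np.random.default_rng(1)

# assumed "true" sigmoidal dose-effect curve (all parameters invented)
def true_effect(dose):
    return 1.0 + dose**2 / (dose**2 + 4.0**2)

BMR = 0.05  # benchmark response: 5% change from the control mean
grid = np.linspace(0, 20, 2001)
true_bmd = grid[np.argmax(true_effect(grid) >= true_effect(0) * (1 + BMR))]

# simulate one 4-group study: n = 10 animals per group, CV = 10%
group_doses = np.array([0.0, 2.0, 7.0, 20.0])
obs = [rng.normal(true_effect(d), 0.10 * true_effect(d), 10)
       for d in group_doses]
means = np.array([o.mean() for o in obs])

# fit the non-sigmoidal exponential model Effect = a * exp(b * dose)
# by log-linear least squares, then invert it for the BMD
b, log_a = np.polyfit(group_doses, np.log(means), 1)
bmd_exp = np.log(1 + BMR) / b
# under these assumptions the fitted BMD lands above the true BMD,
# i.e. on the non-protective side
```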

  19. Reliability of Current U.S. Modeling of Atmospheric Plumes Questioned

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    The deficiencies of atmospheric modeling used to determine the dispersion of chemical, radiological, or biological plumes came under fire during a 2 June hearing in the U.S. House of Representatives. Several members of Congress said at that time that current modeling efforts provide inadequate information to assess plumes that could result from a terrorist incident, warfare, or some other cause. Part of the hearing, held by the House Subcommittee on National Security, Emerging Threats, and International Relations, focused on two reports released just that day: one by the U.S. National Academy of Sciences (NAS), and the other by the U.S. General Accounting Office (GAO).

  20. Predictive Performance of Physiologically Based Pharmacokinetic Models for the Effect of Food on Oral Drug Absorption: Current Status

    PubMed Central

    Zhao, Ping; Pan, Yuzhuo; Wagner, Christian

    2017-01-01

    A comprehensive search of the literature and published US Food and Drug Administration reviews was conducted to assess whether physiologically based pharmacokinetic (PBPK) modeling could be prospectively used to predict the clinical food effect on oral drug absorption. Among the 48 resulting food effect predictions, ∼50% were predicted within 1.25-fold of the observed values, and 75% within 2-fold. Dissolution rate and precipitation time were commonly optimized parameters when PBPK modeling was not able to capture the food effect. The current work presents a knowledge base for documenting PBPK experience in predicting food effect. PMID:29168611
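
    The fold-based prediction summary used in this record can be reproduced in a few lines. The predicted and observed food-effect ratios below are made-up placeholders, not the reviewed FDA data; the point is the metric, a symmetric fold error with 1.25-fold and 2-fold acceptance bands.

```python
import numpy as np

# hypothetical predicted vs. observed food-effect ratios (AUC_fed/AUC_fasted)
predicted = np.array([1.10, 0.85, 2.40, 1.60, 0.55, 1.05, 3.10, 0.95])
observed  = np.array([1.00, 0.90, 1.50, 1.70, 0.70, 1.30, 1.80, 1.00])

# symmetric fold error: >= 1 by construction, 1.0 means a perfect prediction
fold_error = np.maximum(predicted / observed, observed / predicted)
within_125 = np.mean(fold_error <= 1.25)   # fraction within 1.25-fold
within_2   = np.mean(fold_error <= 2.0)    # fraction within 2-fold
```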

  1. Microdose Induced Drain Leakage Effects in Power Trench MOSFETs: Experiment and Modeling

    NASA Astrophysics Data System (ADS)

    Zebrev, Gennady I.; Vatuev, Alexander S.; Useinov, Rustem G.; Emeliyanov, Vladimir V.; Anashin, Vasily S.; Gorbunov, Maxim S.; Turin, Valentin O.; Yesenkov, Kirill A.

    2014-08-01

    We study experimentally and theoretically the microdose-induced drain-source leakage current in trench power MOSFETs under irradiation with high-LET heavy ions. We found experimentally that the cumulative increase of leakage current occurs by means of stochastic spikes, each corresponding to a strike of a single heavy ion on the MOSFET gate oxide. We simulate this effect with a proposed analytic model that describes (including via Monte Carlo methods) both the deterministic (cumulative dose) and stochastic (single event) aspects of the problem. Based on this model, a survival probability assessment in a space heavy-ion environment with high LETs is proposed.
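
    The stochastic (single-event) side of the record above can be sketched as a compound-Poisson Monte Carlo: ion strikes arrive randomly in fluence, each adding a random leakage spike, and the survival probability is the fraction of simulated devices whose cumulative leakage stays under a failure limit. The strike rate, spike distribution, and limit are all illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

# assumed parameters (illustrative only)
STRIKE_RATE = 0.05     # expected ion strikes per unit fluence
MEAN_SPIKE = 2.0e-9    # mean leakage increment per strike, amperes
MAX_FLUENCE = 1000.0   # total exposure

def total_leakage():
    # cumulative leakage at end of exposure: a Poisson number of strikes,
    # each contributing an exponentially distributed leakage spike
    n_strikes = rng.poisson(STRIKE_RATE * MAX_FLUENCE)
    return rng.exponential(MEAN_SPIKE, n_strikes).sum()

# survival probability: fraction of simulated devices whose cumulative
# leakage stays below an assumed failure threshold
LIMIT = 1.0e-7
samples = np.array([total_leakage() for _ in range(2000)])
survival_prob = np.mean(samples < LIMIT)
```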

  2. Sustainable dimension adaptation measure in green township assessment criteria

    NASA Astrophysics Data System (ADS)

    Yaman, R.; Thadaniti, S.; Ahmad, N.; Halil, F. M.; Nasir, N. M.

    2018-05-01

    Urbanized areas are typically the most significant sources of environmental degradation; thus, urban assessment criteria tools aiming at equally adapted sustainability dimensions need to be firmly embedded in benchmarking, planning, and design frameworks and upon occupancy. An integral systematic rating is needed in order to evaluate the performance of sustainable neighborhoods and to promote sustainable urban development. In this study, the Green Building Index Township Assessment Criteria (GBI-TAC) were measured against holistic sustainable dimension pillar (SDP) adaptation in order to assess and redefine the current sustainability assessment criteria for future sustainable neighborhood development (SND). The objective of the research is to find out whether the current GBI-TAC and its variables fulfil holistic SDP adaptation for sustainable neighborhood development in Malaysia. A stakeholder-inclusion approach was used to gather professional stakeholders' opinions regarding SDP adaptation for sustainable neighborhood development. The data were analysed using IBM SPSS AMOS 22 structural equation modelling. The findings suggest an SDP adaptation gap in the current GBI-TAC even though all core criteria supported SDP adaptation, leading to further review and refinement of future Neighborhood Assessment Criteria in Malaysia.

  3. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    NASA Astrophysics Data System (ADS)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time-consuming and costly. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications of using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
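
    The PoD analysis this record feeds can be sketched with the standard hit/miss formulation: a logistic PoD(a) curve fitted to binary detect/no-detect outcomes versus crack size, from which a quantile such as a90 (the size detected with 90% probability) is read off. The crack sizes and detection probabilities below are simulated placeholders, not the C-130/P-3 inspection data; the fit uses plain gradient ascent to stay dependency-free.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical hit/miss data: detection probability grows with crack size
sizes = rng.uniform(0.2, 3.0, 400)                      # crack length, mm
true_p = 1 / (1 + np.exp(-4.0 * (sizes - 1.0)))         # assumed true PoD
hits = rng.uniform(size=400) < true_p

# fit logistic PoD(a) = 1/(1 + exp(-(b0 + b1*a))) by gradient ascent
# on the Bernoulli log-likelihood
b0, b1 = 0.0, 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(b0 + b1 * sizes)))
    b0 += 1e-3 * np.sum(hits - p)
    b1 += 1e-3 * np.sum((hits - p) * sizes)

def pod(a):
    return 1 / (1 + np.exp(-(b0 + b1 * a)))

# a90: crack size detected with 90% probability under the fitted curve
a90 = (np.log(9.0) - b0) / b1
```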

  4. Scientific Foundations for an IUCN Red List of Ecosystems

    PubMed Central

    Keith, David A.; Rodríguez, Jon Paul; Rodríguez-Clark, Kathryn M.; Nicholson, Emily; Aapala, Kaisu; Alonso, Alfonso; Asmussen, Marianne; Bachman, Steven; Basset, Alberto; Barrow, Edmund G.; Benson, John S.; Bishop, Melanie J.; Bonifacio, Ronald; Brooks, Thomas M.; Burgman, Mark A.; Comer, Patrick; Comín, Francisco A.; Essl, Franz; Faber-Langendoen, Don; Fairweather, Peter G.; Holdaway, Robert J.; Jennings, Michael; Kingsford, Richard T.; Lester, Rebecca E.; Nally, Ralph Mac; McCarthy, Michael A.; Moat, Justin; Oliveira-Miranda, María A.; Pisanu, Phil; Poulin, Brigitte; Regan, Tracey J.; Riecken, Uwe; Spalding, Mark D.; Zambrano-Martínez, Sergio

    2013-01-01

    An understanding of risks to biodiversity is needed for planning action to slow current rates of decline and secure ecosystem services for future human use. Although the IUCN Red List criteria provide an effective assessment protocol for species, a standard global assessment of risks to higher levels of biodiversity is currently limited. In 2008, IUCN initiated development of risk assessment criteria to support a global Red List of ecosystems. We present a new conceptual model for ecosystem risk assessment founded on a synthesis of relevant ecological theories. To support the model, we review key elements of ecosystem definition and introduce the concept of ecosystem collapse, an analogue of species extinction. The model identifies four distributional and functional symptoms of ecosystem risk as a basis for assessment criteria: A) rates of decline in ecosystem distribution; B) restricted distributions with continuing declines or threats; C) rates of environmental (abiotic) degradation; and D) rates of disruption to biotic processes. A fifth criterion, E) quantitative estimates of the risk of ecosystem collapse, enables integrated assessment of multiple processes and provides a conceptual anchor for the other criteria. We present the theoretical rationale for the construction and interpretation of each criterion. The assessment protocol and threat categories mirror those of the IUCN Red List of species. A trial of the protocol on terrestrial, subterranean, freshwater and marine ecosystems from around the world shows that its concepts are workable and its outcomes are robust, that required data are available, and that results are consistent with assessments carried out by local experts and authorities. The new protocol provides a consistent, practical and theoretically grounded framework for establishing a systematic Red List of the world’s ecosystems. 
This will complement the Red List of species and strengthen global capacity to report on and monitor the status of biodiversity. PMID:23667454

  6. Bayesian algorithm implementation in a real time exposure assessment model on benzene with calculation of associated cancer risks.

    PubMed

    Sarigiannis, Dimosthenis A; Karakitsios, Spyros P; Gotti, Alberto; Papaloukas, Costas L; Kassomenos, Pavlos A; Pilidis, Georgios A

    2009-01-01

    The objective of the current study was the development of a reliable modeling platform to calculate in real time the personal exposure and the associated health risk for filling station employees, evaluating current environmental parameters (traffic, meteorological conditions and amount of fuel traded) determined by an appropriate sensor network. A set of Artificial Neural Networks (ANNs) was developed to predict the benzene exposure pattern for the filling station employees. Furthermore, a Physiologically Based Pharmacokinetic (PBPK) risk assessment model was developed in order to calculate the lifetime probability distribution of leukemia for the employees, fed by data obtained from the ANN model. A Bayesian algorithm was employed at crucial points of both model subcompartments. The application was evaluated in two filling stations (one urban and one rural). Among several algorithms available for the development of the ANN exposure model, Bayesian regularization provided the best results and seemed to be a promising technique for prediction of the exposure pattern of this occupational population group. In assessing the estimated leukemia risk, with the aim of providing a distribution curve based on the exposure levels and the differing susceptibility of the population, the Bayesian algorithm was a prerequisite for the Monte Carlo approach, which is integrated in the PBPK-based risk model. In conclusion, the modeling system described herein is capable of exploiting the information collected by the environmental sensors in order to estimate in real time the personal exposure and the resulting health risk for employees of gasoline filling stations.

  8. Principles of a multistack electrochemical wastewater treatment design

    NASA Astrophysics Data System (ADS)

    Elsahwi, Essam S.; Dawson, Francis P.; Ruda, Harry E.

    2018-02-01

    Electrolyzer stacks in a bipolar architecture (cells connected in series) are desirable since power provided to a stack can be transferred at high voltages and low currents, and thus the losses in the power bus can be reduced. The anode electrodes (active electrodes) considered as part of this study are single-sided, but there are manufacturing cost advantages to implementing double-sided anodes in the future. One of the main concerns with a bipolar stack implementation is the existence of leakage currents (bypass currents). The leakage current is associated with current paths that are not between adjacent anode and cathode pairs. This leads to non-uniform current density distributions, which compromise the electrochemical conversion efficiency of the stack and can also lead to unwanted side reactions. The objective of this paper is to develop modelling tools for a bipolar architecture consisting of two single-sided cells that use single-sided anodes. It is assumed that chemical reactions are single-electron-transfer rate limited and that diffusion and convection effects can be ignored. The design process consists of the following two steps: development of a large-signal model for the stack, and then the extraction of a small-signal model from the large-signal model. The small-signal model facilitates the design of a controller that satisfies current or voltage regulation requirements. A model has been developed for a single cell and two cells in series, but it can be generalized to more than two cells in series and to incorporate double-sided anode configurations in the future. The developed model is able to determine the leakage current and thus provide a quantitative assessment of the performance of the cell.
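
    The leakage-current idea in this record can be illustrated with a lumped-element sketch of a two-cell bipolar stack: besides the two cell resistances in series, the shared electrolyte manifold offers bypass paths from the central bipolar plate to each terminal, and solving KCL at the plate node shows the two cells carrying unequal current. All resistance and voltage values are illustrative assumptions, not the paper's large-signal model.

```python
# hypothetical lumped-element model (all values assumed for illustration)
R_CELL = 0.5          # ohms per cell
R_MAN_TOP = 25.0      # ohms, manifold bypass: top terminal to bipolar plate
R_MAN_BOT = 100.0     # ohms, manifold bypass: bipolar plate to bottom terminal
V = 4.0               # volts across the stack (bottom terminal grounded)

# KCL at the bipolar-plate node (unknown potential v1):
# (V - v1)/R_CELL + (V - v1)/R_MAN_TOP = v1/R_CELL + v1/R_MAN_BOT
g_in = 1 / R_CELL + 1 / R_MAN_TOP
g_out = 1 / R_CELL + 1 / R_MAN_BOT
v1 = V * g_in / (g_in + g_out)

i_cell_top = (V - v1) / R_CELL      # current through the upper cell
i_cell_bot = v1 / R_CELL            # current through the lower cell
i_man_top = (V - v1) / R_MAN_TOP    # leakage into the plate via the manifold
i_man_bot = v1 / R_MAN_BOT          # leakage out of the plate via the manifold

# the asymmetric bypass paths make the two cell currents unequal,
# i.e. a non-uniform current distribution along the stack
imbalance = i_cell_bot - i_cell_top
```

    The same nodal-analysis pattern generalizes to more cells in series: one KCL equation per interior plate, solved as a linear system.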

  9. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    PubMed Central

    Krasteva, Vessela TZ; Papazov, Sava P; Daskalov, Ivan K

    2003-01-01

    Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced currents distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. Comparative study for the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous media is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium. PMID:14693034

  10. Beyond JCAHO: using competency models to change healthcare organizations. Part 2: Developing competence assessment systems.

    PubMed

    Decker, P J; Strader, M K; Wise, R J

    1997-01-01

    In 1996, JCAHO required hospitals to assess, prove, track, and improve the competence of all employees. This article is the second part of a review of the concept of competency assessment and the implications of meeting and exceeding the JCAHO standards. Part 1 (in the previous issue of Hospital Topics) provided the theory of competence assessment, the current situation in JCAHO surveys, and an overview of the problems inherent in competency assessment. This part puts competence assessment in the context of quality improvement and provides the details of developing competence assessment systems.

  11. Evaluation of HFIR LEU Fuel Using the COMSOL Multiphysics Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Primm, Trent; Ruggles, Arthur; Freels, James D

    2009-03-01

    A finite element computational approach to simulation of the High Flux Isotope Reactor (HFIR) core thermal-fluid behavior is developed. These models were developed to facilitate design of a low enriched core for the HFIR, which will have different axial and radial flux profiles from the current HEU core and thus will require fuel and poison load optimization. This report outlines a stepwise implementation of this modeling approach using the commercial finite element code, COMSOL, with initial assessment of fuel, poison and clad conduction modeling capability, followed by assessment of mating of the fuel conduction models to a one-dimensional fluid model typical of legacy simulation techniques for the HFIR core. The model is then extended to fully couple 2-dimensional conduction in the fuel to a 2-dimensional thermo-fluid model of the coolant for a HFIR core cooling sub-channel with additional assessment of simulation outcomes. Finally, 3-dimensional simulations of a fuel plate and cooling channel are presented.

  12. Limitations of bootstrap current models

    DOE PAGES

    Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...

    2014-03-27

    We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans as well as core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling, an approximation inherent to both analytic models, is quantified. Moreover, the implications of the corrections from kinetic NEO simulations for MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.

  13. New drug adoption models: a review and assessment of future needs.

    PubMed

    Agrawal, M; Calantone, R J

    1995-01-01

    New drug products today are the key to survival in the pharmaceutical industry. However, the new product development process in the pharmaceutical industry also happens to be one of the riskiest and most expensive undertakings because of the huge research and development costs involved. Consequently, market forecasting of new pharmaceutical products takes on added importance if the formidable investments are to be recovered. New drug adoption models provide the marketer with a means to assess new product potential. Although several adoption models are available in the marketing literature for assessing the potential of common consumer goods, the unique characteristics of the prescription drug market make it necessary to examine the current state of pharmaceutical innovations. The purpose of this study, therefore, is to: (1) review new drug adoption models in the pharmaceutical literature, (2) evaluate the existing models of new drug adoption using the ten criteria for a good model as prescribed by Zaltman and Wallendorf (1983), and (3) provide an overall assessment and a "prescription" for better forecasting of new drug products.

  14. Global Times Call for Global Measures: Investigating Automated Essay Scoring in Linguistically-Diverse MOOCs

    ERIC Educational Resources Information Center

    Reilly, Erin D.; Williams, Kyle M.; Stafford, Rose E.; Corliss, Stephanie B.; Walkow, Janet C.; Kidwell, Donna K.

    2016-01-01

    This paper utilizes a case-study design to discuss global aspects of massive open online course (MOOC) assessment. Drawing from the literature on open-course models and linguistic gatekeeping in education, we position freeform assessment in MOOCs as both challenging and valuable, with an emphasis on current practices and student resources. We…

  15. ANALYTIC ELEMENT MODELING FOR SOURCE WATER ASSESSMENTS OF PUBLIC WATER SUPPLY WELLS: CASE STUDIES IN GLACIAL OUTWASH AND BASIN-AND-RANGE

    EPA Science Inventory

    Over the last 10 years the EPA has invested in analytic elements as a computational method used in public domain software supporting capture zone delineation for source water assessments and wellhead protection. The current release is called WhAEM2000 (wellhead analytic element ...

  16. Southern Forest Resource Assessment and Linkages to the National RPA

    Treesearch

    Fredrick Cubbage; Jacek Siry; Steverson Moffat; David N. Wear; Robert Abt

    1998-01-01

    We developed a Southern Forest Resource Assessment Consortium (SOFAC) in 1994, which is designed to enhance our capabilities to analyze and model the southern forest and timber resources. Southern growth and yield analyses prepared for the RPA via SOFAC indicate that substantial increases in timber productivity can occur given current technology. A survey about NIPF...

  17. Do Differing Types of Field Experiences Make a Difference in Teacher Candidates' Perceived Level of Competence?

    ERIC Educational Resources Information Center

    Caprano, Mary Margaret; Caprano, Robert M.; Helfeldt, Jack

    2010-01-01

    Little research has been conducted to directly compare the effectiveness of different models of field-based learning experiences and little has been reported on the use of the Interstate New Teacher Assessment and Support Consortium (INTASC) standards in establishing a formative assessment for teacher candidates (TCs). The current study used the…

  18. Towards a Personal Best: A Case for Introducing Ipsative Assessment in Higher Education

    ERIC Educational Resources Information Center

    Hughes, Gwyneth

    2011-01-01

    The central role that assessment plays is recognised in higher education, in particular how formative feedback guides learning. A model for effective feedback practice is used to argue that, in current schemes, formative feedback is often not usable because it is strongly linked to external criteria and standards, rather than to the processes of…

  19. Evaluating the ecological sustainability of a pinyon-juniper grassland ecosystem in northern Arizona

    Treesearch

    Reuben Weisz; Jack Triepke; Don Vandendriesche; Mike Manthei; Jim Youtz; Jerry Simon; Wayne Robbie

    2010-01-01

    In order to develop strategic land management plans, managers must assess current and future ecological conditions. Climate change has expanded the need to assess the sustainability of ecosystems and predict their conditions under different climate change and management scenarios using landscape dynamics simulation models. We present a methodology for developing a...

  20. Narrowing Historical Uncertainty: Probabilistic Classification of Ambiguously Identified Tree Species in Historical Forest Survey Data

    Treesearch

    David J. Mladenoff; Sally E. Dahir; Eric V. Nordheim; Lisa A. Schulte; Glenn G. Gutenspergen

    2002-01-01

    Historical data have increasingly become appreciated for insight into the past conditions of ecosystems. Uses of such data include assessing the extent of ecosystem change; deriving ecological baselines for management, restoration, and modeling; and assessing the importance of past conditions on the composition and function of current systems. One historical data set...

  1. Clinical Implications in the Treatment of Mania: Reducing Risk Behavior in Manic Patients

    ERIC Educational Resources Information Center

    Leahy, Robert L.

    2005-01-01

    Bipolar individuals engage in risky behavior during manic phases that contributes to their vulnerability to regret during their depressive phases. A cognitive model of risk assessment is proposed in which manic risk assessment is based on exaggeration of current and future resources, high utility for gains, low demands for information to assess…

  2. Re-defining and quantifying inorganic phosphate pools in the Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Abstract The Soil and Water Assessment Tool (SWAT), a large-scale hydrologic model, can be used to estimate the impact of land management practices on phosphate (P) loading in streams and water bodies. Three inorganic soil P pools (labile, active, and stable P) are currently defined in the SWAT mo...

  3. SOFRA and RPA: two views of the future of southern timber supply.

    Treesearch

    Darius Adams; John Mills; Ralph Alig; Richard Haynes

    2005-01-01

    Two recent studies provide alternative views of the current state and future prospects of southern forests and timber supply: the Southern Forest Resource Assessment (SOFRA) and the Fifth Resources Planning Act Timber Assessment (RPA). Using apparently comparable data but different models and methods, the studies portray futures that in some aspects are quite similar...

  4. Using a Modified Pyramidal Training Model to Teach Special Education Teachers to Conduct Trial-Based Functional Analyses

    ERIC Educational Resources Information Center

    Kunnavatana, S. Shanun; Bloom, Sarah E.; Samaha, Andrew L.; Lignugaris/Kraft, Benjamin; Dayton, Elizabeth; Harris, Shannon K.

    2013-01-01

    Functional behavioral assessments are commonly used in school settings to assess and develop interventions for problem behavior. The trial-based functional analysis is an approach that teachers can use in their classrooms to identify the function of problem behavior. The current study evaluates the effectiveness of a modified pyramidal training…

  5. A resistive mesh phantom for assessing the performance of EIT systems.

    PubMed

    Gagnon, Hervé; Cousineau, Martin; Adler, Andy; Hartinger, Alzbeta E

    2010-09-01

    Assessing the performance of electrical impedance tomography (EIT) systems usually requires a phantom for validation, calibration, or comparison purposes. This paper describes a resistive mesh phantom to assess the performance of EIT systems while taking into account cabling stray effects similar to in vivo conditions. This phantom is built with 340 precision resistors on a printed circuit board representing a 2-D circular homogeneous medium. It also integrates equivalent electrical models of the Ag/AgCl electrode impedances. The parameters of the electrode models were fitted from impedance curves measured with an impedance analyzer. The technique used to build the phantom is general and applicable to phantoms of arbitrary shape and conductivity distribution. We describe three performance indicators that can be measured with our phantom for every measurement of an EIT data frame: SNR, accuracy, and modeling accuracy. These performance indicators were evaluated on our EIT system under different frame rates and applied current intensities. The performance indicators are dependent on frame rate, operating frequency, applied current intensity, measurement strategy, and intermodulation distortion when performing simultaneous measurements at several frequencies. These parameter values should, therefore, always be specified when reporting performance indicators to better appreciate their significance.
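
    The SNR indicator described above is commonly computed per measurement over repeated data frames. A minimal sketch of one such definition (the frame size, repetition count, and dB convention are assumptions for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical sketch: per-measurement SNR over repeated EIT data frames.
# The 208-measurement frame and 100 repetitions are assumed values.
rng = np.random.default_rng(0)
true_frame = np.linspace(1.0, 2.0, 208)         # one value per measurement
frames = true_frame + 0.01 * rng.standard_normal((100, 208))  # noisy repeats

signal = frames.mean(axis=0)                    # mean over repetitions
noise = frames.std(axis=0, ddof=1)              # std over repetitions
snr_db = 20 * np.log10(np.abs(signal) / noise)  # per-measurement SNR in dB
```

    As the abstract notes, such indicators only carry meaning when reported together with frame rate, operating frequency, and applied current intensity.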

  6. Making the Case for Reusable Booster Systems: The Operations Perspective

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2012-01-01

    Presentation to the Aeronautics and Space Engineering Board, National Research Council, Reusable Booster System: Review and Assessment Committee. Addresses: the criteria and assumptions used in the formulation of current RBS plans; the methodologies used in the current cost estimates for RBS; the modeling methodology used to frame the business case for an RBS capability, including the data used in the analysis, the models' robustness if new data become available, and the impact of unclassified government data that was previously unavailable and which will be supplied by the USAF; the technical maturity of key elements critical to RBS implementation; and the ability of current technology development plans to meet technical readiness milestones.

  7. Using an integral projection model to assess the effect of temperature on the growth of gilthead seabream Sparus aurata.

    PubMed

    Heather, F J; Childs, D Z; Darnaude, A M; Blanchard, J L

    2018-01-01

    Accurate information on the growth rates of fish is crucial for fisheries stock assessment and management. Empirical life history parameters (von Bertalanffy growth) are widely fitted to cross-sectional size-at-age data sampled from fish populations. This method often assumes that environmental factors affecting growth remain constant over time. The current study utilized longitudinal life history information contained in otoliths from 412 juveniles and adults of gilthead seabream, Sparus aurata, a commercially important species fished and farmed throughout the Mediterranean. Historical annual growth rates over 11 consecutive years (2002-2012) in the Gulf of Lions (NW Mediterranean) were reconstructed to investigate the effect of temperature variations on the annual growth of this fish. S. aurata growth was modelled linearly as the relationship between otolith size at year t against otolith size at the previous year t-1. The effect of temperature on growth was modelled with linear mixed effects models and a simplified linear model to be implemented in a cohort Integral Projection Model (cIPM). The cIPM was used to project S. aurata growth, year to year, under different temperature scenarios. Our results determined current increasing summer temperatures to have a negative effect on S. aurata annual growth in the Gulf of Lions. They suggest that global warming already has and will further have a significant impact on S. aurata size-at-age, with important implications for age-structured stock assessments and reference points used in fisheries.
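
    The linear growth relationship described above (otolith size at year t regressed on size at year t-1, with a temperature effect) can be sketched with ordinary least squares. All data and coefficient values below are synthetic assumptions, not the study's estimates:

```python
import numpy as np

# Hedged sketch of a linear otolith-increment model with a temperature term.
# Simulated data only; the negative temperature coefficient mirrors the
# paper's qualitative finding, not its fitted values.
rng = np.random.default_rng(1)
n = 300
size_prev = rng.uniform(1.0, 5.0, n)   # otolith size at year t-1 (mm, assumed)
temp = rng.uniform(20.0, 26.0, n)      # summer temperature (deg C, assumed)
size_now = 0.8 + 0.9 * size_prev - 0.02 * temp + 0.05 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), size_prev, temp])
beta, *_ = np.linalg.lstsq(X, size_now, rcond=None)  # OLS fit
# beta[2] is the temperature effect on annual growth (negative here)
```

    In the actual study this regression is embedded in mixed-effects models and a cohort Integral Projection Model, which then propagates the fitted growth kernel across years and temperature scenarios.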

  8. Design of electrodes and current limits for low frequency electrical impedance tomography of the brain.

    PubMed

    Gilad, O; Horesh, L; Holder, D S

    2007-07-01

    For the novel application of recording resistivity changes related to neuronal depolarization in the brain with electrical impedance tomography, optimal recording is with applied currents below 100 Hz, which might cause neural stimulation of the skin or underlying brain. The purpose of this work was to develop a method for applying low frequency currents to the scalp which delivered the maximum current without significant stimulation of the skin or underlying brain. We propose a recessed electrode design which enabled the current injected with an acceptable skin sensation to be increased from 100 μA, using EEG electrodes, to 1 mA in 16 normal volunteers. The effect of current delivered to the brain was assessed with an anatomically realistic finite element model of the adult head. The modelled peak cerebral current density was 0.3 A/m², which was 5- to 25-fold less than the threshold for stimulation of the brain estimated from a literature review.

  9. Current and future flood risk to railway infrastructure in Europe

    NASA Astrophysics Data System (ADS)

    Bubeck, Philip; Kellermann, Patric; Alfieri, Lorenzo; Feyen, Luc; Dillenardt, Lisa; Thieken, Annegret H.

    2017-04-01

    Railway infrastructure plays an important role in the transportation of freight and passengers across the European Union. According to Eurostat, more than four billion passenger-kilometres were travelled on national and international railway lines of the EU28 in 2014. To further strengthen transport infrastructure in Europe, the European Commission will invest another € 24.05 billion in the transnational transport network until 2020 as part of its new transport infrastructure policy (TEN-T), including railway infrastructure. Floods pose a significant risk to infrastructure elements. Damage data from recent flood events in Europe show that infrastructure losses can make up a considerable share of overall losses. For example, damage to state and municipal infrastructure in the federal state of Saxony (Germany) accounted for nearly 60% of overall losses during the large-scale event in June 2013. Especially in mountainous areas with little usable space available, roads and railway lines often follow floodplains or are located along steep and unstable slopes. In Austria, for instance, the flood of 2013 caused € 75 million of direct damage to railway infrastructure. Despite the importance of railway infrastructure and its exposure to flooding, assessments of potential damage and risk (i.e. probability * damage) are still in their infancy compared with other sectors, such as the residential or industrial sector. Infrastructure-specific assessments at the regional scale are largely lacking. Regional assessment of potential damage to railway infrastructure has been hampered by a lack of infrastructure-specific damage models and by data availability. The few available regional approaches have used damage models that assess damage to various infrastructure elements (e.g. roads, railway, airports and harbours) using one aggregated damage function and cost estimate. Moreover, infrastructure elements are often considerably underrepresented in regional land cover data, such as CORINE, due to their line shapes. To assess current and future damage and risk to railway infrastructure in Europe, we apply the damage model RAIL ('RAilway Infrastructure Loss') that was specifically developed for railway infrastructure using empirical damage data. To adequately and comprehensively capture the line-shaped features of railway infrastructure, the assessment makes use of the open-access data set of openrailway.org. Current and future flood hazard in Europe is obtained with the LISFLOOD-based pan-European flood hazard mapping procedure combined with ensemble projections of extreme streamflow for the current century based on EURO-CORDEX RCP 8.5 climate scenarios. The presentation shows first results of the combination of the hazard data and the model RAIL for Europe.

  10. The Path to Graduation: A Model Interactive Web Site Design Supporting Doctoral Students

    ERIC Educational Resources Information Center

    Simmons-Johnson, Nicole

    2012-01-01

    Objective. This 2-phase mixed method study assessed 2nd-year doctoral students' and dissertation students' perceptions of the current Graduate School of Education dissertation support Web site, with implications for designing a model dissertation support Web site. Methods. Phase 1 collected quantitative and qualitative data through an…

  11. Assessment of important SPECIATE Profiles in EPA’s Emissions Modeling Platform and Current Data Gaps (US EPA 2017 International Emissions Inventory Conference)

    EPA Science Inventory

    The US Environmental Protection Agency (EPA)’s SPECIATE database contains speciation profiles for both particulate matter (PM) and volatile organic compounds (VOCs) that are key inputs for creating speciated emission inventories for air quality modeling. The objective of th...

  12. Hydrothermal assessment of temporal variability in seedbed microclimate

    Treesearch

    Stuart P. Hardegree; Corey A. Moffet; Gerald N. Flerchinger; Jaepil Cho; Bruce A. Roundy; Thomas A. Jones; Jeremy J. James; Patrick E. Clark; Frederick B. Pierson

    2013-01-01

    The microclimatic requirements for successful seedling establishment are much more restrictive than those required for adult plant survival. The purpose of the current study was to use hydrothermal germination models and a soil energy and water flux model to evaluate intra- and interannual variability in seedbed microclimate relative to potential germination response...

  13. Looking beyond Psychopathology: The Dual-Factor Model of Mental Health in Youth

    ERIC Educational Resources Information Center

    Suldo, Shannon M.; Shaffer, Emily J.

    2008-01-01

    In a dual-factor model of mental health (cf. Greenspoon & Saklofske, 2001), assessments of positive indicators of wellness (i.e., subjective well-being--SWB) are coupled with traditional negative indicators of illness (i.e., psychopathology) to comprehensively measure mental health. The current study examined the existence and utility of a…

  14. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  15. A Watershed-based spatially-explicit demonstration of an Integrated Environmental Modeling Framework for Ecosystem Services in the Coal River Basin (WV, USA)

    EPA Science Inventory

    We demonstrate a spatially-explicit regional assessment of current condition of aquatic ecoservices in the Coal River Basin (CRB), with limited sensitivity analysis for the atmospheric contaminant mercury. The integrated modeling framework (IMF) forecasts water quality and quant...

  16. Assessing Intelligence in Children and Youth Living in the Netherlands

    ERIC Educational Resources Information Center

    Hurks, Petra P. M.; Bakker, Helen

    2016-01-01

    In this article, we briefly describe the history of intelligence test use with children and youth in the Netherlands, explain which models of intelligence guide decisions about test use, and detail how intelligence tests are currently being used in Dutch school settings. Empirically supported and theoretical models studying the structure of human…

  17. Defining and Comparing the Reading Comprehension Construct: A Cognitive-Psychometric Modeling Approach

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Gorin, Joanna S.; Tatsuoka, Kikumi K.

    2011-01-01

    As a construct definition, the current study develops a cognitive model describing the knowledge, skills, and abilities measured by critical reading test items on a high-stakes assessment used for selection decisions in the United States. Additionally, in order to establish generalizability of the construct meaning to other similarly structured…

  18. An Investigation of Sample Size Splitting on ATFIND and DIMTEST

    ERIC Educational Resources Information Center

    Socha, Alan; DeMars, Christine E.

    2013-01-01

    Modeling multidimensional test data with a unidimensional model can result in serious statistical errors, such as bias in item parameter estimates. Many methods exist for assessing the dimensionality of a test. The current study focused on DIMTEST. Using simulated data, the effects of sample size splitting for use with the ATFIND procedure for…

  19. Evaluating Cognitive Theory: A Joint Modeling Approach Using Responses and Response Times

    ERIC Educational Resources Information Center

    Klein Entink, Rinke H.; Kuhn, Jorg-Tobias; Hornke, Lutz F.; Fox, Jean-Paul

    2009-01-01

    In current psychological research, the analysis of data from computer-based assessments or experiments is often confined to accuracy scores. Response times, although being an important source of additional information, are either neglected or analyzed separately. In this article, a new model is developed that allows the simultaneous analysis of…

  20. Using eddy covariance and flux partitioning to assess basal, soil, and stress coefficients for crop evapotranspiration models

    USDA-ARS?s Scientific Manuscript database

    Current approaches to scheduling crop irrigation from reference evapotranspiration (ET0) recommend a dual-coefficient approach with basal (Kcb) and soil (Ke) coefficients, along with a stress coefficient (Ks), to model crop evapotranspiration (ETc) [e.g. ETc=(Ks*Kcb+Ke)*ET0]. However, indepe...
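
    The bracketed equation can be written as a one-line function; the coefficient and ET0 values in the example are illustrative assumptions, not from the manuscript:

```python
# Minimal sketch of the dual-coefficient crop ET equation quoted above:
# ETc = (Ks*Kcb + Ke) * ET0. All numeric values are illustrative only.
def crop_et(et0_mm_day, kcb, ke, ks=1.0):
    """Crop evapotranspiration (mm/day) from reference ET and coefficients."""
    return (ks * kcb + ke) * et0_mm_day

# Example: unstressed crop (Ks=1) with assumed mid-season coefficients
etc = crop_et(et0_mm_day=6.0, kcb=1.10, ke=0.15)  # ≈ 7.5 mm/day
```

    The Ks term scales only the basal (transpiration) component, so water stress reduces ETc without suppressing the soil evaporation contribution Ke.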

  1. Judgment Research and the Dimensional Model of Personality

    ERIC Educational Resources Information Center

    Garb, Howard N.

    2008-01-01

    Comments on the original article "Plate tectonics in the classification of personality disorder: Shifting to a dimensional model," by T. A. Widiger and T. J. Trull. The purpose of this comment is to address (a) whether psychologists know how personality traits are currently assessed by clinicians and (b) the reliability and validity of those…

  2. Diagnostic Classification Models: Which One Should I Use?

    ERIC Educational Resources Information Center

    Jiao, Hong

    2009-01-01

    Diagnostic assessment is currently an active research area in educational measurement. Literature related to diagnostic modeling has been in existence for several decades, but a great deal of research has been conducted within the last decade or so, especially within the last five years. The author summarizes the key components in the application…

  3. Effects of Prompting Multiple Solutions for Modelling Problems on Students' Performance

    ERIC Educational Resources Information Center

    Schukajlow, Stanislaw; Krug, André; Rakoczy, Katrin

    2015-01-01

    Prompting students to construct multiple solutions for modelling problems with vague conditions has been found to be an effective way to improve students' performance on interest-oriented measures. In the current study, we investigated the influence of this teaching element on students' performance. To assess the impact of prompting multiple…

  4. A coupled modeling approach to assess the impact of fuel treatments on post-wildfire runoff and erosion

    USDA-ARS?s Scientific Manuscript database

    The hydrological consequences of wildfires are some of the most significant and long-lasting effects. Since wildfire severity impacts post-fire hydrological response, fuel treatments can be a useful tool for land managers to moderate this response. However, current models focus on only one aspect of...

  5. Chlorofluoromethanes and the Stratosphere

    NASA Technical Reports Server (NTRS)

    Hudson, R. D. (Editor)

    1977-01-01

    This report presents the conclusions of a workshop held by the National Aeronautics and Space Administration to assess the current knowledge of the impact of chlorofluoromethane release in the troposphere on stratospheric ozone concentrations. The following topics are discussed: (1) laboratory measurements; (2) ozone measurements and trends; (3) minor species and aerosol measurements; (4) one-dimensional modeling; and (5) multidimensional modeling.

  6. Human Behavior Based Exploratory Model for Successful Implementation of Lean Enterprise in Industry

    ERIC Educational Resources Information Center

    Sawhney, Rupy; Chason, Stewart

    2005-01-01

    Currently available Lean tools such as Lean Assessments, Value Stream Mapping, and Process Flow Charting focus on system requirements and overlook human behavior. A need is felt for a tool that allows one to baseline personnel, determine personnel requirements and align system requirements with personnel requirements. Our exploratory model--The…

  7. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    ERIC Educational Resources Information Center

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  8. Response to Nuclear Regulatory Commission's ten questions pertaining to site-specific models for use in the license termination rule: Multimedia Environmental Pollutant Assessment System (MEPAS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buck, J.W.; Whelan, G.; Strenge, D.L.

    This paper responds to ten questions posed by the US Nuclear Regulatory Commission (NRC) at the Modeling Workshop held November 13 and 14, 1997. The ten questions were developed in advance of the workshop to allow model developers to prepare a presentation at the Workshop. This paper is an expanded version of the Multimedia Environmental Pollutant Assessment System (MEPAS) presentation given at the Modeling Workshop by Pacific Northwest National Laboratory (PNNL) staff. This paper is organized by the ten questions asked by the NRC, each section devoted to a single question. The current version of the methodology is MEPAS 3.2 (NRC 1997), and the discussion in this paper pertains to that version. In some cases, MEPAS 4.0, which is currently being developed under the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) (Whelan et al. 1997), will be referenced to inform the reader of potential capabilities in the near future. A separate paper included in this document discusses the FRAMES concept.

  9. Can the REBT theory explain loneliness? Theoretical and clinical applications.

    PubMed

    Hyland, Philip; McGinty, Gráinne; Karatzias, Thanos; Murphy, Jamie; Vallières, Frédérique; McHugh Power, Joanna

    2018-06-05

    Loneliness is a common psychological experience affecting a significant minority of the general population. Loneliness may in part be related to the existence of dysfunctional cognitive evaluations. To date, however, loneliness has yet to be explicitly assessed within a cognitive-behavioural theoretical framework. The current study sought to determine the association between negative cognitions, within the context of Rational Emotive Behaviour Therapy (REBT), and the experience of loneliness. A multinational sample of university students (n = 397) completed self-report assessments of rational and irrational beliefs, and loneliness. Structural equation modelling results found that the REBT model of psychopathology, and the REBT model of psychological health, provided satisfactory representations of loneliness, explaining 36% and 23% of variance in loneliness, respectively. Several dysfunctional ("Demandingness", "Catastrophising" and "Self-Downing" beliefs) and functional ("Preferences" and "Self-Acceptance" beliefs) cognitions were directly and indirectly associated with loneliness. These results highlight that cognitions and loneliness are meaningfully related, and indicate that cognitive-behavioural models may be useful in understanding loneliness. More specifically, current results suggest that REBT may offer a viable psychotherapeutic approach to treating loneliness.

  10. Population modeling for pesticide risk assessment of threatened species-A case study of a terrestrial plant, Boltonia decurrens.

    PubMed

    Schmolke, Amelie; Brain, Richard; Thorbek, Pernille; Perkins, Daniel; Forbes, Valery

    2017-02-01

    Although population models are recognized as necessary tools in the ecological risk assessment of pesticides, particularly for species listed under the Endangered Species Act, their application in this context is currently limited to very few cases. The authors developed a detailed, individual-based population model for a threatened plant species, the decurrent false aster (Boltonia decurrens), for application in pesticide risk assessment. Floods and competition with other plant species are known factors that drive the species' population dynamics and were included in the model approach. The authors use the model to compare the population-level effects of 5 toxicity surrogates applied to B. decurrens under varying environmental conditions. The model results suggest that the environmental conditions under which herbicide applications occur may have a higher impact on populations than organism-level sensitivities to an herbicide within a realistic range. Indirect effects may be as important as the direct effects of herbicide applications by shifting competition strength if competing species have different sensitivities to the herbicide. The model approach provides a case study for population-level risk assessments of listed species. Population-level effects of herbicides can be assessed in a realistic and species-specific context, and uncertainties can be addressed explicitly. The authors discuss how their approach can inform the future development and application of modeling for population-level risk assessments of listed species, and ecological risk assessment in general. Environ Toxicol Chem 2017;36:480-491. © 2016 SETAC.

  11. Principle considerations for the risk assessment of sprayed consumer products.

    PubMed

    Steiling, W; Bascompta, M; Carthew, P; Catalano, G; Corea, N; D'Haese, A; Jackson, P; Kromidas, L; Meurice, P; Rothe, H; Singal, M

    2014-05-16

    In recent years, the official regulation of chemicals and chemical products has been intensified. For spray products specifically, enhanced requirements to assess consumers' and professionals' exposure to this product type have been introduced. In this regard, the Aerosol Dispensers Directive (75/324/EEC), with obligations for the marketing of aerosol dispensers, and the Cosmetic Products Regulation (1223/2009/EC), which mandates a safety assessment, have to be mentioned. Both enactments, similar to the REACH regulation (1907/2006/EC), require a robust chemical safety assessment. From such an assessment, appropriate risk management measures may be identified to adequately control the risk of these chemicals/products to human health and the environment when used. Currently, the above-mentioned regulations lack guidance on which data are needed for preparing a proper hazard analysis and safety assessment of spray products. Mandatory in the process of inhalation risk and safety assessment is the determination and quantification of the actual exposure to the spray product and, more specifically, its ingredients. In this respect, the current article, prepared by the European Aerosol Federation (FEA, Brussels) task force "Inhalation Toxicology", introduces toxicological principles and the state of the art in currently available exposure models adapted for typical application scenarios. This review of current methodologies is intended to guide safety assessors to better estimate inhalation exposure by using the most relevant data. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  12. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langton, C.; Kosson, D.

    2009-11-30

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. Better understanding the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production, and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials; (2) methodologies for modeling the performance of these mechanisms and processes; and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for identification, research, development, and demonstration of improvements in conceptual understanding, measurements, and performance modeling that would lead to significant reductions in uncertainty and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers. The various chapters contain both a description of each mechanism or process and a discussion of the current approaches to modeling the phenomena.

  13. Suicide risk factors for young adults: testing a model across ethnicities.

    PubMed

    Gutierrez, P M; Rodriguez, P J; Garcia, P

    2001-06-01

    A general path model based on existing suicide risk research was developed to test factors contributing to current suicidal ideation in young adults. A sample of 673 undergraduate students completed a packet of questionnaires containing the Beck Depression Inventory, Adult Suicidal Ideation Questionnaire, and Multi-Attitude Suicide Tendency Scale. They also provided information on history of suicidality and exposure to attempted and completed suicide in others. Structural equation modeling was used to test the fit of the data to the hypothesized model. Goodness-of-fit indices were adequate and supported the interactive effects of exposure, repulsion by life, depression, and history of self-harm on current ideation. Model fit for three subgroups based on race/ethnicity (i.e., White, Black, and Hispanic) determined that repulsion by life and depression function differently across groups. Implications of these findings for current methods of suicide risk assessment and future research are discussed in the context of the importance of culture.

  14. Dynamics of aircraft antiskid braking systems. [conducted at the Langley aircraft landing loads and traction facility

    NASA Technical Reports Server (NTRS)

    Tanner, J. A.; Stubbs, S. M.; Dreher, R. C.; Smith, E. G.

    1982-01-01

    A computer study was performed to assess the accuracy of three brake pressure-torque mathematical models. The investigation utilized one main gear wheel, brake, and tire assembly of a McDonnell Douglas DC-9 series 10 airplane. The investigation indicates that the performance of aircraft antiskid braking systems is strongly influenced by tire characteristics, dynamic response of the antiskid control valve, and pressure-torque response of the brake. The computer study employed an average torque error criterion to assess the accuracy of the models. The results indicate that a variable nonlinear spring with hysteresis memory function models the pressure-torque response of the brake more accurately than currently used models.

  15. Development and application of a 3-D geometry/mass model for LDEF satellite ionizing radiation assessments

    NASA Technical Reports Server (NTRS)

    Colborn, B. L.; Armstrong, T. W.

    1992-01-01

    A computer model of the three-dimensional geometry and material distributions for the LDEF spacecraft, experiment trays, and, for selected trays, the components of experiments within a tray was developed for use in ionizing radiation assessments. The model is being applied to provide 3-D shielding distributions around radiation dosimeters to aid in data interpretation, particularly in assessing the directional properties of the radiation exposure. Also, the model has been interfaced with radiation transport codes for 3-D dosimetry response predictions and for calculations related to determining the accuracy of trapped proton and cosmic ray environment models. The methodology used in developing the 3-D LDEF model is described, along with the level of detail incorporated. Currently, the trays modeled in detail are F2, F8, H3, and H12. Applications of the model discussed include the 3-D shielding distributions around various dosimeters, the influence of shielding on dosimetry responses, and comparisons of dose predictions based on the present 3-D model with those from the 1-D geometry model approximations used in initial estimates.

  16. Projected Changes to Streamflow Characteristics in Quebec Basins as Simulated by the Canadian Regional Climate Model (CRCM4)

    NASA Astrophysics Data System (ADS)

    Huziy, O.; Sushama, L.; Khaliq, M.; Lehner, B.; Laprise, R.; Roy, R.

    2011-12-01

    According to the Intergovernmental Panel on Climate Change (IPCC, 2007), an intensification of the global hydrological cycle and an increase in precipitation for some regions around the world, including the northern mid- to high latitudes, are expected in the future climate. This will have an impact on mean and extreme flow characteristics, which needs to be assessed for better development of adaptation strategies. Mean and extreme streamflow characteristics for Quebec (north-eastern Canada) basins in the current climate, and their projected changes in the future climate, are assessed using a 10-member ensemble of current (1970-1999) and future (2041-2070) Canadian RCM (CRCM4) simulations. Validation of streamflow characteristics, performed by comparing modeled values with observations available from the Centre d'expertise hydrique du Quebec (CEHQ), shows that the model captures the high flows reasonably well. Results suggest increases in the mean and in the 10-year return levels of 1-day high flows, which appear significant for most of the northern basins.
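
    Return levels such as the 10-year 1-day high flow mentioned above are commonly estimated by fitting an extreme-value distribution to annual maxima. A generic Gumbel method-of-moments sketch with synthetic data (this is a standard textbook technique, not the CRCM4 study's actual procedure):

```python
import numpy as np

# Hedged sketch: 10-year return level from annual 1-day maximum flows via a
# Gumbel fit by the method of moments. Data and parameters are synthetic.
rng = np.random.default_rng(2)
annual_max = 500 + 80 * rng.gumbel(size=30)  # 30 years of annual maxima (m^3/s)

mean, std = annual_max.mean(), annual_max.std(ddof=1)
beta = std * np.sqrt(6) / np.pi              # Gumbel scale parameter
mu = mean - 0.5772 * beta                    # location (Euler-Mascheroni const.)

T = 10                                       # return period in years
return_level = mu - beta * np.log(-np.log(1 - 1 / T))
```

    The T-year return level is the flow exceeded with probability 1/T in any given year; comparing return levels between the current and future simulation ensembles gives the projected change.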

  17. A modeling study of the radar signatures of rip currents with comparisons to data

    NASA Astrophysics Data System (ADS)

    O'Dea, A.; Haller, M. C.

    2016-12-01

    Rip currents are important components of nearshore circulation systems and can pose serious dangers to swimmers. In recent years, X-band imaging radar has been shown to be an effective remote sensor of rip currents over large spatial scales, for long durations, and with high temporal resolution. In contrast to remote sensing methods that infer rip location through the identification of morphological features (i.e. rip channels), rip detection in radar arises directly from the backscatter characteristics of the rip current flow field, thus offering the potential of direct extraction of quantitative information on rip current hydrodynamics. In this study, we present a model for the radar imaging of rip currents based on the wave action balance equation and the changes to the wind-wave spectrum at Bragg (capillary) wavelengths induced by the underlying rip current field. Model results are compared to field data (both in situ and remote sensing) from a 10-day experiment at Duck, NC conducted in September 2010. The model/data comparisons are then used to assess the physical mechanisms contributing to the radar imaging of rip currents including the role of rip current strength, wind speed, wind direction, and very short-scale wave breaking in rip current imaging. Following the methodology of Rascle et al. (J. Phys. Oceanography, 2014), the radar imaging model uses a relaxation approach that models perturbations to the equilibrium wave action spectrum induced by gradients in the underlying current field (specifically, the divergence and strain components of the deformation tensor). From the perturbed wind-wave spectrum, changes in the mean square slope (MSS) are then calculated and taken as a proxy for the change in radar backscatter intensity due to rip currents. Model simulations of rip current velocity fields for the field experiments were developed previously by Wilson et al. (J. Geophys. Res., 2014) using ROMS. 
The modeled velocities are used as input into the backscatter model and the predicted changes in MSS are compared with the radar observations. Modeled changes in MSS are shown to compare well with the observed occurrence and spatial scales of the rips, including their oblique orientation and their offshore extent. Remaining questions include the effect of wind direction and fetch on the imaging of rips.

  18. Status of LDEF radiation modeling

    NASA Technical Reports Server (NTRS)

    Watts, John W.; Armstrong, T. W.; Colborn, B. L.

    1995-01-01

    The current status of model predictions and comparisons with LDEF radiation dosimetry measurements is summarized, with emphasis on major results obtained in evaluating the uncertainties of present radiation environment models. The consistency of results and conclusions obtained from model comparisons with different sets of LDEF radiation data (dose, activation, fluence, LET spectra) is discussed. Examples are given where LDEF radiation data and modeling results can be utilized to provide improved radiation assessments for planned LEO missions (e.g., Space Station).

  19. GRACE gravity model: assessment in terms of deep ocean currents from hydrography and from the ECCO ocean model

    NASA Technical Reports Server (NTRS)

    Zlotnicki, V.; Stammer, D.; Fukumori, I.

    2003-01-01

    Here we assess the new generation of gravity models derived from GRACE data. The differences between a global geoid model (one from GRACE data, the other the well-known EGM-96) and a Mean Sea Surface derived from over a decade of altimetric data are compared to hydrographic data from the Levitus compilation and to the ECCO numerical ocean model, which assimilates altimetry and other data.

  20. Genetically modified plants and food hypersensitivity diseases: usage and implications of experimental models for risk assessment.

    PubMed

    Prescott, Vanessa E; Hogan, Simon P

    2006-08-01

    The recent advances in biotechnology in the plant industry have led to increasing crop production and yield, which in turn has increased the usage of genetically modified (GM) food in the human food chain. The usage of GM foods for human consumption has raised a number of fundamental questions, including the ability of GM foods to elicit potentially harmful immunological responses, including allergic hypersensitivity. To assess the safety of foods derived from GM plants, including their allergenic potential, the US FDA, the Food and Agriculture Organization of the United Nations (FAO)/World Health Organization (WHO), and the EU have developed assessment approaches. One assessment approach that has been a very active area of research and debate is the development and usage of animal models to assess the potential allergenicity of GM foods. A number of specific animal models employing rodents, pigs, and dogs have been developed for allergenicity assessment. However, validation of these models is needed, and consideration of the criteria for an appropriate animal model for the assessment of allergenicity in GM plants is required. We have recently employed a BALB/c mouse model to assess the potential allergenicity of GM plants. We have been able to demonstrate that this model can detect differences in antigenicity and identify aspects of protein post-translational modifications that can alter antigenicity. Furthermore, this model has also enabled us to examine the usage of GM plants as a therapeutic approach for the treatment of allergic diseases. This review discusses the current approaches to assessing the allergenic potential of GM food, focusing particularly on the usage of animal models to determine the potential allergenicity of GM foods, and gives an overview of our recent findings and the implications of these studies.

  1. Modelling NO2 concentrations at the street level in the GAINS integrated assessment model: projections under current legislation

    NASA Astrophysics Data System (ADS)

    Kiesewetter, G.; Borken-Kleefeld, J.; Schöpp, W.; Heyes, C.; Thunis, P.; Bessagnet, B.; Gsella, A.; Amann, M.

    2013-08-01

    NO2 concentrations at the street level are a major concern for urban air quality in Europe and have been regulated under the EU Thematic Strategy on Air Pollution. Despite the legal requirements, limit values are exceeded at many monitoring stations with little or no improvement during recent years. In order to assess the effects of future emission control regulations on roadside NO2 concentrations, a downscaling module has been implemented in the GAINS integrated assessment model. The module follows a hybrid approach based on atmospheric dispersion calculations and observations from the AirBase European air quality database that are used to estimate site-specific parameters. Pollutant concentrations at every monitoring site with sufficient data coverage are disaggregated into contributions from regional background, urban increment, and local roadside increment. The future evolution of each contribution is assessed with a model of the appropriate scale: a 28 × 28 km grid based on the EMEP Model for the regional background, a 7 × 7 km urban increment based on the CHIMERE Chemistry Transport Model, and a chemical box model for the roadside increment. Thus, different emission scenarios and control options for long-range transport, regional and local emissions can be analysed. Observed concentrations and historical trends are well captured, in particular the differing NO2 and total NOx = NO + NO2 trends. Altogether, more than 1950 air quality monitoring stations in the EU are covered by the model, including more than 400 traffic stations and 70% of the critical stations. Together with its well-established bottom-up emission and dispersion calculation scheme, GAINS is thus able to bridge the scales from European-wide policies to impacts in street canyons. As an application of the model, we assess the evolution of attainment of NO2 limit values under current legislation until 2030. 
Strong improvements are expected with the introduction of the Euro 6 emission standard for light duty vehicles; however, for some major European cities, further measures may be required, in particular if aiming to achieve compliance at an earlier time.

  2. Modelling NO2 concentrations at the street level in the GAINS integrated assessment model: projections under current legislation

    NASA Astrophysics Data System (ADS)

    Kiesewetter, G.; Borken-Kleefeld, J.; Schöpp, W.; Heyes, C.; Thunis, P.; Bessagnet, B.; Terrenoire, E.; Gsella, A.; Amann, M.

    2014-01-01

    NO2 concentrations at the street level are a major concern for urban air quality in Europe and have been regulated under the EU Thematic Strategy on Air Pollution. Despite the legal requirements, limit values are exceeded at many monitoring stations with little or no improvement in recent years. In order to assess the effects of future emission control regulations on roadside NO2 concentrations, a downscaling module has been implemented in the GAINS integrated assessment model. The module follows a hybrid approach based on atmospheric dispersion calculations and observations from the AirBase European air quality database that are used to estimate site-specific parameters. Pollutant concentrations at every monitoring site with sufficient data coverage are disaggregated into contributions from regional background, urban increment, and local roadside increment. The future evolution of each contribution is assessed with a model of the appropriate scale: 28 × 28 km grid based on the EMEP Model for the regional background, 7 × 7 km urban increment based on the CHIMERE Chemistry Transport Model, and a chemical box model for the roadside increment. Thus, different emission scenarios and control options for long-range transport as well as regional and local emissions can be analysed. Observed concentrations and historical trends are well captured, in particular the differing NO2 and total NOx = NO + NO2 trends. Altogether, more than 1950 air quality monitoring stations in the EU are covered by the model, including more than 400 traffic stations and 70% of the critical stations. Together with its well-established bottom-up emission and dispersion calculation scheme, GAINS is thus able to bridge the scales from European-wide policies to impacts in street canyons. As an application of the model, we assess the evolution of attainment of NO2 limit values under current legislation until 2030. 
Strong improvements are expected with the introduction of the Euro 6 emission standard for light duty vehicles; however, for some major European cities, further measures may be required, in particular if aiming to achieve compliance at an earlier time.

  3. Use of Influenza Risk Assessment Tool for Prepandemic Preparedness

    PubMed Central

    Trock, Susan C.

    2018-01-01

    In 2010, the Centers for Disease Control and Prevention began to develop an Influenza Risk Assessment Tool (IRAT) to methodically capture and assess information relating to influenza A viruses not currently circulating among humans. The IRAT uses a multiattribute, additive model to generate a summary risk score for each virus. Although the IRAT is not intended to predict the next pandemic influenza A virus, it has provided input into prepandemic preparedness decisions. PMID:29460739
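
The multiattribute, additive model described above can be illustrated with a minimal sketch. The attribute names, scores, and weights below are hypothetical placeholders, not the CDC's actual IRAT risk elements or weightings:

```python
# Illustrative sketch of a multiattribute additive risk score, in the
# spirit of the IRAT; attributes, scores, and weights are hypothetical.

def additive_risk_score(scores, weights):
    """Combine per-attribute risk scores (1-10) into a weighted sum."""
    assert scores.keys() == weights.keys()
    return sum(weights[k] * scores[k] for k in scores)

virus_scores = {              # expert-elicited scores, 1 (low) to 10 (high)
    "human_infections": 7,
    "transmission_in_animals": 6,
    "population_immunity": 8,
}
attribute_weights = {         # relative importance of each attribute
    "human_infections": 0.5,
    "transmission_in_animals": 0.3,
    "population_immunity": 0.2,
}

score = additive_risk_score(virus_scores, attribute_weights)  # ≈ 6.9
```

Because the model is additive, each attribute's contribution to the summary score can be inspected separately, which is part of what makes such tools useful for structured comparison across viruses.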

  4. Probing the Relative Importance of Different Attributes in L2 Reading and Listening Comprehension Items: An Application of Cognitive Diagnostic Models

    ERIC Educational Resources Information Center

    Yi, Yeon-Sook

    2017-01-01

    The present study examines the relative importance of attributes within and across items by applying four cognitive diagnostic assessment models. The current study utilizes the function of the models that can indicate inter-attribute relationships that reflect the response behaviors of examinees to analyze scored test-taker responses to four forms…

  5. Using the Many-Facet Rasch Model to Evaluate Standard-Setting Judgments: Setting Performance Standards for Advanced Placement® Examinations

    ERIC Educational Resources Information Center

    Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary

    2012-01-01

    The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…

  6. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events and conduct statistical analysis. The estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. 
An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.

  7. Assessing Hydrologic Impacts of Land Configuration Changes Using an Integrated Hydrologic Model at the Rocky Flats Environmental Technology Site, Colorado

    NASA Astrophysics Data System (ADS)

    Prucha, R. H.; Dayton, C. S.; Hawley, C. M.

    2002-12-01

    The Rocky Flats Environmental Technology Site (RFETS) in Golden, Colorado, a former Department of Energy nuclear weapons manufacturing facility, is currently undergoing closure. The natural semi-arid interaction between surface and subsurface flow at RFETS is complex and further complicated by industrial modifications to the flow system. Using a substantial site data set, a distributed-parameter, fully-integrated hydrologic model was developed to assess the hydrologic impact of different hypothetical site closure configurations on the current flow system and to better understand the integrated hydrologic behavior of the system. An integrated model with this level of detail has not been previously developed in a semi-arid area, and a unique, but comprehensive, approach was required to calibrate and validate the model. Several hypothetical scenarios were developed to simulate hydrologic effects of modifying different aspects of the site. For example, some of the simulated modifications included regrading the current land surface, changing the existing surface channel network, removing subsurface trenches and gravity drain flow systems, installing a slurry wall and geotechnical cover, changing the current vegetative cover, and converting existing buildings and pavement to permeable soil areas. The integrated flow model was developed using a rigorous physically-based code so that realistic design parameters can simulate these changes. This code also permitted evaluation of changes to complex integrated hydrologic system responses that included channelized and overland flow, pond levels, unsaturated zone storage, groundwater heads and flow directions, and integrated water balances for key areas. Results generally show that channel flow offsite decreases substantially for different scenarios, while groundwater heads generally increase within the reconfigured industrial area; most of this additional water is then discharged as evapotranspiration. 
These changes have significant implications to site closure and operation.

  8. Assessing Cognitive and Affective Empathy Through the Interpersonal Reactivity Index: An Argument Against a Two-Factor Model.

    PubMed

    Chrysikou, Evangelia G; Thompson, W Jake

    2016-12-01

    One aspect of higher order social cognition is empathy, a psychological construct comprising a cognitive (recognizing emotions) and an affective (responding to emotions) component. The complex nature of empathy complicates the accurate measurement of these components. The most widely used measure of empathy is the Interpersonal Reactivity Index (IRI). However, the factor structure of the IRI as it is predominantly used in the psychological literature differs from Davis's original four-factor model in that it arbitrarily combines the subscales to form two factors: cognitive and affective empathy. This two-factor model of the IRI, although popular, has yet to be examined for psychometric support. In the current study, we examine, for the first time, the validity of this alternative model. A confirmatory factor analysis showed poor model fit for this two-factor structure. Additional analyses offered support for the original four-factor model, as well as a hierarchical model for the scale. In line with previous findings, females scored higher on the IRI than males. Our findings indicate that the IRI, as it is currently used in the literature, does not accurately measure cognitive and affective empathy and highlight the advantages of using the original four-factor structure of the scale for empathy assessments. © The Author(s) 2015.

  9. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  10. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed Central

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Need to Assess the Skill of Ecosystem Models: Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for only a limited set of biophysical models. A range of skill assessment methods has been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. Northeast US Atlantis Marine Ecosystem Model: We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. Skill Assessment Is Both Possible and Advisable: We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is not only possible to assess the skill of a complicated marine ecosystem model, but necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment). PMID:26731540
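
Two of the skill metrics named above, root mean squared error and modeling efficiency, can be sketched on toy data. The observation and forecast series below are illustrative stand-ins, not NEUS Atlantis output:

```python
# Sketch of two common model-skill metrics; the series are hypothetical.
import numpy as np

def rmse(obs, pred):
    """Root mean squared error between observed and predicted series."""
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def modeling_efficiency(obs, pred):
    """Nash-Sutcliffe-style efficiency: 1 = perfect, 0 = no better than
    predicting the observed mean, negative = worse than the mean."""
    return float(1 - np.sum((obs - pred) ** 2)
                 / np.sum((obs - obs.mean()) ** 2))

obs = np.array([10.0, 12.0, 9.0, 14.0, 11.0])    # e.g. observed biomass
pred = np.array([11.0, 12.5, 8.0, 13.0, 11.5])   # e.g. forecast values

error = rmse(obs, pred)
skill = modeling_efficiency(obs, pred)
```

Modeling efficiency has the useful property of a fixed reference point: a forecast with efficiency near zero carries no more information than the long-term observed mean.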

  11. Assessing performance of gravity models in the Arctic and the implications for polar oceanography

    NASA Astrophysics Data System (ADS)

    Thomas, S. F.; McAdoo, D. C.; Farrell, S. L.; Brozena, J. M.; Childers, V. A.; Ziebart, M. K.; Shepherd, A.

    2014-12-01

    The circulation of the Arctic Ocean is of great interest to both the oceanographic and cryospheric communities. Understanding both the steady state and variations of this circulation is essential to building our knowledge of Arctic climate. With the advent of high inclination altimeter missions such as CryoSat and ICESat, it is now feasible to produce Mean Dynamic Topography (MDT) products for the region, which allow a comprehensive investigation of geostrophic currents. However, the accuracy of these products is largely limited by our knowledge of the marine geoid in the Arctic. There are a number of publicly available gravity models commonly used to derive the geoid. These use different combinations of available data (satellite gravimetry, altimetry, laser ranging, and in-situ) and are calculated using different mathematical techniques. However, the effect of these differences on the real-world performance of these models when used for oceanographic studies in the Arctic is not well known. Given the unique problems for gravimetry in the region (especially data gaps) and their potential impact on MDT products, it is especially important that the relative performance of these models be assessed. We consider the needs of the "end user" satellite oceanographer in the Arctic with respect to gravimetry, and the relationship between the precision of gravity data and the accuracy of a final MDT/current velocity product. Using high-precision aerogravity data collected over 3 years of campaigns by NASA's Operation IceBridge, we inter-compare 10 of the leading gravity models and assess their performance in the Arctic. We also use historical data from campaigns flown by the US Naval Research Laboratory (NRL) to demonstrate the impact of gravity errors on MDT products. We describe how gravity models for the region might be improved in the future, in an effort to maximize the level at which Arctic currents may be resolved.

  12. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing, which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning, we define two algorithms (repeated grid-search cross-validation and double cross-validation) and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
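
The repeated and nested cross-validation schemes described above can be sketched with scikit-learn. The dataset, estimator, and parameter grid below are illustrative stand-ins, not the paper's QSAR setup:

```python
# Sketch of repeated grid-search V-fold cross-validation (parameter tuning)
# nested inside repeated cross-validation (model assessment), assuming
# scikit-learn; dataset, estimator, and grid are illustrative.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

# Inner loop: repeated grid search tunes the regularisation strength.
inner_cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=1)
search = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]}, cv=inner_cv)

# Outer loop: repeated CV estimates the prediction error of the whole
# selection procedure, not just of one chosen model.
outer_cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=2)
scores = cross_val_score(search, X, y, cv=outer_cv, scoring="r2")

print(scores.mean(), scores.std())  # spread across repeats reflects split-induced variance
```

Repeating both loops with different random splits is precisely what exposes the split-induced variance the abstract warns about; reporting only a single V-fold estimate hides it.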

  13. Regulatory assessment of chemical mixtures: Requirements, current approaches and future perspectives.

    PubMed

    Kienzler, Aude; Bopp, Stephanie K; van der Linden, Sander; Berggren, Elisabet; Worth, Andrew

    2016-10-01

    This paper reviews regulatory requirements and recent case studies to illustrate how the risk assessment (RA) of chemical mixtures is conducted, considering both the effects on human health and on the environment. A broad range of chemicals, regulations and RA methodologies are covered, in order to identify mixtures of concern, gaps in the regulatory framework, data needs, and further work to be carried out. The current and potential future use of novel tools (Adverse Outcome Pathways, in silico tools, toxicokinetic modelling, etc.) in the RA of combined effects is also reviewed. The assumptions made in the RA, predictive model specifications and the choice of toxic reference values can greatly influence the assessment outcome, and should therefore be specifically justified. Novel tools could support mixture RA mainly by providing a better understanding of the underlying mechanisms of combined effects. Nevertheless, their use is currently limited because of a lack of guidance, data, and expertise. More guidance is needed to facilitate their application. As far as the authors are aware, no prospective RA concerning chemicals related to various regulatory sectors has been performed to date, even though numerous chemicals are registered under several regulatory frameworks. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Best practices for evaluating the capability of nondestructive evaluation (NDE) and structural health monitoring (SHM) techniques for damage characterization

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.

    2016-02-01

    A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex with respect to current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. The use of a model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing and vibration-based SHM case studies. The results of these studies highlight the general protocol feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantify the role of varying SHM sensor durability and environmental conditions on characterization performance.

  15. Defending Against Advanced Persistent Threats Using Game-Theory.

    PubMed

    Rass, Stefan; König, Sandra; Schauer, Stefan

    2017-01-01

    Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APT turns them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker's incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for an advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system's protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game theoretic models, whose technical solution presented in this work may be of independent interest.
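
The paper's distribution-valued games generalize classical matrix games; as a simpler reference point, a defender's security strategy in an ordinary zero-sum matrix game can be computed by linear programming. The 2x2 payoff matrix below is hypothetical, and SciPy is assumed:

```python
# Sketch: defender's security strategy in a classical zero-sum matrix game
# via linear programming, assuming SciPy. The paper's distribution-valued
# games generalize this setting; the payoff matrix here is hypothetical.
import numpy as np
from scipy.optimize import linprog

# payoff[i, j] = defender's loss when defender plays i, attacker plays j
payoff = np.array([[3.0, 1.0],
                   [0.0, 2.0]])
m, n = payoff.shape

# Variables: mixed strategy x (length m) and game value v.
# Minimize v subject to: expected loss against every attacker column <= v.
c = np.concatenate([np.zeros(m), [1.0]])              # objective: minimize v
A_ub = np.hstack([payoff.T, -np.ones((n, 1))])        # A^T x - v <= 0
b_ub = np.zeros(n)
A_eq = np.concatenate([np.ones(m), [0.0]])[None, :]   # probabilities sum to 1
b_eq = [1.0]
bounds = [(0, None)] * m + [(None, None)]             # x >= 0, v free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[m]
```

For this matrix the defender mixes both actions equally and guarantees an expected loss of at most the game value v = 1.5, regardless of which attack the adversary chooses.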

  16. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
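    For a single binary evidential layer (say, proximity to roads), the weights-of-evidence calculation reduces to log-ratios of conditional probabilities. The counts below are invented for illustration, not taken from the San Diego data:

```python
import math

def weights_of_evidence(n11, n10, n01, n00):
    """Positive weight W+, negative weight W-, and contrast C for one
    evidential layer. n11: ignition cells with the factor present,
    n10: ignition cells without it; n01/n00: the same counts for
    non-ignition cells."""
    w_plus = math.log((n11 / (n11 + n10)) / (n01 / (n01 + n00)))
    w_minus = math.log((n10 / (n11 + n10)) / (n00 / (n01 + n00)))
    return w_plus, w_minus, w_plus - w_minus   # contrast C measures association

wp, wm, c = weights_of_evidence(30, 10, 20, 40)   # factor associated with ignition
```

    A positive contrast indicates the layer is associated with ignitions; summing weights over layers (under conditional independence) gives the posterior log-odds of ignition for each cell.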

  17. Forecast of drifter trajectories using a Rapid Environmental Assessment based on CTD observations

    NASA Astrophysics Data System (ADS)

    Sorgente, R.; Tedesco, C.; Pessini, F.; De Dominicis, M.; Gerin, R.; Olita, A.; Fazioli, L.; Di Maio, A.; Ribotti, A.

    2016-11-01

    A high-resolution, submesoscale-resolving ocean model was implemented in a limited area north of the Island of Elba, where a maritime exercise named Serious Game 1 (SG1) took place in May 2014 in the framework of the project MEDESS-4MS (Mediterranean Decision Support System for Marine Safety). During the exercise, CTD data were collected in response to the need for a Rapid Environmental Assessment, i.e. a rapid evaluation of marine conditions able to provide useful information for the initialisation of modelling tools in scenarios of possible maritime accidents. The aim of this paper is to evaluate the impact of such mesoscale-resolving CTD observations on short-term forecasts of the surface currents, within the framework of possible oil-spill emergencies. To this end, modelling outputs were compared with Lagrangian observations at sea: the high-resolution modelled currents, together with those of the coarser sub-regional model WMED, were used to force the MEDSLIK-II oil-spill model to simulate drifter trajectories. Both ocean models were assessed by comparing their prognostic scalar and vector fields against an independent CTD data set and against real drifter trajectories acquired during SG1. The diagnosed and prognosed circulation reveals that the area was characterised by water masses of Atlantic origin influenced by small mesoscale cyclonic and anticyclonic eddies, which govern the spatial and temporal evolution of the drifter trajectories and of the water mass distribution. Assimilating CTD data into the initial conditions of the high-resolution model greatly improves the accuracy of the short-term forecast in terms of the location and structure of the thermocline and positively influences the model's ability to reproduce the observed paths of the surface drifters.
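    At the core of such drifter-trajectory simulations is a Lagrangian advection step through the model current field. A minimal midpoint (RK2) sketch is shown below; the real MEDSLIK-II model adds diffusion, wind drag and oil transformation processes, and `velocity` here is a hypothetical interpolator, not part of any of the models named above:

```python
import numpy as np

def rk2_step(pos, velocity, dt):
    """Advance a drifter one time step with the midpoint (RK2) scheme.
    `velocity(p)` returns the (u, v) current at position p."""
    p = np.asarray(pos, dtype=float)
    k1 = np.asarray(velocity(p))
    k2 = np.asarray(velocity(p + 0.5 * dt * k1))   # velocity at the midpoint
    return p + dt * k2

# Usage with a uniform 0.5 m/s eastward current over one hour:
uniform = lambda p: (0.5, 0.0)
new_pos = rk2_step([0.0, 0.0], uniform, dt=3600.0)   # -> [1800., 0.]
```

    Repeating this step with velocities interpolated from each ocean model's forecast yields the simulated trajectories that are compared with the observed drifters.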

  18. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1992-01-01

    Data from the ionizing radiation dosimetry aboard LDEF provide a unique opportunity for assessing the accuracy of current space radiation models and for identifying needed improvements for future mission applications. Details are given of the LDEF data available for radiation model evaluations. The status of model comparisons with LDEF data is given, along with future directions of planned modeling efforts and data comparison assessments. The modeling methodology being used to help ensure that the LDEF ionizing radiation results can address ionizing radiation issues for future missions is outlined. In general, the LDEF radiation modeling has emphasized quick-look predictions using simplified methods to make comparisons with absorbed dose measurements and measurements of emissions from induced radioactivity. Modeling and LDEF data comparisons related to linear energy transfer (LET) spectra are important for several reasons, which are outlined. The planned modeling and LDEF data comparisons for LET spectra are discussed, including components of the LET spectra due to different environment sources, contributions from different production mechanisms, and spectra in plastic detectors versus silicon.

  19. Scientific white paper on concentration-QTc modeling.

    PubMed

    Garnett, Christine; Bonate, Peter L; Dang, Qianyu; Ferber, Georg; Huang, Dalong; Liu, Jiang; Mehrotra, Devan; Riley, Steve; Sager, Philip; Tornoe, Christoffer; Wang, Yaning

    2018-06-01

    The International Council for Harmonisation revised the E14 guideline through the questions and answers process to allow concentration-QTc (C-QTc) modeling to be used as the primary analysis for assessing the QTc interval prolongation risk of new drugs. A well-designed and conducted QTc assessment based on C-QTc modeling in early phase 1 studies can be an alternative approach to a thorough QT study for some drugs to reliably exclude clinically relevant QTc effects. This white paper provides recommendations on how to plan and conduct a definitive QTc assessment of a drug using C-QTc modeling in early phase clinical pharmacology and thorough QT studies. Topics included are: important study design features in a phase 1 study; modeling objectives and approach; exploratory plots; the pre-specified linear mixed effects model; general principles for model development and evaluation; and expectations for modeling analysis plans and reports. The recommendations are based on current best modeling practices, scientific literature and personal experiences of the authors. These recommendations are expected to evolve as their implementation during drug development provides additional data and with advances in analytical methodology.
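    The heart of the pre-specified analysis is a linear model of baseline-corrected QTc against drug concentration. The ordinary-least-squares sketch below ignores the random effects and covariates that the full linear mixed-effects model includes, and the data are invented for illustration:

```python
import numpy as np

def cqtc_fit(conc, dqtc):
    """Fit dQTc = intercept + slope * conc by ordinary least squares,
    a fixed-effects simplification of the pre-specified mixed model."""
    slope, intercept = np.polyfit(conc, dqtc, 1)   # highest degree first
    return intercept, slope

# Hypothetical data: concentration (ng/mL) vs baseline-corrected QTc (ms)
conc = np.array([0.0, 1.0, 2.0, 3.0])
dqtc = np.array([1.0, 3.0, 5.0, 7.0])
intercept, slope = cqtc_fit(conc, dqtc)   # slope = 2 ms per ng/mL
```

    In practice, the fitted slope and intercept are used to predict the mean QTc effect (with a confidence interval) at the observed high clinical exposure, which is what the E14 Q&A pathway asks the analysis to exclude or confirm.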

  20. Posttraumatic stress mediates the relationship between childhood victimization and current mental health burden in newly incarcerated adults.

    PubMed

    Greene, Carolyn A; Ford, Julian D; Wakefield, Dorothy B; Barry, Lisa C

    2014-10-01

    The purpose of this study was to evaluate the interrelationship among childhood abuse and traumatic loss, posttraumatic stress symptoms (PTSS), and Axis I psychiatric disorders other than PTSD among newly incarcerated adults, and to test a proposed model in which the severity of PTSS mediates the relationship between childhood abuse/loss and adult psychiatric disorders. Four hundred sixty-five male and female inmates participated in a structured clinical research interview. Four types of interpersonal potentially traumatic experiences (physical abuse, sexual abuse, emotional abuse, and traumatic loss) were assessed for occurrence prior to the age of 18 years. Current psychiatric disorders and PTSS were also assessed by structured interview. Negative binomial regression was used to evaluate the association between the cumulative number of types of childhood abuse/loss experienced and the number of current Axis I disorders, and to test the mediation model. Approximately half of the sample (51%) experienced 1 or more types of childhood abuse/loss, and 30% of the sample had at least one psychiatric disorder other than PTSD. For both men and women, childhood physical abuse and childhood sexual abuse were independently associated with psychiatric morbidity, and an increasing number of types of childhood trauma experienced was associated with an increase in the number of current Axis I diagnoses. However, these associations were no longer statistically significant when severity of PTSS was added to the model, providing support for the proposed mediation model. Implications for secondary prevention services for at-risk inmates are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Spatially-Explicit Simulation Modeling of Ecological Response to Climate Change: Methodological Considerations in Predicting Shifting Population Dynamics of Infectious Disease Vectors.

    PubMed

    Dhingra, Radhika; Jimenez, Violeta; Chang, Howard H; Gambhir, Manoj; Fu, Joshua S; Liu, Yang; Remais, Justin V

    2013-09-01

    Poikilothermic disease vectors can respond to altered climates through spatial changes in both population size and phenology. Quantitative descriptors to characterize, analyze and visualize these dynamic responses are lacking, particularly across large spatial domains. In order to demonstrate the value of a spatially explicit, dynamic modeling approach, we assessed spatial changes in the population dynamics of Ixodes scapularis, the Lyme disease vector, using a temperature-forced population model simulated across a grid of 4 × 4 km cells covering the eastern United States, using both modeled (Weather Research and Forecasting (WRF) 3.2.1) baseline/current (2001-2004) and projected (Representative Concentration Pathway (RCP) 4.5 and RCP 8.5; 2057-2059) climate data. Ten dynamic population features (DPFs) were derived from simulated populations and analyzed spatially to characterize the regional population response to current and future climate across the domain. Each DPF under the current climate was assessed for its ability to discriminate observed Lyme disease risk and known vector presence/absence, using data from the US Centers for Disease Control and Prevention. Peak vector population and month of peak vector population were the DPFs that performed best as predictors of current Lyme disease risk. When examined under baseline and projected climate scenarios, the spatial and temporal distributions of DPFs shift and the seasonal cycle of key questing life stages is compressed under some scenarios. Our results demonstrate the utility of spatial characterization, analysis and visualization of dynamic population responses-including altered phenology-of disease vectors to altered climate.
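    The two best-performing DPFs named above, peak vector population and month of peak, are straightforward to extract from a simulated monthly abundance series for one grid cell. The series below is invented for illustration:

```python
import numpy as np

def peak_features(monthly_abundance):
    """Peak vector population and month of peak (1-12), two of the
    dynamic population features (DPFs) described in the abstract."""
    a = np.asarray(monthly_abundance, dtype=float)
    return {"peak": float(a.max()), "peak_month": int(a.argmax()) + 1}

# A hypothetical cell whose simulated tick population peaks in June:
series = [5, 8, 20, 60, 130, 180, 150, 90, 40, 15, 6, 4]
features = peak_features(series)   # {'peak': 180.0, 'peak_month': 6}
```

    Mapping these two numbers over every cell in the 4 × 4 km grid is what produces the spatial DPF surfaces that are then compared between baseline and projected climates.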

  2. Spatially-Explicit Simulation Modeling of Ecological Response to Climate Change: Methodological Considerations in Predicting Shifting Population Dynamics of Infectious Disease Vectors

    PubMed Central

    Dhingra, Radhika; Jimenez, Violeta; Chang, Howard H.; Gambhir, Manoj; Fu, Joshua S.; Liu, Yang; Remais, Justin V.

    2014-01-01

    Poikilothermic disease vectors can respond to altered climates through spatial changes in both population size and phenology. Quantitative descriptors to characterize, analyze and visualize these dynamic responses are lacking, particularly across large spatial domains. In order to demonstrate the value of a spatially explicit, dynamic modeling approach, we assessed spatial changes in the population dynamics of Ixodes scapularis, the Lyme disease vector, using a temperature-forced population model simulated across a grid of 4 × 4 km cells covering the eastern United States, using both modeled (Weather Research and Forecasting (WRF) 3.2.1) baseline/current (2001–2004) and projected (Representative Concentration Pathway (RCP) 4.5 and RCP 8.5; 2057–2059) climate data. Ten dynamic population features (DPFs) were derived from simulated populations and analyzed spatially to characterize the regional population response to current and future climate across the domain. Each DPF under the current climate was assessed for its ability to discriminate observed Lyme disease risk and known vector presence/absence, using data from the US Centers for Disease Control and Prevention. Peak vector population and month of peak vector population were the DPFs that performed best as predictors of current Lyme disease risk. When examined under baseline and projected climate scenarios, the spatial and temporal distributions of DPFs shift and the seasonal cycle of key questing life stages is compressed under some scenarios. Our results demonstrate the utility of spatial characterization, analysis and visualization of dynamic population responses—including altered phenology—of disease vectors to altered climate. PMID:24772388

  3. A Skill Score of Trajectory Model Evaluation Using Reinitialized Series of Normalized Cumulative Lagrangian Separation

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Weisberg, R. H.

    2017-12-01

    The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as a continental shelf and its adjacent deep ocean. A skill score is proposed based on the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths. The new metric correctly indicates the relative performance of the Global HYCOM in simulating the strong currents of the Gulf of Mexico Loop Current and the weaker currents of the West Florida Shelf in the eastern Gulf of Mexico. In contrast, the Lagrangian separation distance alone gives a misleading result. Also, the observed drifter position series can be used to reinitialize the trajectory model and evaluate its performance along the observed trajectory, not just at the drifter end position. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function can be estimated.
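    The proposed skill score follows directly from its definition: the cumulative separation distance normalized by the cumulative observed trajectory length, mapped onto [0, 1] with a tolerance threshold n. The distances below are illustrative only:

```python
import numpy as np

def trajectory_skill(separations, traj_lengths, n=1.0):
    """Dimensionless trajectory skill score: c is the cumulative Lagrangian
    separation normalized by the cumulative observed trajectory length;
    ss = 1 - c/n for c <= n, else 0 (n is a tolerance threshold)."""
    c = np.sum(separations) / np.sum(traj_lengths)
    return max(1.0 - c / n, 0.0)

# Illustrative separations and observed trajectory lengths (km) at
# successive evaluation times:
ss = trajectory_skill([1.0, 2.0, 3.0], [10.0, 20.0, 30.0])   # c = 0.1 -> ss = 0.9
```

    Because both numerator and denominator grow with the local current speed, the score does not penalize a model merely for operating in an energetic region, which is exactly the failure mode of the raw separation distance.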

  4. Assessing the performance of formulations for nonlinear feedback of surface gravity waves on ocean currents over coastal waters

    NASA Astrophysics Data System (ADS)

    Wang, Pengcheng; Sheng, Jinyu; Hannah, Charles

    2017-08-01

    This study presents applications of a two-way coupled wave-circulation modelling system over coastal waters, with special emphasis on assessing the performance of two different methods for the nonlinear feedback of ocean surface gravity waves on three-dimensional (3D) ocean currents. These two methods are the vortex force (VF) formulation suggested by Bennis et al. (2011) and the latest version of the radiation stress (RS) formulation suggested by Mellor (2015). The coupled modelling system is first applied to two idealized surf-zone-scale test cases to validate the implementation of the two methods in the coupled wave-circulation system. Model results show that the latest version of the RS formulation has difficulty reproducing the undertow over the surf zone. The coupled system is then applied to Lunenburg Bay (LB) of Nova Scotia during Hurricane Juan in 2003. With either the VF or the RS formulation, the coupled system generates much stronger and more realistic 3D circulation in the bay during Hurricane Juan than the circulation-only model, demonstrating the importance of surface wave forces to the 3D ocean circulation over coastal waters. However, the RS formulation generates weak unphysical currents outside the wave-breaking zone due to a less reasonable representation of the vertical distribution of the RS gradients over a sloping bottom. These weak unphysical currents are significantly magnified in a two-way coupled system when interacting with large surface waves, degrading the model performance in simulating currents at one observation site. Our results demonstrate that the VF formulation, with an appropriate parameterization of wave-breaking effects, is able to produce reasonable results for applications over coastal waters during extreme weather events. The RS formulation requires a more complete wave theory than linear wave theory for the approximation of a vertical RS term to improve its performance under both breaking and non-breaking wave conditions.

  5. A pharmacokinetic-pharmacodynamic model for the quantitative prediction of dofetilide clinical QT prolongation from human ether-a-go-go-related gene current inhibition data.

    PubMed

    Jonker, Daniël M; Kenna, Leslie A; Leishman, Derek; Wallis, Rob; Milligan, Peter A; Jonsson, E Niclas

    2005-06-01

    QT prolongation is an important biomarker of the arrhythmia torsades de pointes and appears to be related mainly to blockade of delayed inward cardiac rectifier potassium currents. The aim of this study was to quantify the relationship between in vitro human ether-a-go-go-related gene (hERG) potassium channel blockade and the magnitude of QT prolongation in humans for the class III antiarrhythmic dofetilide. The in vitro affinity and activity of dofetilide were determined in recombinant cell cultures expressing the hERG channel, and the QT-prolonging effect of dofetilide was assessed in 5 clinical studies (80 healthy volunteers and 17 patients with ischemic heart disease). A population pharmacokinetic-pharmacodynamic analysis of the in vitro and in vivo data was performed in NONMEM by use of the operational model of pharmacologic agonism to estimate the efficiency of transduction from ion channel binding to Fridericia-corrected QT response. A 3-compartment pharmacokinetic model with first-order absorption characterized the time course of dofetilide concentrations. On the basis of an in vitro potency of 5.13 ng/mL for potassium current inhibition and predicted unbound dofetilide concentrations, the estimated transducer ratio (tau) of 6.2 suggests that the QT response plateaus before currents are fully blocked. In our study population, 10% hERG blockade corresponds to a QT prolongation of 20 ms (95% confidence interval, 12-32 ms). With long-term dofetilide administration, tolerance develops with a half-life of 4.7 days. The current mechanism-based pharmacokinetic-pharmacodynamic model quantified the relationship between in vitro hERG channel blockade and clinical QT prolongation for dofetilide. This model may prove valuable for assessing the risk of QT prolongation in humans for other drugs that selectively block the hERG channel on the basis of in vitro assays and pharmacokinetic properties.
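    The plateau described above falls out of the operational model of pharmacologic agonism. The sketch below uses the potency and transducer ratio reported in the abstract but assumes the simplest (Black-Leff, slope-factor-one) form of the model and a normalized, hypothetical maximal effect, so it is an illustration rather than the paper's fitted NONMEM model:

```python
def operational_effect(conc, ka=5.13, tau=6.2, emax=1.0):
    """Operational model of pharmacologic agonism: fractional hERG blockade
    occ = conc/(ka + conc) is transduced into a QTc response.
    ka (ng/mL) and tau are the abstract's estimates; emax is a hypothetical
    normalized maximal response."""
    occ = conc / (ka + conc)          # fractional channel blockade
    return emax * tau * occ / (1.0 + tau * occ)

# With tau = 6.2, even full blockade (occ -> 1) yields only tau/(1 + tau),
# about 86% of emax: the response plateaus before the current is fully blocked.
```

    Under these assumptions, 10% blockade already produces roughly 38% of the maximal response, illustrating why modest channel occupancy can translate into a clinically meaningful QTc effect.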

  6. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement. Part 2; Structural Analysis Technologies and Modeling Practices

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.

    2004-01-01

    A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short term and long term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.

  7. Enhanced science-stakeholder communication to improve ecosystem model performances for climate change impact assessments.

    PubMed

    Jönsson, Anna Maria; Anderbrant, Olle; Holmér, Jennie; Johansson, Jacob; Schurgers, Guy; Svensson, Glenn P; Smith, Henrik G

    2015-04-01

    In recent years, climate impact assessments of relevance to the agricultural and forestry sectors have received considerable attention. Current ecosystem models commonly capture the effect of a warmer climate on biomass production, but they rarely sufficiently capture potential losses caused by pests, pathogens and extreme weather events. In addition, alternative management regimes may not be integrated in the models. A way to improve the quality of climate impact assessments is to increase the science-stakeholder collaboration, and in a two-way dialog link empirical experience and impact modelling with policy and strategies for sustainable management. In this paper we give a brief overview of different ecosystem modelling methods, discuss how to include ecological and management aspects, and highlight the importance of science-stakeholder communication. By this, we hope to stimulate a discussion among the science-stakeholder communities on how to quantify the potential for climate change adaptation by improving the realism in the models.

  8. Automated verbal credibility assessment of intentions: The model statement technique and predictive modeling

    PubMed Central

    van der Toolen, Yaloe; Vrij, Aldert; Arntz, Arnoud; Verschuere, Bruno

    2018-01-01

    Recently, verbal credibility assessment has been extended to the detection of deceptive intentions, the use of a model statement, and predictive modeling. The current investigation combines these 3 elements to detect deceptive intentions on a large scale. Participants read a model statement and wrote a truthful or deceptive statement about their planned weekend activities (Experiment 1). With the use of linguistic features for machine learning, more than 80% of the participants were classified correctly. Exploratory analyses suggested that liars included more person and location references than truth‐tellers. Experiment 2 examined whether these findings replicated on independent‐sample data. The classification accuracies remained well above chance level but dropped to 63%. Experiment 2 corroborated the finding that liars' statements are richer in location and person references than truth‐tellers' statements. Together, these findings suggest that liars may over‐prepare their statements. Predictive modeling shows promise as an automated veracity assessment approach but needs validation on independent data. PMID:29861544

  9. Screening for Elevated Blood Lead Levels in Children: Assessment of Criteria and a Proposal for New Ones in France

    PubMed Central

    Etchevers, Anne; Glorennec, Philippe; Le Strat, Yann; Lecoffre, Camille; Bretin, Philippe; Le Tertre, Alain

    2015-01-01

    The decline in children’s Blood Lead Levels (BLL) raises questions about the ability of current lead poisoning screening criteria to identify those children most exposed. The objectives of the study were to evaluate the performance of current screening criteria in identifying children with blood lead levels higher than 50 µg/L in France, and to propose new criteria. Data from a national French survey, conducted among 3831 children aged 6 months to 6 years in 2008–2009 were used. The sensitivity and specificity of the current criteria in predicting blood lead levels higher than or equal to 50 µg/L were evaluated. Two predictive models of BLL above 44 µg/L (for lack of sufficient sample size at 50 µg/L) were built: the first using current criteria, and the second using newly identified risk factors. For each model, performance was studied by calculating the area under the ROC (Receiver Operating Characteristic) curve. The sensitivity of current criteria for detecting BLL higher than or equal to 50 µg/L was 0.51 (0.26; 0.75) and specificity was 0.66 (0.62; 0.70). The new model included the following criteria: foreign child newly arrived in France, mother born abroad, consumption of tap water in the presence of lead pipes, pre-1949 housing, period of construction of housing unknown, presence of peeling paint, parental smoking at home, occupancy rates for housing and child’s address in a cadastral municipality or census block comprising more than 6% of housing that is potentially unfit and built pre-1949. The area under the ROC curve was 0.86 for the new model, versus 0.76 for the current one. The lead poisoning screening criteria should be updated. The risk of industrial, occupational and hobby-related exposure could not be assessed in this study, but should be kept as screening criteria. PMID:26633457
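    The area under the ROC curve used to compare the two screening models equals the Mann-Whitney probability that a randomly chosen affected child outscores a randomly chosen unaffected one. A tie-free, rank-based sketch with invented risk scores:

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based AUC for binary labels (1 = BLL above the threshold).
    Assumes no tied scores; ties would need midranks."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos = int(labels.sum())
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Invented screening scores for 4 unaffected and 2 affected children:
auc = roc_auc([0.1, 0.2, 0.3, 0.4, 0.7, 0.9], [0, 0, 0, 0, 1, 1])   # -> 1.0
```

    Computing this statistic for each model's predicted risks against the observed BLL classifications is how the 0.86-versus-0.76 comparison in the abstract is obtained.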

  10. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degrade in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible not only to assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model, and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment).
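    The four skill metrics listed above are simple to compute for any observed/predicted time-series pair. A pure-NumPy sketch (the Spearman step assumes no tied values):

```python
import numpy as np

def skill_metrics(obs, pred):
    """Average absolute error, RMSE, modeling efficiency (Nash-Sutcliffe)
    and Spearman rank correlation for one observed/predicted series."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = obs - pred
    aae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mef = float(1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2))
    rank = lambda x: np.argsort(np.argsort(x))   # ranks, assuming no ties
    rho = float(np.corrcoef(rank(obs), rank(pred))[0, 1])
    return {"AAE": aae, "RMSE": rmse, "MEF": mef, "Spearman": rho}
```

    A modeling efficiency of 1 indicates a perfect match, 0 means the model is no better than the observed mean, and negative values mean it is worse; pairing it with the rank correlation separates amplitude skill from trend skill.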

  11. Assessment of a novel biomechanical fracture model for distal radius fractures

    PubMed Central

    2012-01-01

    Background Distal radius fractures (DRF) are one of the most common fractures and often need surgical treatment, which has been validated through biomechanical tests. Currently a number of different fracture models are used, none of which resemble the in vivo fracture location. The aim of the study was to develop a new standardized fracture model for DRF (AO-23.A3) and compare its biomechanical behavior to the current gold standard. Methods Variable angle locking volar plates (ADAPTIVE, Medartis) were mounted on 10 pairs of fresh-frozen radii. The osteotomy location was alternated within each pair (New: 10 mm wedge 8 mm / 12 mm proximal to the dorsal / volar apex of the articular surface; Gold standard: 10 mm wedge 20 mm proximal to the articular surface). Each specimen was tested in cyclic axial compression (increasing load by 100 N per cycle) until failure or −3 mm displacement. Parameters assessed were stiffness, displacement and dissipated work, calculated for each cycle, and ultimate load. Significance was tested using a linear mixed model and Wald test as well as t-tests. Results 7 female and 3 male pairs of radii aged 74 ± 9 years were tested. In most cases (7/10), the two groups showed similar mechanical behavior at low loads, with increasing differences at increasing loads. Overall the novel fracture model showed significantly different biomechanical behavior from the gold standard model (p < 0.001). The average final loads resisted were significantly lower in the novel model (860 N ± 232 N vs. 1250 N ± 341 N; p = 0.001). Conclusion The novel biomechanical fracture model for DRF more closely mimics the in vivo fracture site and shows significantly different biomechanical behavior with increasing loads when compared to the current gold standard. PMID:23244634

  12. Decision Tree Approach for Soil Liquefaction Assessment

    PubMed Central

    Gandomi, Amir H.; Fridline, Mark M.; Roke, David A.

    2013-01-01

    In the current study, the performances of some decision tree (DT) techniques are evaluated for postearthquake soil liquefaction assessment. A database containing 620 records of seismic parameters and soil properties is used in this study. Three decision tree techniques are used here in two different ways, considering statistical and engineering points of view, to develop decision rules. The DT results are compared to the logistic regression (LR) model. The results of this study indicate that the DTs not only successfully predict liquefaction but they can also outperform the LR model. The best DT models are interpreted and evaluated based on an engineering point of view. PMID:24489498

  13. Decision tree approach for soil liquefaction assessment.

    PubMed

    Gandomi, Amir H; Fridline, Mark M; Roke, David A

    2013-01-01

    In the current study, the performances of some decision tree (DT) techniques are evaluated for postearthquake soil liquefaction assessment. A database containing 620 records of seismic parameters and soil properties is used in this study. Three decision tree techniques are used here in two different ways, considering statistical and engineering points of view, to develop decision rules. The DT results are compared to the logistic regression (LR) model. The results of this study indicate that the DTs not only successfully predict liquefaction but they can also outperform the LR model. The best DT models are interpreted and evaluated based on an engineering point of view.
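    A single-feature decision stump illustrates the kind of interpretable decision rule a DT yields for liquefaction data. The feature values and outcomes below are invented, not drawn from the 620-record database:

```python
import numpy as np

def best_stump(x, y):
    """Exhaustively choose the threshold on one feature (e.g. a cyclic
    stress ratio) that best separates liquefied (1) from
    non-liquefied (0) cases."""
    best_t, best_acc = None, 0.0
    for t in np.unique(x):
        for rule in ((x >= t), (x < t)):           # try both split directions
            acc = float(np.mean(rule == y))
            if acc > best_acc:
                best_t, best_acc = float(t), acc
    return best_t, best_acc

x = np.array([0.10, 0.15, 0.22, 0.31, 0.38, 0.45])   # invented feature values
y = np.array([0, 0, 0, 1, 1, 1])                     # invented liquefaction flags
threshold, accuracy = best_stump(x, y)               # splits cleanly at 0.31
```

    A full decision tree recursively applies such splits over multiple seismic and soil features; the resulting rules can be read off directly, which is the engineering appeal the abstract highlights over a logistic regression's coefficients.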

  14. Bottom friction. A practical approach to modelling coastal oceanography

    NASA Astrophysics Data System (ADS)

    Bolanos, Rodolfo; Jensen, Palle; Kofoed-Hansen, Henrik; Tornsfeldt Sørensen, Jacob

    2017-04-01

    Coastal processes imply the interaction of the atmosphere, the sea, the coastline and the bottom. The spatial gradients in this area are normally large, induced by orographic and bathymetric features. Although nowadays it is possible to obtain high-resolution bathymetry, the details of the seabed, e.g. sediment type, presence of biological material and living organisms are not available. Additionally, these properties as well as bathymetry can also be highly dynamic. These bottom characteristics are very important to describe the boundary layer of currents and waves and control to a large degree the dissipation of flows. The bottom friction is thus typically a calibration parameter in numerical modelling of coastal processes. In this work, we assess this process and put it into context of other physical processes uncertainties influencing wind-waves and currents in the coastal areas. A case study in the North Sea is used, particularly the west coast of Denmark, where water depth of less than 30 m cover a wide fringe along the coast, where several offshore wind farm developments are being carried out. We use the hydrodynamic model MIKE 21 HD and the spectral wave model MIKE 21 SW to simulate atmosphere and tidal induced flows and the wind wave generation and propagation. Both models represent state of the art and have been developed for flexible meshes, ideal for coastal oceanography as they can better represent coastlines and allow a variable spatial resolution within the domain. Sensitivity tests to bottom friction formulations are carried out into context of other processes (e.g. model forcing uncertainties, wind and wave interactions, wind drag coefficient). Additionally, a map of varying bottom properties is generated based on a literature survey to explore the impact of the spatial variability. Assessment of different approaches is made in order to establish a best practice regarding bottom friction and coastal oceanographic modelling. 
Its contribution is also assessed during storm conditions, where its most evident impact is expected, as waves are affected by bottom processes over larger areas, making bottom dissipation more efficient. We use available wave and current measurements in the North Sea (e.g. the Ekofisk and FINO platforms and other coastal stations on the west coast of Denmark) to quantify the importance of processes influencing waves and currents in the coastal zone and to place bottom friction in the context of other process uncertainties.
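The bottom friction treated above as a calibration parameter is, in depth-averaged models such as MIKE 21 HD, typically a quadratic drag law. A minimal sketch of a Manning-type formulation follows; the Manning number used here is an illustrative typical value, not the study's calibrated setting:

```python
def bottom_shear_stress(u, h, n=0.025, rho=1025.0, g=9.81):
    """Bed shear stress (N/m^2) from a Manning-type quadratic friction law,
    tau = rho * g * n**2 * u * |u| / h**(1/3), of the kind used in
    depth-averaged coastal models. The Manning number n (s/m^(1/3)) is the
    calibration parameter discussed in the abstract; n = 0.025 is only a
    typical illustrative value."""
    return rho * g * n ** 2 * u * abs(u) / h ** (1.0 / 3.0)
```

Because the stress grows quadratically with current speed, small changes in n matter most during storms, consistent with the storm-condition assessment described above.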

  15. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE PAGES

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.; ...

    2015-02-03

Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology, with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
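The weighted-surface idea (combine normalized predictor layers, then allocate the county-level projected change across grid cells in proportion to the resulting weights) can be sketched as follows. The layer choices match the abstract, but the weights and min-max normalization are illustrative assumptions, not the model's locally adaptive, geographically varying coefficients:

```python
import numpy as np

def growth_surface(slope, dist_to_city, pop_density,
                   w_slope=0.3, w_dist=0.3, w_pop=0.4):
    """Combine normalized layers into a relative growth-likelihood surface.
    Flatter terrain, proximity to larger cities, and existing population all
    raise the likelihood. Weights are illustrative, not calibrated."""
    def norm(a):
        rng = a.max() - a.min()
        return (a - a.min()) / rng if rng else np.zeros_like(a)
    s = (w_slope * (1 - norm(slope))
         + w_dist * (1 - norm(dist_to_city))
         + w_pop * norm(pop_density))
    return s / s.sum()  # cell weights over the county sum to 1

def allocate_growth(surface, county_change):
    """Distribute a county-level projected population change across cells."""
    return surface * county_change
```

Allocating against a benchmark projection this way guarantees that the gridded total reproduces the county-level control number exactly.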

  16. Large Dataset of Acute Oral Toxicity Data Created for Testing ...

    EPA Pesticide Factsheets

Acute toxicity data is a common requirement for substance registration in the US. Currently only data derived from animal tests are accepted by regulatory agencies, and the standard in vivo tests use lethality as the endpoint. Non-animal alternatives such as in silico models are being developed due to animal welfare and resource considerations. We compiled a large dataset of oral rat LD50 values to assess the predictive performance of currently available in silico models. Our dataset combines LD50 values from five different sources: literature data provided by The Dow Chemical Company, REACH data from eChemPortal, HSDB (Hazardous Substances Data Bank), RTECS data from Leadscope, and the training set underpinning TEST (Toxicity Estimation Software Tool). Combined, these data sources yield 33,848 chemical-LD50 pairs (data points), with 23,475 unique data points covering 16,439 compounds. The entire dataset was loaded into a chemical properties database. All of the compounds were registered in DSSTox and 59.5% have publicly available structures. Compounds without a structure in DSSTox are currently having their structures registered. The structural data will be used to evaluate the predictive performance and applicable chemical domains of three QSAR models (TIMES, PROTOX, and TEST). Future work will combine the dataset with information from ToxCast assays and, using random forest modeling, assess whether ToxCast assays are useful in predicting acute oral toxicity.
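The distinction the abstract draws between raw records, unique chemical-LD50 pairs (data points), and unique compounds comes down to two levels of deduplication, sketched here on hypothetical mini-records (the real sources hold tens of thousands):

```python
# Hypothetical records standing in for the five real sources
# (Dow literature data, REACH, HSDB, RTECS, TEST training set).
records = [
    {"source": "HSDB",  "casrn": "50-00-0", "ld50_mg_kg": 100},
    {"source": "REACH", "casrn": "50-00-0", "ld50_mg_kg": 100},   # duplicate pair
    {"source": "TEST",  "casrn": "50-00-0", "ld50_mg_kg": 800},   # same compound, new value
    {"source": "RTECS", "casrn": "64-17-5", "ld50_mg_kg": 7060},
]

# One data point = one unique (chemical, LD50) pair; compounds collapse further.
pairs = {(r["casrn"], r["ld50_mg_kg"]) for r in records}
compounds = {casrn for casrn, _ in pairs}
print(len(records), len(pairs), len(compounds))  # 4 3 2
```

The same two-level collapse, applied to the full dataset, is what turns 33,848 records into 23,475 data points over 16,439 compounds.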

  17. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.

Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology, with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.

  18. The Ottawa Model of Research Use: a guide to clinical innovation in the NICU.

    PubMed

    Hogan, Debora L; Logan, Jo

    2004-01-01

    To improve performance of a neonatal transport team by implementing a research-based family assessment instrument. Objectives included providing a structure for evaluating families and fostering the healthcare relationship. Neonatal transports are associated with family crises. Transport teams require a comprehensive framework to accurately assess family responses to adversity and tools to guide their practice toward parental mastery of the event. Currently, there are no assessment tools that merge family nursing expertise with neonatal transport. A family assessment tool grounded in contemporary family nursing theory and research was developed by a clinical nurse specialist. The Ottawa Model of Research Use guided the process of piloting the innovation with members of a transport team. Focus groups, interviews, and surveys were conducted to create profiles of barriers and facilitators to research use by team members. Tailored research transfer strategies were enacted based on the profile results. Formative evaluations demonstrated improvements in team members' perceptions of their knowledge, family centeredness, and ability to assess and intervene with families. The family assessment tool is currently being incorporated into Clinical Practice Guidelines for Transport and thus will be considered standard care. Use of a family assessment tool is an effective way of appraising families and addressing suffering. The Ottawa Model of Research Use provided a framework for implementing the clinical innovation. A key role of the clinical nurse specialist is to influence nursing practice by fostering research use by practitioners. When developing and implementing a clinical innovation, input from end users and consumers is pivotal. Incorporating the innovation into a practice guideline provides a structure to imbed research evidence into practice.

  19. Utilizing interview and self-report assessment of the Five-Factor Model to examine convergence with the alternative model for personality disorders.

    PubMed

    Helle, Ashley C; Trull, Timothy J; Widiger, Thomas A; Mullins-Sweatt, Stephanie N

    2017-07-01

    An alternative model for personality disorders is included in Section III (Emerging Models and Measures) of Diagnostic and Statistical Manual of Mental Disorders, (5th ed.; DSM-5). The DSM-5 dimensional trait model is an extension of the Five-Factor Model (FFM; American Psychiatric Association, 2013). The Personality Inventory for DSM-5 (PID-5) assesses the 5 domains and 25 traits in the alternative model. The current study expands on recent research to examine the relationship of the PID-5 with an interview measure of the FFM. The Structured Interview for the Five Factor Model of Personality (SIFFM) assesses the 5 bipolar domains and 30 facets of the FFM. Research has indicated that the SIFFM captures maladaptive aspects of personality (as well as adaptive). The SIFFM, NEO PI-R, and PID-5 were administered to participants to examine their respective convergent and discriminant validity. Results provide evidence for the convergence of the 2 models using self-report and interview measures of the FFM. Clinical implications and future directions are discussed, particularly a call for the development of a structured interview for the assessment of the DSM-5 dimensional trait model. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Applying ILAMB to data from several generations of the Community Land Model to assess the relative contribution of model improvements and forcing uncertainty to model-data agreement

    NASA Astrophysics Data System (ADS)

    Lawrence, D. M.; Fisher, R.; Koven, C.; Oleson, K. W.; Swenson, S. C.; Hoffman, F. M.; Randerson, J. T.; Collier, N.; Mu, M.

    2017-12-01

The International Land Model Benchmarking (ILAMB) project is a model-data intercomparison and integration project designed to assess and help improve land models. The current package includes assessment of more than 25 land variables across more than 60 global, regional, and site-level (e.g., FLUXNET) datasets. ILAMB employs a broad range of metrics including RMSE, mean error, spatial distributions, interannual variability, and functional relationships. Here, we apply ILAMB to assess several generations of the Community Land Model (CLM4, CLM4.5, and CLM5). Encouragingly, CLM5, which is the result of model development over the last several years by more than 50 researchers from 15 different institutions, shows broad improvements across many ILAMB metrics including LAI, GPP, vegetation carbon stocks, and the historical net ecosystem carbon balance, among others. We will also show that considerable uncertainty arises from the historical climate forcing data used (GSWP3v1 and CRUNCEPv7). ILAMB score variations due to forcing data can be as large for many variables as those due to model structural differences. Strengths, weaknesses, and persistent biases across model generations will also be presented.
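A simplified version of the benchmarking arithmetic: compute RMSE and mean error (bias), then map the RMSE onto a (0, 1] skill score. The exponential mapping against observed variability is in the spirit of ILAMB's normalized scores, but it is a sketch, not the package's exact formula:

```python
import math

def rmse(model, obs):
    """Root-mean-square error between paired model and observed values."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def bias(model, obs):
    """Mean error: positive when the model overestimates on average."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def skill_score(model, obs):
    """Score = exp(-RMSE / sigma_obs): 1 for a perfect match, decaying as the
    error grows relative to the observed variability."""
    mean_o = sum(obs) / len(obs)
    sigma = math.sqrt(sum((o - mean_o) ** 2 for o in obs) / len(obs))
    return math.exp(-rmse(model, obs) / sigma)
```

Scoring two model generations (or one model under two forcing datasets) against the same observations is then a direct comparison of these scalars, which is how forcing-induced and structure-induced score variations can be compared.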

  1. Nation-wide primary healthcare research network: a privacy protection assessment.

    PubMed

    De Clercq, Etienne; Van Casteren, Viviane; Bossuyt, Nathalie; Moreels, Sarah; Goderis, Geert; Bartholomeeusen, Stefaan; Bonte, Pierre; Bangels, Marc

    2012-01-01

    Efficiency and privacy protection are essential when setting up nationwide research networks. This paper investigates the extent to which basic services developed to support the provision of care can be re-used, whilst preserving an acceptable privacy protection level, within a large Belgian primary care research network. The generic sustainable confidentiality management model used to assess the privacy protection level of the selected network architecture is described. A short analysis of the current architecture is provided. Our generic model could also be used in other countries.

  2. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  3. Beyond the audiogram: application of models of auditory fitness for duty to assess communication in the real world.

    PubMed

    Dubno, Judy R

    2018-05-01

    This manuscript provides a Commentary on a paper published in the current issue of the International Journal of Audiology and the companion paper published in Ear and Hearing by Soli et al. These papers report background, rationale and results of a novel modelling approach to assess "auditory fitness for duty," or an individual's ability to perform hearing-critical tasks related to their job, based on their likelihood of effective speech communication in the listening environment in which the task is performed.

  4. Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana Kelly; Song-Hua Shen; Gary DeMoss

    2010-06-01

Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event's risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common-cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values; doing so may result in a significant underestimate of the event's risk significance. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.
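For concreteness, the Basic Parameter Model partitions a component group's failure probability across common-cause events of different multiplicities. A sketch of its widely used alpha-factor parameterization (staggered-testing form) is below; the paper's proposed Bayesian-network framework goes beyond this, so the sketch only illustrates the baseline being critiqued:

```python
from math import comb

def alpha_factor_qk(alpha, q_total, m):
    """Per-event probability Q_k that a common-cause event fails exactly k of
    the m components in the group, under the alpha-factor parameterization of
    the Basic Parameter Model (staggered-testing form):
        Q_k = k * alpha_k / (C(m-1, k-1) * alpha_t) * Q_total,
    where alpha[k-1] is the fraction of failure events involving k components
    and alpha_t = sum(k * alpha_k)."""
    alpha_t = sum(k * a for k, a in enumerate(alpha, start=1))
    return [k * alpha[k - 1] / (comb(m - 1, k - 1) * alpha_t) * q_total
            for k in range(1, m + 1)]
```

Event assessment then requires conditioning these Q_k on the observed failure rather than leaving them at baseline, which is the adjustment the paper argues is insufficiently captured by the parametric model alone.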

  5. Qualitative risk assessment in a data-scarce environment: a model to assess the impact of control measures on spread of African Swine Fever.

    PubMed

    Wieland, Barbara; Dhollander, Sofie; Salman, Mo; Koenen, Frank

    2011-04-01

In the absence of data, qualitative risk assessment frameworks have proved useful for assessing risks associated with animal health diseases. As part of a scientific opinion for the European Commission (EC) on African Swine Fever (ASF), a working group of the European Food Safety Authority (EFSA) assessed the risk of ASF remaining endemic in the Trans-Caucasus Countries (TCC) and the Russian Federation (RF) and the risk of ASF becoming endemic in the EU if the disease were introduced. The aim was to develop a tool to evaluate how current control or preventive measures mitigate the risk of spread and to give decision makers the means to review how strengthening surveillance and control measures would further mitigate that risk. Based on a generic model outlining disease introduction, spread and endemicity in a region, the impact of risk mitigation measures on the spread of disease was assessed for specific risk questions. The resulting hierarchical models consisted of key steps containing several sub-steps. For each step of the risk pathways, risk estimates were determined by the expert group based on existing data or through expert opinion elicitation. Risk estimates were combined using two different combination matrices, one to combine estimates of independent steps and one to combine conditional probabilities. The qualitative risk assessment indicated a moderate risk that ASF will remain endemic in currently affected areas of the TCC and RF and a high risk of spread to currently unaffected areas. If introduced into the EU, ASF is likely to be controlled effectively in the production sector with high or limited biosecurity. In the free-range production sector, however, there is a moderate risk of ASF becoming endemic due to wild boar contact, non-compliance with animal movement bans, and difficult access to all individual pigs upon implementation of control measures.
This study demonstrated the advantages of a systematic framework to assist an expert panel to carry out a risk assessment as it helped experts to disassociate steps in the risk pathway and to overcome preconceived notions of final risk estimates. The approach presented here shows how a qualitative risk assessment framework can address animal diseases with complexity in their spread and control measures and how transparency of the resulting estimates was achieved. Copyright © 2011 Elsevier B.V. All rights reserved.
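The two combination matrices can be thought of as lookups over ordered qualitative levels. Below is a deliberately simple stand-in, assuming four levels and min/max-style rules; the matrices actually elicited by the EFSA working group are more nuanced than this:

```python
LEVELS = ["negligible", "low", "moderate", "high"]

def combine_conditional(step_a, step_b_given_a):
    """Conditional steps (B can only occur if A did): the combined
    estimate cannot exceed the lower of the two inputs."""
    return LEVELS[min(LEVELS.index(step_a), LEVELS.index(step_b_given_a))]

def combine_independent(route_a, route_b):
    """Independent routes to the same outcome: the combined estimate is
    at least the higher of the two inputs."""
    return LEVELS[max(LEVELS.index(route_a), LEVELS.index(route_b))]
```

Chaining these lookups along a risk pathway is what makes the final qualitative estimate traceable: each sub-step's contribution can be audited, which is the transparency the study emphasizes.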

  6. A review of tephra transport and dispersal models: Evolution, current status, and future perspectives

    NASA Astrophysics Data System (ADS)

    Folch, A.

    2012-08-01

Tephra transport models try to predict atmospheric dispersion and sedimentation of tephra depending on meteorology, particle properties, and eruption characteristics, defined by eruption column height, mass eruption rate, and vertical distribution of mass. Models are used for different purposes, from operational forecasting of volcanic ash clouds to hazard assessment of tephra dispersion and fallout. The size of the erupted particles, a key parameter controlling the dynamics of particle sedimentation in the atmosphere, varies within a wide range. The largest, centimetric to millimetric particles fall out at proximal to medial distances from the volcano and sediment by gravitational settling. At the other extreme, the smallest micrometric to sub-micrometric particles can be transported at continental or even global scales and are affected by other deposition and aggregation mechanisms. Different scientific communities have traditionally modeled the dispersion of these two end members. Volcanologists developed families of models suitable for lapilli and coarse ash, aimed at computing fallout deposits and at hazard assessment. In contrast, meteorologists and atmospheric scientists have traditionally used other atmospheric transport models, dealing with finer particles, for tracking the motion of volcanic ash clouds and, eventually, for computing airborne ash concentrations. During the last decade, the increasing demand for model accuracy and forecast reliability has pushed on two fronts. First, the original gap between these families of models has been filled by the emergence of multi-scale and multi-purpose models. Second, new modeling strategies, including, for example, ensemble and probabilistic forecasting or model-data assimilation, are being investigated for future implementation in models and/or modeling strategies.
This paper reviews the evolution of tephra transport and dispersal models over the last two decades, presents the status and limitations of current modeling strategies, and discusses some emerging perspectives expected to be implemented at the operational level during the next few years. Improvements in both real-time forecasting and long-term hazard assessment are necessary for loss-prevention programs at the local, regional, national and international levels.
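The split between the two model families follows from how fall velocity scales with particle size. For the fine-ash end member, Stokes' law gives a first estimate; coarse lapilli fall at high Reynolds number and require empirical drag laws, which is precisely where the volcanological fallout models differ. A sketch with typical (not eruption-specific) property values:

```python
def stokes_settling_velocity(d, rho_p=2500.0, rho_a=1.2, mu=1.8e-5, g=9.81):
    """Terminal fall velocity (m/s) of a small sphere of diameter d (m) in air:
    v = g * d**2 * (rho_p - rho_a) / (18 * mu). Valid only at low particle
    Reynolds number, i.e. for micrometric ash; the default particle density,
    air density, and viscosity are typical near-surface values."""
    return g * d ** 2 * (rho_p - rho_a) / (18.0 * mu)
```

A 10 µm ash particle settles at under a centimetre per second, so it can stay aloft for days and travel continental distances, while a 1 mm particle (outside Stokes' range, so this overestimates) would fall roughly four orders of magnitude faster.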

  7. Understanding Disproportionate Representation in Special Education by Examining Group Differences in Behavior Ratings

    ERIC Educational Resources Information Center

    Peters, Christina D.; Kranzler, John H.; Algina, James; Smith, Stephen W.; Daunic, Ann P.

    2014-01-01

    The aim of the current study was to examine mean-group differences on behavior rating scales and variables that may predict such differences. Sixty-five teachers completed the Clinical Assessment of Behavior-Teacher Form (CAB-T) for a sample of 982 students. Four outcome variables from the CAB-T were assessed. Hierarchical linear modeling was used…

  8. Innovative Educational Restructuring for America 2000: Time To Bury Political Bureaucracies and Begin Systematic Assessment, Profiling & Technological Improvement of School Organizations.

    ERIC Educational Resources Information Center

    Packard, Richard D.; Dereshiwsky, Mary I.

Despite current interest in the concept of the "New American School" model discussed in "America 2000," school systems continue to approach educational reform and restructuring by tinkering with key organizational components in isolation. The total school organization requires assessment and profiling to determine which key components are drags…

  9. The GLSEN Workbook: A Development Model for Assessing, Describing and Improving Schools for Lesbian, Gay, Bisexual and Transgender (LGBT) People.

    ERIC Educational Resources Information Center

    Gay, Lesbian, and Straight Education Network, New York, NY.

    This workbook provides an instrument to objectively analyze a school's current climate with regard to lesbian, gay, bisexual, and transgendered (LGBT) people and the steps needed to move that school toward a more inclusive environment. It provides a detailed assessment survey (to be completed by key school stakeholders), descriptive data, and…

  10. Evaluation of the Effectiveness of Management Development Institutions of Higher Education on the Basis of the Factor and Criterion Model

    ERIC Educational Resources Information Center

    Badrtdinov, Nail N.; Gorobets, Daniil V.

    2016-01-01

The relevance of the investigated problem stems from the absence of a single approach to common criteria and mechanisms for assessing a pedagogical educational establishment; the current assessment principles of planning and management are out of date. The aim of this article is to analyze theoretical approaches and concepts of…

  11. The Victim-Offender Overlap and Fear of In-School Victimization: A Longitudinal Examination of Risk Assessment Models

    ERIC Educational Resources Information Center

    Melde, Chris; Esbensen, Finn-Aage

    2009-01-01

    Reports of serious violence in schools have raised general awareness and concern about safety in America's schools. In this article, the authors examine the extent to which in-school victimization is associated with students' perceived risk and fear of victimization. By expanding on Ferraro's risk assessment framework, the current study explores…

  12. Assessing the Incremental Value of KABC-II Luria Model Scores in Predicting Achievement: What Do They Tell Us beyond the MPI?

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Spurgin, Angelia R.

    2016-01-01

    The current study examined the incremental validity of the Luria interpretive scheme for the Kaufman Assessment Battery for Children-Second Edition (KABC-II) for predicting scores on the Kaufman Test of Educational Achievement-Second Edition (KTEA-II). All participants were children and adolescents (N = 2,025) drawn from the nationally…

  13. Asteroid-Generated Tsunami and Impact Risk

    NASA Astrophysics Data System (ADS)

    Boslough, M.; Aftosmis, M.; Berger, M. J.; Ezzedine, S. M.; Gisler, G.; Jennings, B.; LeVeque, R. J.; Mathias, D.; McCoy, C.; Robertson, D.; Titov, V. V.; Wheeler, L.

    2016-12-01

The justification for planetary defense comes from a cost/benefit analysis, which includes risk assessment. The contribution from ocean impacts and airbursts is difficult to quantify and represents a significant uncertainty in our assessment of the overall risk. Our group is currently working toward improved understanding of impact scenarios that can generate dangerous tsunami. The importance of asteroid-generated tsunami research has increased because a new Science Definition Team, at the behest of NASA's Planetary Defense Coordination Office, is now updating the results of a 2003 study on which our current planetary defense policy is based. Our group was formed to address this question on many fronts, including asteroid entry modeling, tsunami generation and propagation simulations, modeling of coastal run-ups, inundation, and consequences, infrastructure damage estimates, and physics-based probabilistic impact risk assessment. We also organized the Second International Workshop on Asteroid Threat Assessment, focused on asteroid-generated tsunami and associated risk (Aug. 23-24, 2016). We will summarize our progress and present the highlights of our workshop, emphasizing its relevance to earth and planetary science. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.

  14. A Process-Based Assessment for Watershed Restoration Planning, Chehalis River Basin, USA

    NASA Astrophysics Data System (ADS)

    Beechie, T. J.; Thompson, J.; Seixas, G.; Fogel, C.; Hall, J.; Chamberlin, J.; Kiffney, P.; Pollock, M. M.; Pess, G. R.

    2016-12-01

Three key questions in identifying and prioritizing river restoration are: (1) How have habitats changed? (2) What are the causes of those habitat changes? (3) How have those changes affected the species of interest? To answer these questions and assist aquatic habitat restoration planning in the Chehalis River basin, USA, we quantified habitat changes across the river network from headwaters to the estuary. We estimated historical habitat capacity to support salmonids using a combination of historical assessments, reference sites, and models. We also estimated current capacity from recent or newly created data sets. We found that losses of floodplain habitats and beaver ponds were substantial, while the estuary was less modified. Both tributary and main channel habitats, while modified, did not show particularly large changes. Assessments of key processes that form and sustain habitats indicate that riparian functions (shading and wood recruitment) have been significantly altered, and peak and low flows have also been altered in some locations. The next step is to link our habitat assessments to salmon life-cycle models to evaluate which life stages and habitat types currently constrain population sizes of spring and fall Chinook salmon, coho salmon, and steelhead. By comparing model runs that represent different components of the habitat losses identified above, life-cycle models help identify which habitat losses have most affected each species and population. This assessment will indicate which habitat types provide the greatest restoration potential and help define a guiding vision for restoration efforts. Future analyses may include development and evaluation of alternative restoration scenarios, including different climate change scenarios, to refine our understanding of which restoration actions provide the greatest benefit to a salmon population.

  15. Integrating in silico models to enhance predictivity for developmental toxicity.

    PubMed

    Marzo, Marco; Kulkarni, Sunil; Manganaro, Alberto; Roncaglioni, Alessandra; Wu, Shengde; Barton-Maclaren, Tara S; Lester, Cathy; Benfenati, Emilio

    2016-08-31

Application of in silico models to predict developmental toxicity has demonstrated limited success, particularly when employed as a single source of information. It is acknowledged that modelling the complex outcomes related to this endpoint is a challenge; however, such models have been developed and reported in the literature. The current study explored the possibility of integrating selected public domain models (CAESAR, SARpy and the P&G model) with selected commercial modelling suites (Multicase, Leadscope and Derek Nexus) to assess whether there is an increase in overall predictive performance. The results varied according to the data sets used to assess performance, which improved upon model integration relative to the individual models. Moreover, because different models are based on different specific developmental toxicity effects, integration of these models increased the applicable chemical and biological spaces. It is suggested that this approach reduces the uncertainty associated with in silico predictions by achieving a consensus among a battery of models. The use of tools to assess the applicability domain also improves the interpretation of the predictions. This has been verified in the case of the software VEGA, which makes QSAR models freely available with a measurement of the applicability domain. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
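The consensus-with-applicability-domain idea reduces to: drop predictions from models whose domain check fails, then vote. A minimal stand-in with hypothetical binary calls (1 = toxicant, 0 = non-toxicant, None = out of domain); the integration schemes actually evaluated in the paper are more elaborate than a plain majority vote:

```python
def consensus_prediction(calls):
    """Majority vote over binary toxicity calls, ignoring out-of-domain models
    (None). Ties and empty vote sets are reported as inconclusive (None): the
    consensus battery only speaks when its members agree on balance, which is
    one way such schemes reduce prediction uncertainty."""
    votes = [c for c in calls if c is not None]
    if not votes or 2 * sum(votes) == len(votes):
        return None
    return 1 if 2 * sum(votes) > len(votes) else 0
```

Restricting the vote to in-domain models trades coverage for reliability, mirroring the paper's observation that applicability-domain tools improve the interpretation of predictions.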

  16. Assessment of Effectiveness and Limitations of Habitat Suitability Models for Wetland Restoration

    USGS Publications Warehouse

    Draugelis-Dale, Rassa O.

    2008-01-01

    Habitat suitability index (HSI) models developed for wildlife in the Louisiana Coastal Area Comprehensive Ecosystem Restoration Plan (LCA study) have been assessed for parameter and overall model quality. The success of the suitability models from the South Florida Water Management District for The Everglades restoration project and from the Spatially Explicit Species Index Models (SESI) of the Across Trophic Level System Simulation (ATLSS) Program of Florida warranted investigation with possible application of modeling theory to the current LCA study. General HSI models developed by the U.S. Fish and Wildlife Service were also investigated. This report presents examinations of theoretical formulae and comparisons of the models, performed by using diverse hypothetical settings of hydrological/biological ecosystems to highlight weaknesses as well as strengths among the models, limited to the American alligator and selected wading bird species (great blue heron, great egret, and white ibis). Recommendations were made for the LCA study based on these assessments. An enhanced HSI model for the LCA study is proposed for the American alligator, and a new HSI model for wading birds is introduced for the LCA study. Performance comparisons of the proposed models with the other suitability models are made by using the aforementioned hypothetical settings.
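HSI models of the kind compared in this report typically aggregate component suitability indices (each scaled to [0, 1]) into a single index, often by a geometric mean so that any wholly unsuitable component zeroes the habitat score. This is the textbook aggregation rule, not the exact formula of any particular model in the report:

```python
import math

def hsi(components):
    """Overall habitat suitability index as the geometric mean of component
    suitability indices, each in [0, 1]. A single 0 component (e.g. no
    suitable water regime in an alligator model) forces the overall HSI to 0."""
    return math.prod(components) ** (1.0 / len(components))
```

The zero-dominance property is exactly what model comparisons of this kind probe with hypothetical hydrological/biological settings: an arithmetic-mean variant would mask a fatal habitat deficiency that the geometric mean exposes.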

  17. Engineered Barrier System performance requirements systems study report. Revision 02

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balady, M.A.

This study evaluates the current design concept for the Engineered Barrier System (EBS), in concert with the current understanding of the geologic setting, to assess whether enhancements to the required performance of the EBS are necessary. The performance assessment calculations are performed by coupling the EBS with the geologic setting based on the models (some of which were updated for this study) and assumptions used for the 1995 Total System Performance Assessment (TSPA). The need for enhancements is determined by comparing the performance assessment results against the EBS-related performance requirements. Subsystem quantitative performance requirements related to the EBS include the requirement to allow no more than 1% of the waste packages (WPs) to fail before 1,000 years after permanent closure of the repository, as well as a requirement to control the release rate of radionuclides from the EBS. The EBS performance enhancements considered included additional engineered components as well as evaluating additional performance available from existing design features for which no performance credit is currently being taken.

  18. Strategies for the Use of Tidal Stream Currents for Power Generation

    NASA Astrophysics Data System (ADS)

    Orhan, Kadir; Mayerle, Roberto

    2015-04-01

    Indonesia is one of the priority countries in Southeast Asia for the development of ocean renewable energy facilities, and the National Energy Council intends to increase the role of ocean energy significantly in the energy mix for 2010-2050. To this end, the joint German-Indonesian project "Ocean Renewable Energy ORE-12" aims at the identification of marine environments in the Indonesian Archipelago that are suitable for the efficient generation of electric power by converter facilities. This study, within the ORE-12 project, is focused on the tidal stream currents in the straits between the Indian Ocean and the Flores Sea, to estimate the energy potentials and to develop strategies for producing renewable energy. The FLOW module of Delft3D has been used to run hydrodynamic models for site assessment and design development. In the site assessment phase, 2D models were run for month-long periods at a resolution of 500 m. Later, in the design development phase, detailed 3D models were developed and run for three-month periods at a resolution of 50 m. Bathymetric data for the models were obtained from the GEBCO_08 Grid, and wind data from the Global Forecast System of NOAA's National Climatic Data Center. To set the model boundary conditions, tidal forcing with 11 harmonic constituents was supplied from the TPXO Indian Ocean Atlas (1/12° regional model), and data from the HYCOM+NCODA Global 1/12° Analysis were used to specify salinity and temperature on the open boundaries. After the field survey was complete, water-level time series from a tidal gauge located in the domain of interest (8° 20′ 9.7″ S, 122° 54′ 51.9″ E) were used to verify the models, and the energy potentials of the straits were then estimated. As a next step, the correspondence between model outputs and measurements taken by the radar system of the TerraSAR-X satellite (DLR) will be analysed.
Studies assessing the environmental impacts of tidal stream current power plants are also being conducted in cooperation with the company CRM (Coastal Research & Management).
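
    The quantity behind such site assessments is the kinetic power density of the current, which scales with the cube of flow speed. A short sketch of that arithmetic (the seawater density and power coefficient below are generic textbook assumptions, not values from the ORE-12 study):

```python
RHO_SEAWATER = 1025.0  # kg/m^3, nominal seawater density

def power_density(v):
    """Kinetic power per unit swept area (W/m^2) of a current of
    speed v (m/s): P/A = 0.5 * rho * v^3."""
    return 0.5 * RHO_SEAWATER * v ** 3

def turbine_power(v, area, cp=0.35):
    """Extractable power (W) for a rotor swept area (m^2) and an
    assumed power coefficient cp (0.35 is a placeholder value)."""
    return cp * power_density(v) * area

# Cubic dependence: doubling the current speed gives 8x the density.
print(power_density(1.0))  # 512.5 W/m^2
print(power_density(2.0))  # 4100.0 W/m^2
```

    The cubic dependence is why resolving peak spring-tide speeds with fine-grid 3D models matters so much for energy-yield estimates.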

  19. Geomagnetically Induced Currents (GIC) calculation, impact assessment on the transmission system, and validation using 3-D earth conductivity tensors and GIC measurements.

    NASA Astrophysics Data System (ADS)

    Sharma, R.; McCalley, J. D.

    2016-12-01

    Geomagnetic disturbances (GMDs) cause the flow of geomagnetically induced currents (GIC) in the power transmission system, which may cause large-scale power outages and damage to power system equipment. In order to plan defenses against GMDs, it is necessary to accurately estimate the flow of GICs in the power transmission system. The current calculation per NERC standards uses 1-D earth conductivity models that do not reflect the coupling between the geoelectric and geomagnetic field components in the same direction. For accurate estimation of GICs, it is important to have spatially granular 3-D earth conductivity tensors, an accurate DC network model of the transmission system, and precisely estimated or measured input in the form of geomagnetic or geoelectric field data. Using these models and data, pre-event, post-event, and online planning and assessment can be performed: calculating GIC, analyzing the voltage stability margin, identifying protection system vulnerabilities, and estimating heating in transmission equipment. These tasks require an established GIC calculation and analysis procedure that uses improved geophysical and DC network models obtained by model parameter tuning. The issue is addressed by performing the following tasks: 1) Geomagnetic field data and improved 3-D earth conductivity tensors are used to plot the geoelectric field map of a given area. The obtained geoelectric field map then serves as an input to the PSS/E platform, where the GIC flows are calculated through DC circuit analysis. 2) The computed GIC is evaluated against GIC measurements in order to fine-tune the geophysical and DC network model parameters for any mismatch between the calculated and measured GIC. 3) The GIC calculation procedure is then adapted for a 1-in-100-year storm, in order to assess the impact of a worst-case GMD on the power system.
4) Using the transformer models, the voltage stability margin would be analyzed for various real and synthetic geomagnetic or geoelectric field inputs, by calculating the reactive power absorbed by the transformers during an event. Together, these four steps will give electric utilities and planners more accurate estimation techniques for GIC calculation and impact assessment for future GMD events.
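
    The core of the DC circuit analysis in step 1 can be illustrated with the smallest possible network: one line between two grounded substations in a uniform geoelectric field. All numbers below are hypothetical; real studies solve the full nodal equations of the transmission network (here done in PSS/E):

```python
def gic_two_substation(e_field_v_per_km, line_km, r_line,
                       r_ground_a, r_ground_b):
    """Quasi-DC GIC (A) in a single line joining two grounded
    substations, for a uniform geoelectric field aligned with the
    line. The induced EMF is E * L; the loop resistance is the line
    resistance plus both substation grounding resistances. A toy
    version of the DC circuit analysis described above."""
    emf = e_field_v_per_km * line_km
    return emf / (r_line + r_ground_a + r_ground_b)

# 1 V/km along a 100 km line, 3 ohm line, 0.5 ohm grounds -> 25 A
print(gic_two_substation(1.0, 100.0, 3.0, 0.5, 0.5))
```

    Because the driving EMF scales with line length and field strength, long EHV lines over resistive (poorly conducting) geology are the paths most exposed during a severe storm, which is what the 3-D conductivity tensors are meant to capture.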

  20. MODEL-BASED HYDROACOUSTIC BLOCKAGE ASSESSMENT AND DEVELOPMENT OF AN EXPLOSIVE SOURCE DATABASE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzel, E; Ramirez, A; Harben, P

    2005-07-11

    We are continuing the development of the Hydroacoustic Blockage Assessment Tool (HABAT), which is designed for use by analysts to predict which hydroacoustic monitoring stations can be used in discrimination analysis for any particular event. The research involves two approaches: (1) model-based assessment of blockage, and (2) ground-truth data-based assessment of blockage. The tool presents the analyst with a map of the world and plots raypath blockages from stations to sources. The analyst inputs source locations and blockage criteria, and the tool returns a list of blockage status from all source locations to all hydroacoustic stations. We are currently using the tool in an assessment of blockage criteria for simple direct-path arrivals. Hydroacoustic data, predominantly from earthquake sources, are read in and assessed for blockage at all available stations. Several measures are taken. First, can the event be observed at a station above background noise? Second, can we establish the backazimuth from the station to the source? Third, how large is the decibel drop at one station relative to other stations? These observational results are then compared with model estimates to identify the best set of blockage criteria and used to create a set of blockage maps for each station. The model-based estimates are currently limited by the coarse bathymetry of existing databases and by the limitations inherent in the raytrace method. In collaboration with BBN Inc., the Hydroacoustic Coverage Assessment Model (HydroCAM), which generates the blockage files that serve as input to HABAT, is being extended to include high-resolution bathymetry databases in key areas that increase model-based blockage assessment reliability. An important aspect of this capability is to eventually include reflected T-phases where they reliably occur and to identify the associated reflectors.
To assess how well any given hydroacoustic discriminant separates earthquake and in-water explosion populations, it is necessary to have both a database of reference earthquake events and one of reference in-water explosive events. Although reference earthquake events are readily available, explosive reference events are not. Consequently, building an in-water explosion reference database requires the compilation of events from many sources spanning a long period of time. We have developed a database of small implosive and explosive reference events from the 2003 Indian Ocean Cruise data. These events were recorded at some or all of the IMS Indian Ocean hydroacoustic stations: Diego Garcia, Cape Leeuwin, and Crozet Island. We have also reviewed many historical large in-water explosions and identified five that have adequate source information and can be positively associated with the hydrophone recordings. The five events are: Cannikin, Longshot, CHASE-3, CHASE-5, and IITRI-1. Of these, the first two are nuclear tests on land but near water. The latter three are in-water conventional explosive events with yields from ten to hundreds of tons TNT equivalent. The objective of this research is to enhance discrimination capabilities for events located in the world's oceans. Two research and development efforts are needed to achieve this: (1) improvement in discrimination algorithms and their joint statistical application to events, and (2) development of an automated and accurate blockage prediction capability that will identify all stations and phases (direct and reflected) from a given event that will have adequate signal to be used in a discrimination analysis. The strategy for improving blockage prediction in the world's oceans is to improve model-based prediction of blockage and to develop a ground-truth database of reference events to assess blockage. Currently, research is focused on the development of a blockage assessment software tool.
The tool is envisioned to develop into a sophisticated and unifying package that optimally and automatically assesses both model- and data-based blockage predictions in all ocean basins, for all NDC stations, accounting for reflected phases (Pulli et al., 2000). Currently, we have focused our efforts on the Diego Garcia, Cape Leeuwin, and Crozet Island hydroacoustic stations in the Indian Ocean.
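
    The three observational measures described above amount to a per-path classification rule. A toy sketch with placeholder thresholds (the actual criteria and their values are exactly what the HABAT study is trying to establish):

```python
def assess_blockage(snr_db, backazimuth_resolved, relative_db_drop,
                    snr_min=6.0, drop_max=20.0):
    """Classify a source-to-station hydroacoustic path using the three
    observational measures in the text: (1) signal above background
    noise, (2) a resolvable backazimuth, and (3) the decibel drop at
    this station relative to other stations. Thresholds here are
    placeholders, not values from the HABAT study."""
    if snr_db < snr_min:
        return "blocked"          # not observable above noise
    if not backazimuth_resolved:
        return "blocked"          # no usable direction estimate
    if relative_db_drop > drop_max:
        return "blocked"          # anomalously weak vs. other stations
    return "open"

print(assess_blockage(12.0, True, 5.0))   # open path
print(assess_blockage(3.0, True, 5.0))    # below noise -> blocked
```

    In the tool itself, such per-path calls would be compared against raytrace predictions over many earthquake events to tune the criteria station by station.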

  1. A Social-Ecological Framework of Theory, Assessment, and Prevention of Suicide

    PubMed Central

    Cramer, Robert J.; Kapusta, Nestor D.

    2017-01-01

    The juxtaposition of increasing suicide rates with continued calls for suicide prevention efforts begs for new approaches. Grounded in the Centers for Disease Control and Prevention (CDC) framework for tackling health issues, this personal views work integrates relevant suicide risk/protective factor, assessment, and intervention/prevention literatures. Based on these components of suicide risk, we articulate a Social-Ecological Suicide Prevention Model (SESPM) which provides an integration of general and population-specific risk and protective factors. We also use this multi-level perspective to provide a structured approach to understanding current theories and intervention/prevention efforts concerning suicide. Following similar multi-level prevention efforts in interpersonal violence and Human Immunodeficiency Virus (HIV) domains, we offer recommendations for social-ecologically informed suicide prevention theory, training, research, assessment, and intervention programming. Although the SESPM calls for further empirical testing, it provides a suitable backdrop for tailoring of current prevention and intervention programs to population-specific needs. Moreover, the multi-level model shows promise to move suicide risk assessment forward (e.g., development of multi-level suicide risk algorithms or structured professional judgment instruments) to overcome current limitations in the field. Finally, we articulate a set of characteristics of social-ecologically based suicide prevention programs. These include the need to address risk and protective factors with the strongest degree of empirical support at each multi-level layer, incorporate a comprehensive program evaluation strategy, and use a variety of prevention techniques across levels of prevention. PMID:29062296

  2. Assessing the evolution of primary healthcare organizations and their performance (2005-2010) in two regions of Québec province: Montréal and Montérégie

    PubMed Central

    2010-01-01

    Background The Canadian healthcare system is currently experiencing important organizational transformations through the reform of primary healthcare (PHC). These reforms vary in scope but share a common feature of proposing the transformation of PHC organizations by implementing new models of PHC organization. These models vary in their performance with respect to client affiliation, utilization of services, experience of care and perceived outcomes of care. Objectives In early 2005 we conducted a study in the two most populous regions of Quebec province (Montreal and Montérégie) which assessed the association between prevailing models of primary healthcare (PHC) and population-level experience of care. The goal of the present research project is to track the evolution of PHC organizational models and their relative performance through the reform process (from 2005 until 2010) and to assess factors at the organizational and contextual levels that are associated with the transformation of PHC organizations and their performance. Methods/Design This study will consist of three interrelated surveys, hierarchically nested. The first survey is a population-based survey of randomly-selected adults from two populous regions in the province of Quebec. This survey will assess the current affiliation of people with PHC organizations, their level of utilization of healthcare services, attributes of their experience of care, reception of preventive and curative services and perception of unmet needs for care. The second survey is an organizational survey of PHC organizations assessing aspects related to their vision, organizational structure, level of resources, and clinical practice characteristics. This information will serve to develop a taxonomy of organizations using a mixed methods approach of factorial analysis and principal component analysis. The third survey is an assessment of the organizational context in which PHC organizations are evolving. 
The five year prospective period will serve as a natural experiment to assess contextual and organizational factors (in 2005) associated with migration of PHC organizational models into new forms or models (in 2010) and assess the impact of this evolution on the performance of PHC. Discussion The results of this study will shed light on changes brought about in the organization of PHC and on factors associated with these changes. PMID:21122145

  3. Assessing the evolution of primary healthcare organizations and their performance (2005-2010) in two regions of Québec province: Montréal and Montérégie.

    PubMed

    Levesque, Jean-Frédéric; Pineault, Raynald; Provost, Sylvie; Tousignant, Pierre; Couture, Audrey; Da Silva, Roxane Borgès; Breton, Mylaine

    2010-12-01

    The Canadian healthcare system is currently experiencing important organizational transformations through the reform of primary healthcare (PHC). These reforms vary in scope but share a common feature of proposing the transformation of PHC organizations by implementing new models of PHC organization. These models vary in their performance with respect to client affiliation, utilization of services, experience of care and perceived outcomes of care. In early 2005 we conducted a study in the two most populous regions of Quebec province (Montreal and Montérégie) which assessed the association between prevailing models of primary healthcare (PHC) and population-level experience of care. The goal of the present research project is to track the evolution of PHC organizational models and their relative performance through the reform process (from 2005 until 2010) and to assess factors at the organizational and contextual levels that are associated with the transformation of PHC organizations and their performance. This study will consist of three interrelated surveys, hierarchically nested. The first survey is a population-based survey of randomly-selected adults from two populous regions in the province of Quebec. This survey will assess the current affiliation of people with PHC organizations, their level of utilization of healthcare services, attributes of their experience of care, reception of preventive and curative services and perception of unmet needs for care. The second survey is an organizational survey of PHC organizations assessing aspects related to their vision, organizational structure, level of resources, and clinical practice characteristics. This information will serve to develop a taxonomy of organizations using a mixed methods approach of factorial analysis and principal component analysis. The third survey is an assessment of the organizational context in which PHC organizations are evolving. 
The five year prospective period will serve as a natural experiment to assess contextual and organizational factors (in 2005) associated with migration of PHC organizational models into new forms or models (in 2010) and assess the impact of this evolution on the performance of PHC. The results of this study will shed light on changes brought about in the organization of PHC and on factors associated with these changes.

  4. A framework to analyze emissions implications of manufacturing shifts in the industrial sector through integrating bottom-up energy models and economic input-output environmental life cycle assessment models

    EPA Science Inventory

    Future year emissions depend highly on the evolution of the economy, technology and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal change while considering existing regulations and future unc...

  5. A Structural Equation Model of the Writing Process in Typically-Developing Sixth Grade Children

    ERIC Educational Resources Information Center

    Koutsoftas, Anthony D.; Gray, Shelley

    2013-01-01

    The purpose of this study was to evaluate how sixth grade children planned, translated, and revised written narrative stories using a task reflecting current instructional and assessment practices. A modified version of the Hayes and Flower (1980) writing process model was used as the theoretical framework for the study. Two hundred one…

  6. Support Networks of Single Puerto Rican Mothers of Children with Disabilities

    ERIC Educational Resources Information Center

    Correa, Vivian I.; Bonilla, Zobeida E.; Reyes-MacPherson, Maria E.

    2011-01-01

    The social support networks of 25 Puerto Rican single mothers of young children with disabilities were examined and compared with current models of family support for children with disabilities. This study was designed to assess the support systems of Latino single mothers in light of dominant models of family support. The Family Support Scale,…

  7. A Simplified Technique for Scoring DSM-IV Personality Disorders with the Five-Factor Model

    ERIC Educational Resources Information Center

    Miller, Joshua D.; Bagby, R. Michael; Pilkonis, Paul A.; Reynolds, Sarah K.; Lynam, Donald R.

    2005-01-01

    The current study compares the use of two alternative methodologies for using the Five-Factor Model (FFM) to assess personality disorders (PDs). Across two clinical samples, a technique using the simple sum of selected FFM facets is compared with a previously used prototype matching technique. The results demonstrate that the more easily…

  8. Five-Factor Model Prototypes for Personality Disorders: The Utility of Self-Reports and Observer Ratings

    ERIC Educational Resources Information Center

    Miller, Joshua D.; Pilkonis, Paul A.; Morse, Jennifer Q.

    2004-01-01

    The current study examined the prototype-matching technique for using the five-factor model (FFM) of personality to assess personality disorders (PDs) and their correlates. The sample was composed of 69 psychiatric patients, most of whom suffered from affective or anxiety disorders. The participants were predominantly outpatients (78%), Caucasian…

  9. The development and implementation of a decision-making capacity assessment model.

    PubMed

    Parmar, Jasneet; Brémault-Phillips, Suzette; Charles, Lesley

    2015-03-01

    Decision-making capacity assessment (DMCA) is an issue of increasing importance for older adults. Current challenges need to be explored, and potential processes and strategies considered in order to address issues of DMCA in a more coordinated manner. An iterative process was used to address issues related to DMCA. This began with recognition of challenges associated with capacity assessments (CAs) by staff at Covenant Health (CH). Review of the literature, as well as discussions with and a survey of staff at three CH sites, resulted in determination of issues related to DMCA. Development of a DMCA Model and demonstration of its feasibility followed. A process was proposed with front-end screening/problem-solving, a well-defined standard assessment, and definition of team member roles. A Capacity Assessment Care Map was formulated based on the process. Documentation was developed consisting of a Capacity Assessment Process Worksheet, a Capacity Interview Worksheet, and a brochure. Interactive workshops were delivered to familiarize staff with the DMCA Model. A successful demonstration project led to implementation across all sites in the Capital Health region, and eventual provincial endorsement. Concerns identified in the survey and in the literature regarding CA were addressed through the holistic interdisciplinary approach offered by the DMCA Model.

  10. Virtual reality in ophthalmology training.

    PubMed

    Khalifa, Yousuf M; Bogorad, David; Gibson, Vincent; Peifer, John; Nussbaum, Julian

    2006-01-01

    Current training models are limited by an unstructured curriculum, financial costs, human costs, and time constraints. With the newly mandated resident surgical competency, training programs are struggling to find viable methods of assessing and documenting the surgical skills of trainees. Virtual-reality technologies have been used for decades in flight simulation to train and assess competency, and there has been a recent push in surgical specialties to incorporate virtual-reality simulation into residency programs. These efforts have culminated in an FDA-approved carotid stenting simulator. What role virtual reality will play in the evolution of ophthalmology surgical curriculum is uncertain. The current apprentice system has served the art of surgery for over 100 years, and we foresee virtual reality working synergistically with our current curriculum modalities to streamline and enhance the resident's learning experience.

  11. Parameter extraction and transistor models

    NASA Technical Reports Server (NTRS)

    Rykken, Charles; Meiser, Verena; Turner, Greg; Wang, QI

    1985-01-01

    Using specified mathematical models of the MOSFET device, the optimal values of the model-dependent parameters were extracted from data provided by the Jet Propulsion Laboratory (JPL). Three MOSFET models, all one-dimensional, were used. One of the models took into account diffusion (as well as convection) currents. The sensitivity of the models was assessed for variations of the parameters from their optimal values. Lines of future inquiry are suggested on the basis of the behavior of the devices, of the limitations of the proposed models, and of the complexity of the required numerical investigations.
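
    As an illustration of what extracting optimal parameter values means in practice, the sketch below fits the two parameters of the textbook saturation square-law MOSFET model, Id = k(Vgs − Vt)², by linearizing against sqrt(Id) and applying ordinary least squares. This is a deliberately simple stand-in; the one-dimensional JPL models (including the diffusion-current variant) are more elaborate:

```python
from math import sqrt

def extract_square_law(vgs, ids):
    """Extract (k, Vt) of the saturation square-law model
    Id = k * (Vgs - Vt)^2 by fitting a line to sqrt(Id) vs Vgs:
    the slope is sqrt(k) and the x-intercept is Vt."""
    y = [sqrt(i) for i in ids]
    n = len(vgs)
    sx, sy = sum(vgs), sum(y)
    sxx = sum(v * v for v in vgs)
    sxy = sum(v * yi for v, yi in zip(vgs, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope ** 2, -intercept / slope  # (k, Vt)

# Synthetic I-V data generated with k = 2e-4 A/V^2, Vt = 0.7 V
k_true, vt_true = 2e-4, 0.7
vgs = [1.0, 1.5, 2.0, 2.5, 3.0]
ids = [k_true * (v - vt_true) ** 2 for v in vgs]
k, vt = extract_square_law(vgs, ids)
print(k, vt)  # recovers the generating parameters
```

    On noisy measured data the same fit would return best-fit values rather than exact ones, and the sensitivity study described above asks how model predictions degrade as parameters move away from those optima.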

  12. Using the Many-Faceted Rasch Model to Evaluate Standard Setting Judgments: An Illustration with the Advanced Placement Environmental Science Exam

    ERIC Educational Resources Information Center

    Kaliski, Pamela K.; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna L.; Plake, Barbara S.; Reshetar, Rosemary A.

    2013-01-01

    The many-faceted Rasch (MFR) model has been used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR model for examining the quality of ratings obtained from a standard…

  13. Parameterization of the 3-PG model for Pinus elliottii stands using alternative methods to estimate fertility rating, biomass partitioning and canopy closure

    Treesearch

    Carlos A. Gonzalez-Benecke; Eric J. Jokela; Wendell P. Cropper; Rosvel Bracho; Daniel J. Leduc

    2014-01-01

    The forest simulation model, 3-PG, has been widely applied as a useful tool for predicting growth of forest species in many countries. The model has the capability to estimate the effects of management, climate and site characteristics on many stand attributes using easily available data. Currently, there is an increasing interest in estimating biomass and assessing...

  14. Assessment and improvement of biotransfer models to cow's milk and beef used in exposure assessment tools for organic pollutants.

    PubMed

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2015-11-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models for microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). Goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN, and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models widely used in regulatory assessment. New regressions based on the simulated rates of the two metabolic processes are also proposed as an alternative to KOW-regression models for screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants.
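
    For context, KOW-regression screening models of the kind mentioned (EUSES, CalTOX) estimate a biotransfer factor directly from hydrophobicity, in the spirit of the Travis-and-Arms regressions. A minimal sketch; the coefficient −8.1 is the commonly quoted milk regression, used here purely for illustration, and it is precisely this metabolism-blind form that the study's modified models aim to improve on:

```python
def btf_milk(log_kow):
    """Screening-level biotransfer factor to cow's milk (day/kg) from
    a log Kow regression: log BTF = log Kow - 8.1. Illustrative only;
    such regressions ignore cattle metabolism, the gap this study
    targets with QSAR-estimated metabolic rates."""
    return 10.0 ** (log_kow - 8.1)

def milk_concentration(daily_intake_mg, log_kow):
    """Steady-state pollutant concentration in milk (mg/kg):
    concentration = BTF * daily dietary intake of the cow."""
    return btf_milk(log_kow) * daily_intake_mg

# A log Kow = 6 pollutant ingested by the cow at 1 mg/day
print(milk_concentration(1.0, 6.0))
```

    Because the regression depends on Kow alone, two chemicals with equal hydrophobicity but very different metabolic half-lives get identical predictions, which is the systematic error the metabolism-aware regressions correct.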

  15. An assessment of the near-surface accuracy of the international geomagnetic reference field 1980 model of the main geomagnetic field

    USGS Publications Warehouse

    Peddie, N.W.; Zunde, A.K.

    1985-01-01

    The new International Geomagnetic Reference Field (IGRF) model of the main geomagnetic field for 1980 is based heavily on measurements from the MAGSAT satellite survey. Assessment of the accuracy of the new model, as a description of the main field near the Earth's surface, is important because the accuracy of models derived from satellite data can be adversely affected by the magnetic field of electric currents in the ionosphere and the auroral zones. Until now, statements about its accuracy have been based on the 6 published assessments of the 2 proposed models from which it was derived. However, those assessments were either regional in scope or were based mainly on preliminary or extrapolated data. Here we assess the near-surface accuracy of the new model by comparing it with values for 1980 derived from annual means from 69 magnetic observatories, and by comparing it with WC80, a model derived from near-surface data. The comparison with observatory-derived data shows that the new model describes the field at the 69 observatories about as accurately as would a model derived solely from near-surface data. The comparison with WC80 shows that the 2 models agree closely in their description of D and I near the surface. These comparisons support the proposition that the new IGRF 1980 main-field model is a generally accurate description of the main field near the Earth's surface in 1980. © 1985.
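
    Comparisons like these typically reduce to a summary statistic over the residuals between model predictions and observatory annual means. A sketch of the arithmetic (the field values in nT are invented for illustration, not MAGSAT or observatory data):

```python
from math import sqrt

def rms_residual(model_vals, observed_vals):
    """Root-mean-square residual between model-predicted and
    observatory-derived field values, the kind of summary statistic
    used to compare a main-field model against annual means from a
    network of observatories."""
    residuals = [m - o for m, o in zip(model_vals, observed_vals)]
    return sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical total-intensity values (nT) at three observatories
model = [48010.0, 48120.0, 47995.0]
observed = [48000.0, 48100.0, 48000.0]
print(rms_residual(model, observed))
```

    In practice each field component (or D and I angles) is compared separately, and the residuals also absorb crustal bias at each observatory, which is why observatory-derived comparisons need careful interpretation.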

  16. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glantz, M.H.; Moore, C.M.; Streets, D.G.

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled.

  17. Towards Adaptive Educational Assessments: Predicting Student Performance using Temporal Stability and Data Analytics in Learning Management Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam; Olama, Mohammed M; McNair, Wade

    Data-driven assessments and adaptive feedback are becoming a cornerstone of research in educational data analytics, which involves developing methods for exploring the unique types of data that come from the educational context. For example, predicting college student performance is crucial for both students and educational institutions. It can support timely intervention to prevent students from failing a course, increase the efficacy of advising functions, and improve course completion rates. In this paper, we present our efforts in using data analytics to enable educationists to design novel data-driven assessment and feedback mechanisms. In order to achieve this objective, we investigate the temporal stability of students' grades and perform predictive analytics on academic data collected from 2009 through 2013 in one of the most commonly used learning management systems, Moodle. First, we identified the data features useful for assessment and for predicting student outcomes, such as students' scores on homework assignments, quizzes, and exams, in addition to their activities in discussion forums and their total Grade Point Average (GPA) in the term they enrolled in the course. Second, time series models in both the frequency and time domains are applied to characterize the progression as well as overall projections of the grades. In particular, the models analyzed the stability as well as the fluctuation of grades among students across the collegiate years (from freshman to senior) and across disciplines. Third, Logistic Regression and Neural Network predictive models are used to identify, as early as possible, students who are in danger of failing the course in which they are currently enrolled. These models compute the likelihood of any given student failing (or passing) the current course.
The time series analysis indicates that assessments and continuous feedback are more critical for freshmen and sophomores (even in easy courses) than for seniors, and those assessments may be provided using the predictive models. Numerical results are presented to evaluate and compare the performance of the developed models and their predictive accuracy. Our results show that outcomes are strongly tied to the first few weeks of coursework, with implications for the design and distribution of individual modules.
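
    The failure-prediction step can be sketched as a tiny logistic-regression model trained by stochastic gradient descent. The two features (scaled quiz average and term GPA) and the data below are synthetic stand-ins for the Moodle-derived features in the study:

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def train_logistic(features, labels, lr=0.5, epochs=2000):
    """Fit logistic-regression weights by stochastic gradient descent
    on the log-loss. Features and labels here are synthetic; the study
    used Moodle assignment, quiz, exam, and forum-activity features."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def prob_fail(w, b, x):
    """Predicted probability that a student fails the current course."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Label 1 = failed; columns: (quiz average, term GPA), scaled to [0, 1]
X = [(0.9, 0.95), (0.8, 0.9), (0.7, 0.8), (0.3, 0.4), (0.2, 0.5), (0.1, 0.3)]
y = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(X, y)
print(prob_fail(w, b, (0.15, 0.35)))  # high predicted failure risk
print(prob_fail(w, b, (0.85, 0.90)))  # low predicted failure risk
```

    Such probabilities, recomputed week by week as new grades arrive, are what makes early intervention possible; the early-term signal noted above suggests the first few weeks of features already carry much of the predictive power.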

  18. Mechanistic modeling of insecticide risks to breeding birds in ...

    EPA Pesticide Factsheets

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At present, USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides assessed in this study are all used to treat major pests of corn (corn rootworm, corn borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (
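
The seasonal-productivity idea behind MCnest can be illustrated with a toy simulation in which a nest must survive a fixed number of days to fledge, and reduced daily survival stands in for insecticide exposure. The survival probabilities, season length, and re-nesting rule below are hypothetical assumptions, not the TIM/MCnest parameterization:

```python
import random

def simulate_season(daily_survival, season_days=90, nest_days=25, seed=1):
    """Count successful nesting attempts in one season for one female.
    A nest must survive nest_days consecutive days to fledge; after a
    success or failure the female re-nests if enough days remain.
    Illustrative structure only -- not the MCnest parameterization."""
    rng = random.Random(seed)
    day, successes = 0, 0
    while day + nest_days <= season_days:
        if all(rng.random() < daily_survival for _ in range(nest_days)):
            successes += 1
        day += nest_days
    return successes

def mean_successes(daily_survival, trials=200):
    """Average seasonal productivity over many simulated females."""
    return sum(simulate_season(daily_survival, seed=s)
               for s in range(trials)) / trials

# Hypothetical contrast: unexposed vs. insecticide-reduced daily survival
print(mean_successes(0.98), mean_successes(0.90))
```

Even this crude Markov-chain structure reproduces the qualitative point of the integrated model: a modest drop in daily survival compounds over the nesting cycle into a large drop in seasonal productivity.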

  19. Assessing variable rate nitrogen fertilizer strategies within an extensively instrumented field site using the MicroBasin model

    NASA Astrophysics Data System (ADS)

    Ward, N. K.; Maureira, F.; Yourek, M. A.; Brooks, E. S.; Stockle, C. O.

    2014-12-01

    The current use of synthetic nitrogen fertilizers in agriculture has many negative environmental and economic costs, necessitating improved nitrogen management. In the highly heterogeneous landscape of the Palouse region in eastern Washington and northern Idaho, crop nitrogen needs vary widely within a field. Site-specific nitrogen management is a promising strategy to reduce excess nitrogen lost to the environment while maintaining current yields by matching crop needs with inputs. This study used in-situ hydrologic, nutrient, and crop yield data from a heavily instrumented field site in the high precipitation zone of the wheat-producing Palouse region to assess the performance of the MicroBasin model. MicroBasin is a high-resolution watershed-scale ecohydrologic model with nutrient cycling and cropping algorithms based on the CropSyst model. Detailed soil mapping conducted at the site was used to parameterize the model and the model outputs were evaluated with observed measurements. The calibrated MicroBasin model was then used to evaluate the impact of various nitrogen management strategies on crop yield and nitrate losses. The strategies include uniform application as well as delineating the field into multiple zones of varying nitrogen fertilizer rates to optimize nitrogen use efficiency. We present how coupled modeling and in-situ data sets can inform agricultural management and policy to encourage improved nitrogen management.

  20. A Particle and Energy Balance Model of the Orificed Hollow Cathode

    NASA Technical Reports Server (NTRS)

    Domonkos, Matthew T.

    2002-01-01

    A particle and energy balance model of orificed hollow cathodes was developed to assist in cathode design. The model presented here is an ensemble of original work by the author and previous work by others. The processes in the orifice region are considered to be one of the primary drivers in determining cathode performance, since the current density was greatest in this volume (up to 1.6 x 10(exp 8) A/m2). The orifice model contains comparatively few free parameters, and its results are used to bound the free parameters for the insert model. Next, the insert region model is presented. The sensitivity of the results to the free parameters is assessed, and variation of the free parameters in the orifice dominates the calculated power consumption and plasma properties. The model predictions are compared to data from a low-current orificed hollow cathode. The predicted power consumption exceeds the experimental results. Estimates of the plasma properties in the insert region overlap Langmuir probe data, and the predicted orifice plasma suggests the presence of one or more double layers. Finally, the model is used to examine the operation of higher current cathodes.
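
The orifice current density quoted above follows from dividing the discharge current by the orifice cross-section, J = I / (π r²). The operating point below (5 A through a 0.1 mm-radius orifice) is an assumed illustration that lands near the cited 1.6 x 10(exp 8) A/m2:

```python
import math

def orifice_current_density(discharge_current_A, orifice_radius_m):
    """Mean current density through a circular orifice, J = I / (pi r^2)."""
    return discharge_current_A / (math.pi * orifice_radius_m ** 2)

# Hypothetical operating point: 5 A through a 0.1 mm-radius orifice
J = orifice_current_density(5.0, 1.0e-4)
print(f"J = {J:.2e} A/m^2")
```

This back-of-envelope estimate shows why the orifice region dominates: shrinking the radius by a factor of two quadruples the current density and, with it, the plasma heating in that volume.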

  1. Kinetic Monte Carlo simulations of nucleation and growth in electrodeposition.

    PubMed

    Guo, Lian; Radisic, Aleksandar; Searson, Peter C

    2005-12-22

    Nucleation and growth during bulk electrodeposition is studied using kinetic Monte Carlo (KMC) simulations. Ion transport in solution is modeled using Brownian dynamics, and the kinetics of nucleation and growth are dependent on the probabilities of metal-on-substrate and metal-on-metal deposition. Using this approach, we make no assumptions about the nucleation rate, island density, or island distribution. The influence of the attachment probabilities and concentration on the time-dependent island density and current transients is reported. Various models have been assessed by recovering the nucleation rate and island density from the current-time transients.
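
A stripped-down version of this kind of simulation can be written in a few lines: sites nucleate on the bare substrate with a low probability and attach next to existing metal with a higher one, so the island density emerges rather than being assumed. The 1-D lattice and the probabilities below are illustrative only, far simpler than the Brownian-dynamics transport model used in the paper:

```python
import random

def kmc_deposition(n_sites=200, steps=1500, p_on_substrate=0.01,
                   p_on_metal=0.5, seed=7):
    """Toy 1-D kinetic Monte Carlo on a ring: sites nucleate on bare
    substrate with p_on_substrate and attach at island edges with
    p_on_metal; island density emerges from the two probabilities."""
    rng = random.Random(seed)
    occupied = [False] * n_sites
    for _ in range(steps):
        i = rng.randrange(n_sites)
        if occupied[i]:
            continue
        near_metal = occupied[i - 1] or occupied[(i + 1) % n_sites]
        p = p_on_metal if near_metal else p_on_substrate
        if rng.random() < p:
            occupied[i] = True
    coverage = sum(occupied) / n_sites
    islands = sum(1 for i in range(n_sites)
                  if occupied[i] and not occupied[i - 1])  # runs on the ring
    return coverage, islands

coverage, islands = kmc_deposition()
print(coverage, islands)
```

Recording `coverage` and `islands` at each step, rather than only at the end, would give the time-dependent island density and a proxy for the current transient that the paper compares against nucleation models.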

  2. Impacts of vanpooling in Pennsylvania and future opportunities.

    DOT National Transportation Integrated Search

    2010-12-30

    This study conducted a state survey to assess the feasibility of expanded vanpool operations in Pennsylvania and financing models available. An overview of current commuting patterns and vanpool operations in Pennsylvania is presented and an empl...

  3. Backup of Renewable Energy for an Electrical Island: Case Study of Israeli Electricity System—Current Status

    PubMed Central

    Fakhouri, A.; Kuperman, A.

    2014-01-01

    The paper focuses on the quantitative analysis of the Israeli Government's target of 10% renewable energy penetration by 2020 and determines the desired methodology (models) for assessing the effects on the electricity market, addressing the fact that Israel is an electricity island. The main objective is to determine the influence of achieving the Government's goals for renewable energy penetration on the need for backup in the Israeli electricity system. This work presents the current situation of the Israeli electricity market and the study undertaken to assess the undesirable effects resulting from the intermittency of electricity generated by wind and solar power stations, and it presents some solutions for mitigating these phenomena. Future work will focus on a quantitative analysis of model runs and will determine the amount of backup required relative to the amount of installed capacity from renewable resources. PMID:24624044

  5. Human resource aspects of antiretroviral treatment delivery models: current practices and recommendations.

    PubMed

    Assefa, Yibeltal; Van Damme, Wim; Hermann, Katharina

    2010-01-01

    PURPOSE OF REVIEW: To illustrate and critically assess what is currently being published on the human resources for health dimension of antiretroviral therapy (ART) delivery models. The use of human resources for health can have an effect on two crucial aspects of successful ART programmes, namely the scale-up capacity and the long-term retention in care. Task shifting as the delegation of tasks from higher qualified to lower qualified cadres has become a widespread practice in ART delivery models in low-income countries in recent years. It is increasingly shown to effectively reduce the workload for scarce medical doctors without compromising the quality of care. At the same time, it becomes clear that task shifting can only be successful when accompanied by intensive training, supervision and support from existing health system structures. Although a number of recent publications have focussed on task shifting in ART delivery models, there is a lack of accessible information on the link between task shifting and patient outcomes. Current ART delivery models do not focus sufficiently on retention in care as arguably one of the most important issues for the long-term success of ART programmes. There is a need for context-specific re-designing of current ART delivery models in order to increase access to ART and improve long-term retention.

  6. Evaluation of Two Crew Module Boilerplate Tests Using Newly Developed Calibration Metrics

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2012-01-01

    The paper discusses an application of multi-dimensional calibration metrics to evaluate pressure data from water drop tests of the Max Launch Abort System (MLAS) crew module boilerplate. Specifically, three metrics are discussed: 1) a metric to assess the probability of enveloping the measured data with the model, 2) a multi-dimensional orthogonality metric to assess model adequacy between test and analysis, and 3) a prediction error metric to guide sensor placement so as to minimize pressure prediction errors. Data from similar (nearly repeated) capsule drop tests show significant variability in the measured pressure responses. When compared to the expected variability from model predictions, it is demonstrated that the measured variability cannot be explained by the model under the current uncertainty assumptions.
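
The first metric, the probability of enveloping the measured data, can be approximated in its simplest form as the fraction of measurement channels that fall inside the spread of an ensemble of model predictions. The sketch below uses a plain min/max envelope and made-up pressure values; the paper's multi-dimensional formulation is more sophisticated:

```python
def envelope_probability(model_samples, measured):
    """Fraction of measured values falling inside the min/max envelope of
    an ensemble of model predictions (one sample list per channel).
    A simple stand-in for the multi-dimensional metric described above."""
    inside = 0
    for samples, m in zip(model_samples, measured):
        if min(samples) <= m <= max(samples):
            inside += 1
    return inside / len(measured)

# Hypothetical ensemble predictions vs. test data for three pressure sensors
model_samples = [[0.9, 1.1, 1.0], [2.0, 2.4, 2.2], [3.1, 3.0, 3.3]]
measured = [1.05, 2.9, 3.2]
print(envelope_probability(model_samples, measured))
```

A low envelope probability, as found for the MLAS drop tests, signals that the measured variability exceeds what the model's uncertainty assumptions can explain.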

  7. Logistic regression modeling to assess groundwater vulnerability to contamination in Hawaii, USA

    NASA Astrophysics Data System (ADS)

    Mair, Alan; El-Kadi, Aly I.

    2013-10-01

    Capture zone analysis combined with a subjective susceptibility index is currently used in Hawaii to assess vulnerability to contamination of drinking water sources derived from groundwater. In this study, we developed an alternative objective approach that combines well capture zones with multiple-variable logistic regression (LR) modeling and applied it to the highly-utilized Pearl Harbor and Honolulu aquifers on the island of Oahu, Hawaii. Input for the LR models utilized explanatory variables based on hydrogeology, land use, and well geometry/location. A suite of 11 target contaminants detected in the region, including elevated nitrate (> 1 mg/L), four chlorinated solvents, four agricultural fumigants, and two pesticides, was used to develop the models. We then tested the ability of the new approach to accurately separate groups of wells with low and high vulnerability, and the suitability of nitrate as an indicator of other types of contamination. Our results produced contaminant-specific LR models that accurately identified groups of wells with the lowest/highest reported detections and the lowest/highest nitrate concentrations. Current and former agricultural land uses were identified as significant explanatory variables for eight of the 11 target contaminants, while elevated nitrate was a significant variable for five contaminants. The utility of the combined approach is contingent on the availability of hydrologic and chemical monitoring data for calibrating groundwater and LR models. Application of the approach using a reference site with sufficient data could help identify key variables in areas with similar hydrogeology and land use but limited data. In addition, elevated nitrate may also be a suitable indicator of groundwater contamination in areas with limited data. 
The objective LR modeling approach developed in this study is flexible enough to address a wide range of contaminants and represents a suitable addition to the current subjective approach.
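
Once a multiple-variable LR model is calibrated, ranking wells reduces to evaluating the fitted logit for each well's predictors. The coefficients and predictor values below are hypothetical stand-ins, not the fitted Oahu models:

```python
import math

def detection_probability(coeffs, intercept, predictors):
    """Multiple-variable logistic regression score:
    P(detection) = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for [fraction agricultural land use,
# depth to water (100s of m), distance to former plantation (km)]
coeffs, intercept = [3.0, -1.2, -0.4], -1.0

low_vuln = detection_probability(coeffs, intercept, [0.05, 2.5, 6.0])
high_vuln = detection_probability(coeffs, intercept, [0.80, 0.5, 1.0])
print(f"{low_vuln:.2f} vs {high_vuln:.2f}")
```

Sorting wells by this probability reproduces the study's objective separation into low- and high-vulnerability groups, in contrast to the subjective susceptibility index currently used.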

  8. Assessment of the terrestrial water balance using the global water availability and use model WaterGAP - status and challenges

    NASA Astrophysics Data System (ADS)

    Müller Schmied, Hannes; Döll, Petra

    2017-04-01

    The estimation of the world's water resources has a long tradition, and numerous methods for quantification exist. The resulting numbers vary significantly, leaving room for improvement. For some decades, global hydrological models (GHMs) have been used for large-scale water budget assessments. GHMs are designed to represent macro-scale hydrological processes, and many of these models include human water management, e.g. irrigation or reservoir operation, making them currently the first choice for global-scale assessments of the terrestrial water balance within the Anthropocene. The Water - Global Assessment and Prognosis (WaterGAP) model framework comprises both the natural and the human water dimension and has been in development and application since the 1990s. In recent years, efforts were made to assess the sensitivity of water balance components to alternative climate forcing input data and, e.g., how this sensitivity is affected by WaterGAP's calibration scheme. This presentation shows the current best estimate of terrestrial water balance components as simulated with WaterGAP by 1) assessing global and continental water balance components for the climate period 1971-2000 and the IPCC reference period 1986-2005 for the most current WaterGAP version using homogenized climate forcing data, 2) investigating variations of water balance components for a number of state-of-the-art climate forcing data sets, and 3) discussing the benefit of the calibration approach for a global water budget better constrained by observations. For the most current WaterGAP version 2.2b and a homogenized combination of the two WATCH Forcing Datasets, global-scale (excluding Antarctica and Greenland) river discharge into oceans and inland sinks (Q) is assessed to be 40 000 km3 yr-1 for 1971-2000 and 39 200 km3 yr-1 for 1986-2005. Actual evapotranspiration (AET) is similar for the two periods at around 70 600 (70 700) km3 yr-1, as is water consumption at 1000 (1100) km3 yr-1.
The main reason for the differing Q is varying precipitation (P, 111 600 km3 yr-1 vs. 110 900 km3 yr-1). The sensitivity of water balance components to alternative climate forcing data is high. Applying 5 state-of-the-art climate forcing data sets, long-term average P differs globally by 8000 km3 yr-1, mainly due to different handling of precipitation undercatch correction (or neglecting it). AET differs by 5500 km3 yr-1, whereas Q varies by 3000 km3 yr-1. The sensitivity of human water consumption to alternative climate input data is only about 5%. WaterGAP's calibration approach forces simulated long-term river discharge to be approximately equal to observed values at 1319 gauging stations during the time period selected for calibration. This scheme greatly reduces the impact of uncertain climate input on simulated Q in these upstream drainage basins (as well as downstream). In calibration areas, the Q variation among the climate input data sets is much lower (1.6%) than in non-calibrated areas (18.5%). However, variation of Q at the grid-cell level is still high (an average of 37% for Q in grid cells in calibration areas vs. 74% outside). Due to the closed water balance, variation of AET is higher in calibrated areas than in non-calibrated areas. The main challenges in assessing the world's water resources with GHMs like WaterGAP are 1) the need for consistent long-term climate forcing input data sets, especially regarding suitable handling of P undercatch, 2) the accessibility of in-situ river discharge data, or alternative calibration data, for currently non-calibrated areas, and 3) improved simulation in semi-arid and arid river basins. As an outlook, a multi-model, multi-forcing study of global water balance components within the frame of the Inter-Sectoral Impact Model Intercomparison Project is proposed.
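
The component estimates quoted above can be checked against the closed long-term water balance, P = Q + AET + consumption; the reported figures close to within their rounding of 100 km3 yr-1:

```python
def water_balance_residual(precip, discharge, aet, consumption):
    """Closed terrestrial water balance: for long-term averages, P should
    equal Q + AET + consumption (all values in km^3 per year)."""
    return precip - (discharge + aet + consumption)

# WaterGAP 2.2b global estimates quoted above (km^3 / yr)
res_1971_2000 = water_balance_residual(111_600, 40_000, 70_600, 1_000)
res_1986_2005 = water_balance_residual(110_900, 39_200, 70_700, 1_100)
print(res_1971_2000, res_1986_2005)
```

The 1971-2000 figures balance exactly; the 1986-2005 residual of 100 km3 yr-1 is consistent with the components being independently rounded to the nearest 100 km3 yr-1.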

  9. Optimization of return electrodes in neurostimulating arrays

    NASA Astrophysics Data System (ADS)

    Flores, Thomas; Goetz, Georges; Lei, Xin; Palanker, Daniel

    2016-06-01

    Objective. High resolution visual prostheses require dense stimulating arrays with localized inputs of individual electrodes. We study the electric field produced by multielectrode arrays in electrolyte to determine an optimal configuration of return electrodes and activation sequence. Approach. To determine the boundary conditions for computation of the electric field in electrolyte, we assessed current dynamics using an equivalent circuit of a multielectrode array with interleaved return electrodes. The electric field modeled with two different boundary conditions derived from the equivalent circuit was then compared to measurements of electric potential in electrolyte. To assess the effect of return electrode configuration on retinal stimulation, we transformed the computed electric fields into retinal response using a model of neural network-mediated stimulation. Main results. Electric currents at the capacitive electrode-electrolyte interface redistribute over time, so that boundary conditions transition from equipotential surfaces at the beginning of the pulse to uniform current density in steady state. Experimental measurements confirmed that, in steady state, the boundary condition corresponds to a uniform current density on electrode surfaces. Arrays with local return electrodes exhibit improved field confinement and can elicit stronger network-mediated retinal response compared to those with a common remote return. Connecting local return electrodes enhances the field penetration depth and allows reducing the return electrode area. Sequential activation of the pixels in large monopolar arrays reduces electrical cross-talk and improves the contrast in pattern stimulation. Significance. Accurate modeling of multielectrode arrays helps optimize the electrode configuration to maximize the spatial resolution, contrast and dynamic range of retinal prostheses.

  10. Comparison of cannabinoids with known analgesics using a novel high throughput zebrafish larval model of nociception.

    PubMed

    Ellis, L D; Berrue, F; Morash, M; Achenbach, J C; Hill, J; McDougall, J J

    2018-01-30

    It has been established that both adult and larval zebrafish are capable of showing nociceptive responses to noxious stimuli; however, the use of larvae to test novel analgesics has not been fully explored. Zebrafish larvae represent a low-cost, high-throughput alternative to traditional mammalian models for the assessment of product efficacy during the initial stages of drug development. In the current study, a novel model of nociception using zebrafish larvae is described. During the recovery from an acute exposure to low levels of acetic acid, larvae display innate changes in behaviour that may be indicative of nociception. To assess the usefulness of this model for testing potential analgesics, three known synthetic pain medications were assessed (ibuprofen, acetaminophen and tramadol) along with three naturally occurring products (honokiol, tetrahydrocannabinol and cannabidiol). When the effect of each compound on both the acetic acid recovery and control activity was compared there appeared to be both similarities and differences between the compounds. One of the most interesting effects was found for cannabidiol which appeared to oppose the activity change during the recovery period of AA exposed larvae while having a nominal effect on control activity. This would appear to be in line with current research that has demonstrated the nociceptive properties of cannabidiol. Here we have provided a novel model that will complement existing zebrafish models and will expand on the potential use of zebrafish larvae for studying both nociception and new analgesics. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  11. Benefit of Modeling the Observation Error in a Data Assimilation Framework Using Vegetation Information Obtained From Passive Based Microwave Data

    NASA Technical Reports Server (NTRS)

    Bolten, John D.; Mladenova, Iliana E.; Crow, Wade; De Jeu, Richard

    2016-01-01

    A primary operational goal of the United States Department of Agriculture (USDA) is to improve foreign market access for U.S. agricultural products. A large fraction of this crop condition assessment is based on satellite imagery and ground data analysis. The baseline soil moisture estimates that are currently used for this analysis are based on output from the modified Palmer two-layer soil moisture model, updated to assimilate near-real time observations derived from the Soil Moisture Ocean Salinity (SMOS) satellite. The current data assimilation system is based on a 1-D Ensemble Kalman Filter approach, where the observation error is modeled as a function of vegetation density. This allows for offsetting errors in the soil moisture retrievals. The observation error is currently adjusted using Normalized Difference Vegetation Index (NDVI) climatology. In this paper we explore the possibility of utilizing microwave-based vegetation optical depth instead.
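
The core of the assimilation step can be sketched as a scalar Kalman update in which the observation error variance grows with vegetation density, so retrievals over dense canopy receive less weight. The numbers and the linear inflation function below are illustrative assumptions, not the operational USDA configuration (which uses an NDVI-climatology-based function):

```python
def obs_error_var(base_var, vegetation_index):
    """Inflate retrieval error variance under dense vegetation
    (a simple linear scaling chosen for illustration)."""
    return base_var * (1.0 + 4.0 * vegetation_index)

def kalman_update(model_sm, model_var, obs_sm, obs_var):
    """Scalar Kalman analysis step for a soil moisture state."""
    gain = model_var / (model_var + obs_var)
    return model_sm + gain * (obs_sm - model_sm), gain

model_sm, model_var = 0.25, 0.002  # background soil moisture and variance
obs_sm = 0.32                      # satellite retrieval

sparse = kalman_update(model_sm, model_var, obs_sm, obs_error_var(0.001, 0.1))
dense = kalman_update(model_sm, model_var, obs_sm, obs_error_var(0.001, 0.9))
print(sparse, dense)
```

The gain, and hence the pull toward the retrieval, is smaller under dense vegetation; substituting a microwave-based optical depth for NDVI changes only the `obs_error_var` function, which is precisely the swap the paper explores.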

  12. Towards the development of improved tests for negative symptoms of schizophrenia in a validated animal model.

    PubMed

    Sahin, Ceren; Doostdar, Nazanin; Neill, Joanna C

    2016-10-01

    Negative symptoms in schizophrenia remain an unmet clinical need. There is no licensed treatment specifically for this debilitating aspect of the disorder and effect sizes of new therapies are too small to make an impact on quality of life and function. Negative symptoms are multifactorial but often considered in terms of two domains, expressive deficit incorporating blunted affect and poverty of speech and avolition incorporating asociality and lack of drive. There is a clear need for improved understanding of the neurobiology of negative symptoms which can be enabled through the use of carefully validated animal models. While there are several tests for assessing sociability in animals, tests for blunted affect in schizophrenia are currently lacking. Two paradigms have recently been developed for assessing negative affect of relevance to depression in rats. Here we assess their utility for studying negative symptoms in schizophrenia using our well validated model for schizophrenia of sub-chronic (sc) treatment with Phencyclidine (PCP) in adult female rats. Results demonstrate that sc PCP treatment produces a significant negative affect bias in response to a high value reward in the optimistic and affective bias tests. Our results are not easily explained by the known cognitive deficits induced by sc PCP and support the hypothesis of a negative affective bias in this model. We suggest that further refinement of these two tests will provide a means to investigate the neurobiological basis of negative affect in schizophrenia, thus supporting the assessment of efficacy of new targets for this currently untreated symptom domain. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Electronic field emission models beyond the Fowler-Nordheim one

    NASA Astrophysics Data System (ADS)

    Lepetit, Bruno

    2017-12-01

    We propose several quantum mechanical models to describe electronic field emission from first principles. These models allow us to correlate quantitatively the electronic emission current with the electrode surface details at the atomic scale. They all rely on electronic potential energy surfaces obtained from three dimensional density functional theory calculations. They differ by the various quantum mechanical methods (exact or perturbative, time dependent or time independent), which are used to describe tunneling through the electronic potential energy barrier. Comparison of these models between them and with the standard Fowler-Nordheim one in the context of one dimensional tunneling allows us to assess the impact on the accuracy of the computed current of the approximations made in each model. Among these methods, the time dependent perturbative one provides a well-balanced trade-off between accuracy and computational cost.
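
For reference, the standard Fowler-Nordheim expression against which the paper's models are compared is the elementary triangular-barrier form J = (A F²/φ) exp(-B φ^{3/2}/F), with the usual constants A ≈ 1.541434e-6 A eV V⁻² and B ≈ 6.830890e9 eV^{-3/2} V m⁻¹. The work function and field strengths below are assumed values for illustration:

```python
import math

A_FN = 1.541434e-6   # A eV V^-2, first Fowler-Nordheim constant
B_FN = 6.830890e9    # eV^-3/2 V m^-1, second Fowler-Nordheim constant

def fowler_nordheim_j(field_v_per_m, work_function_ev):
    """Elementary (exact triangular barrier) Fowler-Nordheim current density."""
    F, phi = field_v_per_m, work_function_ev
    return (A_FN * F ** 2 / phi) * math.exp(-B_FN * phi ** 1.5 / F)

# Tungsten-like work function of 4.5 eV at two applied fields
j_low = fowler_nordheim_j(3.0e9, 4.5)
j_high = fowler_nordheim_j(5.0e9, 4.5)
print(f"{j_low:.2e}, {j_high:.2e} A/m^2")
```

The steep, exponential field dependence shown here is what makes the emission current so sensitive to atomic-scale surface details, and hence motivates the first-principles corrections the paper develops.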

  14. Development of an algebraic stress/two-layer model for calculating thrust chamber flow fields

    NASA Technical Reports Server (NTRS)

    Chen, C. P.; Shang, H. M.; Huang, J.

    1993-01-01

    Following the consensus of a workshop in Turbulence Modeling for Liquid Rocket Thrust Chambers, the current effort was undertaken to study the effects of second-order closure on the predictions of thermochemical flow fields. To reduce the instability and computational intensity of the full second-order Reynolds Stress Model, an Algebraic Stress Model (ASM) coupled with a two-layer near wall treatment was developed. Various test problems, including the compressible boundary layer with adiabatic and cooled walls, recirculating flows, swirling flows and the entire SSME nozzle flow were studied to assess the performance of the current model. Detailed calculations for the SSME exit wall flow around the nozzle manifold were executed. As to the overall flow predictions, the ASM removes another assumption for appropriate comparison with experimental data, to account for the non-isotropic turbulence effects.

  15. Turbulence modelling of flow fields in thrust chambers

    NASA Technical Reports Server (NTRS)

    Chen, C. P.; Kim, Y. M.; Shang, H. M.

    1993-01-01

    Following the consensus of a workshop in Turbulence Modelling for Liquid Rocket Thrust Chambers, the current effort was undertaken to study the effects of second-order closure on the predictions of thermochemical flow fields. To reduce the instability and computational intensity of the full second-order Reynolds Stress Model, an Algebraic Stress Model (ASM) coupled with a two-layer near wall treatment was developed. Various test problems, including the compressible boundary layer with adiabatic and cooled walls, recirculating flows, swirling flows, and the entire SSME nozzle flow were studied to assess the performance of the current model. Detailed calculations for the SSME exit wall flow around the nozzle manifold were executed. As to the overall flow predictions, the ASM removes another assumption for appropriate comparison with experimental data to account for the non-isotropic turbulence effects.

  16. Risk of breast cancer following exposure to tetrachloroethylene-contaminated drinking water in Cape Cod, Massachusetts: reanalysis of a case-control study using a modified exposure assessment

    PubMed Central

    2011-01-01

    Background Tetrachloroethylene (PCE) is an important occupational chemical used in metal degreasing and drycleaning and a prevalent drinking water contaminant. Exposure often occurs with other chemicals but it occurred alone in a pattern that reduced the likelihood of confounding in a unique scenario on Cape Cod, Massachusetts. We previously found a small to moderate increased risk of breast cancer among women with the highest exposures using a simple exposure model. We have taken advantage of technical improvements in publically available software to incorporate a more sophisticated determination of water flow and direction to see if previous results were robust to more accurate exposure assessment. Methods The current analysis used PCE exposure estimates generated with the addition of water distribution modeling software (EPANET 2.0) to test model assumptions, compare exposure distributions to prior methods, and re-examine the risk of breast cancer. In addition, we applied data smoothing to examine nonlinear relationships between breast cancer and exposure. We also compared a set of measured PCE concentrations in water samples collected in 1980 to modeled estimates. Results Thirty-nine percent of individuals considered unexposed in prior epidemiological analyses were considered exposed using the current method, but mostly at low exposure levels. As a result, the exposure distribution was shifted downward resulting in a lower value for the 90th percentile, the definition of "high exposure" in prior analyses. The current analyses confirmed a modest increase in the risk of breast cancer for women with high PCE exposure levels defined by either the 90th percentile (adjusted ORs 1.0-1.5 for 0-19 year latency assumptions) or smoothing analysis cut point (adjusted ORs 1.3-2.0 for 0-15 year latency assumptions). 
Current exposure estimates had a higher correlation with PCE concentrations in water samples (Spearman correlation coefficient = 0.65, p < 0.0001) than estimates generated using the prior method (0.54, p < 0.0001). Conclusions The incorporation of sophisticated flow estimates in the exposure assessment method shifted the PCE exposure distribution downward, but did not meaningfully affect the exposure ranking of subjects or the strength of the association with the risk of breast cancer found in earlier analyses. Thus, the current analyses show a slightly elevated breast cancer risk for highly exposed women, with strengthened exposure assessment and minimization of misclassification by using the latest technology. PMID:21600013

  17. Meteorological disaster management and assessment system design and implementation

    NASA Astrophysics Data System (ADS)

    Tang, Wei; Luo, Bin; Wu, Huanping

    2009-09-01

    Disaster prevention and mitigation have received increasing attention from the Chinese government alongside national economic development in recent years. Traditional disaster management exhibits problems such as chaotic data management, a low level of informatization, and poor data sharing. To improve information capabilities in disaster management, the Meteorological Disaster Management and Assessment System (MDMAS) was developed and is introduced in this paper. MDMAS uses a three-tier C/S architecture comprising the application, data, and service layers. Current functions of MDMAS include typhoon and rainstorm assessment, disaster data query and statistics, and automatic cartography for disaster management. The typhoon and rainstorm assessment models can be used both for pre-disaster and post-disaster assessment. Automatic cartography is implemented with ArcGIS Geoprocessing and ModelBuilder. In practice, MDMAS has been utilized to provide warning information, disaster assessments, and service products. MDMAS is an efficient tool for meteorological disaster management and assessment, and it can provide decision support for disaster prevention and mitigation.

  19. Conjunctive-management models for sustained yield of stream-aquifer systems

    USGS Publications Warehouse

    Barlow, P.M.; Ahlfeld, D.P.; Dickerman, D.C.

    2003-01-01

    Conjunctive-management models that couple numerical simulation with linear optimization were developed to evaluate trade-offs between groundwater withdrawals and streamflow depletions for alluvial-valley stream-aquifer systems representative of those of the northeastern United States. A conjunctive-management model developed for a hypothetical stream-aquifer system was used to assess the effect of interannual hydrologic variability on minimum monthly streamflow requirements. The conjunctive-management model was applied to the Hunt-Annaquatucket-Pettaquamscutt stream-aquifer system of central Rhode Island. Results show that it is possible to increase current withdrawals from the aquifer by as much as 50% by modifying current withdrawal schedules, modifying the number and configuration of wells in the supply-well network, or allowing increased streamflow depletion in the Annaquatucket and Pettaquamscutt rivers. Alternatively, it is possible to reduce current rates of streamflow depletion in the Hunt River by as much as 35% during the summer, but such reductions would require decreases in current groundwater withdrawals.
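    The withdrawal-versus-depletion trade-off described above can be caricatured in a few lines. This is a toy sketch with invented numbers, not the authors' simulation-optimization model: each month's pumping is capped either by well capacity or by the pumping rate whose resulting streamflow depletion hits an allowed limit.

```python
# Toy conjunctive-management trade-off (hypothetical numbers): choose monthly
# withdrawals to maximize total pumping while keeping streamflow depletion
# below a monthly limit. Real models couple a numerical groundwater
# simulation with linear programming.

def best_schedule(capacity, response, depletion_limit):
    """Pump each month up to the lesser of well capacity and the pumping
    rate whose depletion (pumping * response) exactly meets the cap."""
    schedule = []
    for cap, r, limit in zip(capacity, response, depletion_limit):
        allowed = limit / r          # pumping that exactly meets the cap
        schedule.append(min(cap, allowed))
    return schedule

capacity        = [10.0, 10.0, 10.0]   # max monthly withdrawal (units arbitrary)
response        = [0.4, 0.7, 0.9]      # depletion per unit pumping (higher in summer)
depletion_limit = [5.0, 5.0, 4.0]      # allowed streamflow depletion

s = best_schedule(capacity, response, depletion_limit)
print([round(x, 2) for x in s])   # -> [10.0, 7.14, 4.44]
```

    Because each month's constraint is independent in this toy, the per-month greedy cap is optimal; the published model's linear program handles the interannual and inter-well coupling the toy omits.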

  20. A review of the ionospheric model for the long wave prediction capability

    NASA Astrophysics Data System (ADS)

    Ferguson, J. A.

    1992-11-01

    The Naval Command, Control, and Ocean Surveillance Center's Long Wave Prediction Capability (LWPC) has a built-in ionospheric model. The latter was defined after a review of the literature comparing measurements with calculations. Subsequent to this original specification of the ionospheric model in the LWPC, a new collection of data was obtained and analyzed. The new data were collected aboard a merchant ship, the Callaghan, during a series of trans-Atlantic trips over a period of a year. This report presents a detailed analysis of the ionospheric model currently in use by the LWPC and the new model suggested by the shipboard measurements. We conclude that, although the fits to measurements are almost the same for the two models examined, the current LWPC model should be retained because it is better than the new model for nighttime conditions at long ranges. This conclusion supports the primary use of the LWPC model for coverage assessment, which requires a valid model at the limits of a transmitter's reception.

  1. Factor Structure and Measurement Invariance of the Need-Supportive Teaching Style Scale for Physical Education.

    PubMed

    Liu, Jing-Dong; Chung, Pak-Kwong

    2017-08-01

    The purpose of the current study was to examine the factor structure and measurement invariance of a scale measuring students' perceptions of need-supportive teaching (Need-Supportive Teaching Style Scale in Physical Education; NSTSSPE). We sampled 615 secondary school students in Hong Kong, 200 of whom also completed a follow-up assessment two months later. The factor structure of the scale was examined through exploratory structural equation modeling (ESEM). Further, the nomological validity of the NSTSSPE was evaluated by examining the relationships between need-supportive teaching style and student satisfaction of psychological needs. Finally, four measurement models (configural, metric invariance, scalar invariance, and item uniqueness invariance) were assessed using multiple-group ESEM to test the measurement invariance of the scale across gender, grade, and time. ESEM results suggested a three-factor structure of the NSTSSPE. Nomological validity was supported, and weak, strong, and strict measurement invariance of the NSTSSPE was evidenced across gender, grade, and time. The current study provides initial psychometric support for the NSTSSPE to assess student perceptions of teachers' need-supportive teaching style in physical education classes.

  2. Fatal and nonfatal risk associated with recycle of D&D-generated concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boren, J.K.; Ayers, K.W.; Parker, F.L.

    1997-02-01

    As decontamination and decommissioning activities proceed within the U.S. Department of Energy Complex, vast volumes of uncontaminated and contaminated concrete will be generated. The current practice of decontaminating and landfilling the concrete is an expensive and potentially wasteful practice. Research is being conducted at Vanderbilt University to assess the economic, social, legal, and political ramifications of alternate methods of dealing with waste concrete. An important aspect of this research work is the assessment of risk associated with the various alternatives. A deterministic risk assessment model has been developed which quantifies radiological as well as non-radiological risks associated with concrete disposal and recycle activities. The risk model accounts for fatal as well as non-fatal risks to both workers and the public. Preliminary results indicate that recycling of concrete presents potentially lower risks than the current practice. Radiological considerations are shown to be of minor importance in comparison to other sources of risk, with conventional transportation fatalities and injuries dominating. Onsite activities can also be a major contributor to non-fatal risk.

  3. Solutions Network Formulation Report. Using NASA Sensors to Perform Crop Type Assessment for Monitoring Insect Resistance in Corn

    NASA Technical Reports Server (NTRS)

    Lewis, David; Copenhaver, Ken; Anderson, Daniel; Hilbert, Kent

    2007-01-01

    The EPA (U.S. Environmental Protection Agency) is tasked with monitoring for insect pest resistance to transgenic crops. Several models have been developed to understand the resistance properties of insects. The Population Genetics Simulator model is used in the EPA PIRDSS (Pest Infestation and Resistance Decision Support System). The EPA Office of Pesticide Programs uses the DSS to help understand the potential for insect pest resistance development and the likelihood that insect pest resistance will negatively affect transgenic corn. Once the DSS identifies areas of concern, crews are deployed to collect insect pest samples, which are tested to identify whether they have developed resistance to the toxins in transgenic corn pesticides. In this candidate solution, VIIRS (Visible/Infrared Imager/Radiometer Suite) vegetation index products will be used to build hypertemporal layerstacks for crop type and phenology assessment. The current phenology attribute is determined by using the current time of year to index the expected growth stage of the crop. VIIRS might provide more accurate crop type assessment and might also give a better estimate of the crop growth stage.
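    As a rough illustration of the layerstack idea, the sketch below classifies a pixel by matching its NDVI time series against reference temporal profiles. All profiles and the matching rule are hypothetical; an operational candidate solution would use actual VIIRS vegetation index products and richer phenology metrics.

```python
# Hypothetical sketch: classify a pixel's crop type by comparing its
# hypertemporal NDVI profile to reference profiles via sum of squared
# differences. The reference curves below are invented for illustration.

def classify(profile, references):
    def sse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(references, key=lambda name: sse(profile, references[name]))

references = {
    "corn":    [0.2, 0.4, 0.7, 0.8, 0.5, 0.3],
    "soybean": [0.2, 0.3, 0.5, 0.8, 0.7, 0.4],
    "fallow":  [0.2, 0.2, 0.3, 0.3, 0.2, 0.2],
}
pixel = [0.21, 0.38, 0.68, 0.79, 0.52, 0.31]   # one pixel's seasonal NDVI
print(classify(pixel, references))   # -> corn
```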

  4. What are the effects of Agro-Ecological Zones and land use region boundaries on land resource projection using the Global Change Assessment Model?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Vittorio, Alan V.; Kyle, Page; Collins, William D.

    Understanding the potential impacts of climate change is complicated by mismatched spatial representations between gridded Earth System Models (ESMs) and Integrated Assessment Models (IAMs), whose regions are typically larger and defined by geopolitical and biophysical criteria. In this study we address uncertainty stemming from the construction of land use regions in an IAM, the Global Change Assessment Model (GCAM), whose regions are currently based on historical climatic conditions (1961-1990). We re-define GCAM’s regions according to projected climatic conditions (2070-2099), and investigate how this changes model outcomes for land use, agriculture, and forestry. By 2100, we find potentially large differences in projected global and regional area of biomass energy crops, fodder crops, harvested forest, and intensive pasture. These land area differences correspond with changes in agricultural commodity prices and production. These results have broader implications for understanding policy scenarios and potential impacts, and for evaluating and comparing IAM and ESM simulations.

  5. Plasma distribution and spacecraft charging modeling near Jupiter

    NASA Technical Reports Server (NTRS)

    Goldstein, R.; Divine, N.

    1977-01-01

    To assess the role of spacecraft charging near Jupiter, the plasma distribution in Jupiter's magnetosphere was modeled using data from the plasma analyzer experiments on Pioneer 10 (published results) and on Pioneer 11 (preliminary results). In the model, electron temperatures are kT = 4 eV throughout, whereas proton temperatures span the range 100 eV ≤ kT ≤ 400 eV. The model fluxes and concentrations vary over three orders of magnitude among several corotating regions, including, in order of increasing distance from Jupiter, a plasma void, plasmasphere, sporadic zone, ring current, current sheet, high-latitude plasma, and magnetosheath. Intermediate and high energy electrons and protons (to 100 MeV) are modeled as well. The models supply the information for calculating particle fluxes to a spacecraft in the Jovian environment. The particle balance equations (including effects of secondary and photoemission) then determine the spacecraft potential.
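    The final step, balancing collected currents to find the equilibrium potential, admits a one-line estimate under strong simplifications. Assuming retarded (Maxwellian) electron collection, a roughly constant ion current, and no secondary or photoemission, a negatively charged surface floats where the two currents cancel; the current magnitudes below are illustrative, not values from the Pioneer-based model.

```python
import math

# Hedged sketch of the current balance that sets a spacecraft's floating
# potential: electron collection falls off exponentially as the surface
# charges negative, and equilibrium is where electron and ion currents
# cancel: Ie0 * exp(V / kTe) = Ii0, with V <= 0.

def floating_potential(Ie0, Ii0, kTe_eV):
    """Analytic solution of the retarded-electron balance, in volts."""
    return kTe_eV * math.log(Ii0 / Ie0)

V = floating_potential(Ie0=1.0, Ii0=0.02, kTe_eV=4.0)   # kTe = 4 eV as in the model
print(round(V, 2))   # -> -15.65
```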

  6. Use of dispersion modelling for Environmental Impact Assessment of biological air pollution from composting: Progress, problems and prospects.

    PubMed

    Douglas, P; Hayes, E T; Williams, W B; Tyrrel, S F; Kinnersley, R P; Walsh, K; O'Driscoll, M; Longhurst, P J; Pollard, S J T; Drew, G H

    2017-12-01

    With the increase in composting as a sustainable waste management option, biological air pollution (bioaerosols) from composting facilities has become a cause of increasing concern due to its potential health impacts. Estimating community exposure to bioaerosols is problematic due to limitations in current monitoring methods. Atmospheric dispersion modelling can be used to estimate exposure concentrations; however, several issues arise from the lack of appropriate bioaerosol data to use as inputs into models, and from the complexity of the emission sources at composting facilities. This paper analyses current progress in using dispersion models for bioaerosols, examines the remaining problems and provides recommendations for future prospects in this area. A key finding is the urgent need for guidance for model users to ensure consistent bioaerosol modelling practices. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
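    For orientation, dispersion models of the kind discussed are often built on the Gaussian plume solution. The sketch below evaluates the standard ground-level form for an elevated point source; the emission rate, release height, and dispersion coefficients are invented, and in practice sigma_y and sigma_z would be functions of downwind distance and atmospheric stability rather than constants.

```python
import math

# Illustrative Gaussian plume calculation of the kind used in atmospheric
# dispersion modelling; all parameter values are hypothetical, not measured
# bioaerosol data.

def ground_level_conc(Q, u, y, H, sigma_y, sigma_z):
    """Ground-level (z = 0) concentration downwind of an elevated point source.
    Q: emission rate, u: wind speed, y: crosswind offset, H: release height.
    sigma_y / sigma_z would normally depend on downwind distance and stability."""
    lateral  = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = 2 * math.exp(-H**2 / (2 * sigma_z**2))   # image-source reflection at ground
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

c = ground_level_conc(Q=1e4, u=3.0, y=0.0, H=2.0, sigma_y=35.0, sigma_z=18.0)
print(f"{c:.4f}")   # concentration in source units per m^3
```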

  7. How Do Tides and Tsunamis Interact in a Highly Energetic Channel? The Case of Canal Chacao, Chile

    NASA Astrophysics Data System (ADS)

    Winckler, Patricio; Sepúlveda, Ignacio; Aron, Felipe; Contreras-López, Manuel

    2017-12-01

    This study aims at understanding the role of tidal level, speed, and direction in tsunami propagation in highly energetic tidal channels. The main goal is to comprehend whether tide-tsunami interactions enhance/reduce elevations, current speeds, and arrival times, when compared to pure tsunami models and to simulations in which tides and tsunamis are linearly superimposed. We designed various numerical experiments to compute the tsunami propagation along Canal Chacao, a highly energetic channel in the Chilean Patagonia lying on a subduction margin prone to megathrust earthquakes. Three modeling approaches were implemented under the same seismic scenario: a tsunami model with a constant tide level, a series of six composite models in which independent tide and tsunami simulations are linearly superimposed, and a series of six tide-tsunami nonlinear interaction models (full models). We found that hydrodynamic patterns differ significantly among approaches, with the composite and full models being sensitive to both the tidal phase at which the tsunami is triggered and the local depth of the channel. When compared to full models, composite models adequately predicted the maximum surface elevation, but largely overestimated currents. The amplitude and arrival time of the tsunami-leading wave computed with the full model was found to be strongly dependent on the direction of the tidal current and less responsive to the tide level and the tidal current speed. These outcomes emphasize the importance of addressing more carefully the interactions of tides and tsunamis in hazard assessment studies.

  8. Site descriptive modeling as a part of site characterization in Sweden - Concluding the surface based investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, Johan; Winberg, Anders; Skagius, Kristina

    The Swedish Nuclear Fuel and Waste Management Co., SKB, is currently finalizing its surface-based site investigations for the final repository for spent nuclear fuel in the municipalities of Oesthammar (the Forsmark area) and Oskarshamn (the Simpevarp/Laxemar area). The investigation data are assessed into a Site Descriptive Model, constituting a synthesis of geology, rock mechanics, thermal properties, hydrogeology, hydro-geochemistry, transport properties and a surface system description. Site data constitute a wide range of different measurement results. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modeling. The three-dimensional modeling (i.e. estimating the distribution of parameter values in space) is made in a sequence where the geometrical framework is taken from the geological models and in turn used by the rock mechanics, thermal and hydrogeological modeling. These disciplines in turn are partly interrelated, and also provide feedback to the geological modeling, especially if the geological description appears unreasonable when assessed together with the other data. Procedures for assessing the uncertainties and the confidence in the modeling have been developed during the course of the site modeling. These assessments also provide key input to the completion of the site investigation program. (authors)

  9. Mental State Assessment and Validation Using Personalized Physiological Biometrics

    PubMed Central

    Patel, Aashish N.; Howard, Michael D.; Roach, Shane M.; Jones, Aaron P.; Bryant, Natalie B.; Robinson, Charles S. H.; Clark, Vincent P.; Pilly, Praveen K.

    2018-01-01

    Mental state monitoring is a critical component of current and future human-machine interfaces, including semi-autonomous driving and flying, air traffic control, decision aids, training systems, and will soon be integrated into ubiquitous products like cell phones and laptops. Current mental state assessment approaches supply quantitative measures, but their only frame of reference is generic population-level ranges. What is needed are physiological biometrics that are validated in the context of task performance of individuals. Using curated intake experiments, we are able to generate personalized models of three key biometrics as useful indicators of mental state; namely, mental fatigue, stress, and attention. We demonstrate improvements to existing approaches through the introduction of new features. Furthermore, addressing the current limitations in assessing the efficacy of biometrics for individual subjects, we propose and employ a multi-level validation scheme for the biometric models by means of k-fold cross-validation for discrete classification and regression testing for continuous prediction. The paper not only provides a unified pipeline for extracting a comprehensive mental state evaluation from a parsimonious set of sensors (only EEG and ECG), but also demonstrates the use of validation techniques in the absence of empirical data. Furthermore, as an example of the application of these models to novel situations, we evaluate the significance of correlations of personalized biometrics to the dynamic fluctuations of accuracy and reaction time on an unrelated threat detection task using a permutation test. Our results provide a path toward integrating biometrics into augmented human-machine interfaces in a judicious way that can help to maximize task performance.
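    The permutation test used to assess correlations between personalized biometrics and task performance can be sketched simply. The data below are synthetic stand-ins for a fatigue score and task accuracy; only the testing logic reflects the method named in the abstract.

```python
import random

# Minimal permutation test for a correlation between a biometric score and
# task accuracy (synthetic data; the real study's features are far richer).

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def perm_test(x, y, n_perm=2000, seed=0):
    """p-value for |r| under the null that x and y are unrelated."""
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    hits = 0
    for _ in range(n_perm):
        shuffled = y[:]
        rng.shuffle(shuffled)          # break any real pairing
        if abs(pearson(x, shuffled)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one-smoothed p-value

fatigue  = [0.1, 0.3, 0.2, 0.6, 0.8, 0.9, 0.5, 0.7]
accuracy = [0.95, 0.90, 0.93, 0.80, 0.72, 0.70, 0.85, 0.78]
print(perm_test(fatigue, accuracy))   # small p -> correlation unlikely by chance
```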

  10. Mental State Assessment and Validation Using Personalized Physiological Biometrics.

    PubMed

    Patel, Aashish N; Howard, Michael D; Roach, Shane M; Jones, Aaron P; Bryant, Natalie B; Robinson, Charles S H; Clark, Vincent P; Pilly, Praveen K

    2018-01-01

    Mental state monitoring is a critical component of current and future human-machine interfaces, including semi-autonomous driving and flying, air traffic control, decision aids, training systems, and will soon be integrated into ubiquitous products like cell phones and laptops. Current mental state assessment approaches supply quantitative measures, but their only frame of reference is generic population-level ranges. What is needed are physiological biometrics that are validated in the context of task performance of individuals. Using curated intake experiments, we are able to generate personalized models of three key biometrics as useful indicators of mental state; namely, mental fatigue, stress, and attention. We demonstrate improvements to existing approaches through the introduction of new features. Furthermore, addressing the current limitations in assessing the efficacy of biometrics for individual subjects, we propose and employ a multi-level validation scheme for the biometric models by means of k-fold cross-validation for discrete classification and regression testing for continuous prediction. The paper not only provides a unified pipeline for extracting a comprehensive mental state evaluation from a parsimonious set of sensors (only EEG and ECG), but also demonstrates the use of validation techniques in the absence of empirical data. Furthermore, as an example of the application of these models to novel situations, we evaluate the significance of correlations of personalized biometrics to the dynamic fluctuations of accuracy and reaction time on an unrelated threat detection task using a permutation test. Our results provide a path toward integrating biometrics into augmented human-machine interfaces in a judicious way that can help to maximize task performance.

  11. Birth weight, current anthropometric markers, and high sensitivity C-reactive protein in Brazilian school children.

    PubMed

    Boscaini, Camile; Pellanda, Lucia Campos

    2015-01-01

    Studies have shown associations of birth weight with increased concentrations of high sensitivity C-reactive protein. This study assessed the relationship between birth weight, anthropometric and metabolic parameters during childhood, and high sensitivity C-reactive protein. A total of 612 Brazilian school children aged 5-13 years were included in the study. High sensitivity C-reactive protein was measured by particle-enhanced immunonephelometry. Nutritional status was assessed by body mass index, waist circumference, and skinfolds. Total cholesterol and fractions, triglycerides, and glucose were measured by enzymatic methods. Insulin sensitivity was determined by the homeostasis model assessment method. Statistical analysis included the chi-square test, the General Linear Model, and the General Linear Model for Gamma Distribution. Body mass index, waist circumference, and skinfolds were directly associated with birth weight (P < 0.001, P = 0.001, and P = 0.015, respectively). Large-for-gestational-age children showed higher high sensitivity C-reactive protein levels (P < 0.001) than small-for-gestational-age children. High birth weight is associated with higher levels of high sensitivity C-reactive protein, body mass index, waist circumference, and skinfolds. Being large for gestational age was associated with altered high sensitivity C-reactive protein and constituted an additional risk factor for atherosclerosis in these school children, independent of current nutritional status.
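    The homeostasis model assessment mentioned above reduces to a simple formula. The conventional HOMA-IR form for fasting glucose in mg/dL and insulin in µU/mL is shown (use a divisor of 22.5 with glucose in mmol/L); the input values are illustrative, not study data.

```python
# HOMA-IR: a widely used index of insulin resistance computed from fasting
# glucose and fasting insulin. Divisor 405 applies to glucose in mg/dL.

def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uU_ml):
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

print(round(homa_ir(90, 10), 2))   # -> 2.22
```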

  12. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
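    A minimal version of the landscape-based cumulative effects step, multiple linear regression from stressor gradients to an aquatic-condition score, can be sketched with ordinary least squares. The land use percentages, biotic index values, and future scenario below are all invented; the real assessment is built from field and laboratory data.

```python
# Sketch of a landscape-based cumulative effects model: fit a multiple
# linear regression predicting an aquatic-condition score from land use
# stressor gradients, then predict a hypothetical future scenario.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_mlr(rows, y):
    """Ordinary least squares via the normal equations (intercept added)."""
    X = [[1.0] + row for row in rows]
    k, n = len(X[0]), len(X)
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    return solve(XtX, Xty)

# % mining and % residential land use vs. a biotic index (invented data)
land_use = [[5, 2], [10, 4], [20, 5], [30, 10], [40, 12], [50, 20]]
index    = [90, 82, 70, 55, 45, 25]
b0, b1, b2 = fit_mlr(land_use, index)
predict = lambda mining, resid: b0 + b1 * mining + b2 * resid
print(round(predict(25, 8), 1))   # predicted condition under a future scenario
```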

  13. Comparison of the Current Center of Site Annual Neshap Dose Modeling at the Savannah River Site with Other Assessment Methods.

    PubMed

    Minter, Kelsey M; Jannik, G Timothy; Stagich, Brooke H; Dixon, Kenneth L; Newton, Joseph R

    2018-04-01

    The U.S. Environmental Protection Agency (EPA) requires the use of the model CAP88 to estimate the total effective dose (TED) to an offsite maximally exposed individual (MEI) for demonstrating compliance with 40 CFR 61, Subpart H: The National Emission Standards for Hazardous Air Pollutants (NESHAP) regulations. For NESHAP compliance at the Savannah River Site (SRS), the EPA, the U.S. Department of Energy (DOE), South Carolina's Department of Health and Environmental Control, and SRS approved a dose assessment method in 1991 that models all radiological emissions as if originating from a generalized center of site (COS) location at two allowable stack heights (0 m and 61 m). However, due to changes in SRS missions, radiological emissions are no longer evenly distributed about the COS. An area-specific simulation of the 2015 SRS radiological airborne emissions was conducted to compare to the current COS method. The results produced a slightly higher dose estimate (2.97 × 10 mSv vs. 2.22 × 10 mSv), marginally changed the overall MEI location, and noted that H-Area tritium emissions dominated the dose. Thus, an H-Area dose model was executed as a potential simplification of the area-specific simulation by adopting the COS methodology and modeling all site emissions from a single location in H-Area using six stack heights that reference stacks specific to the tritium production facilities within H-Area. This "H-Area Tritium Stacks" method produced a small increase in TED estimates (3.03 × 10 mSv vs. 2.97 × 10 mSv) when compared to the area-specific simulation. This suggests that the current COS method is still appropriate for demonstrating compliance with NESHAP regulations but that changing to the H-Area Tritium Stacks assessment method may now be a more appropriate representation of operations at SRS.

  14. CAD/CAE-technologies application for assessment of passenger safety on railway transport in emergency

    NASA Astrophysics Data System (ADS)

    Antipin, D. Ya; Shorokhov, S. G.; Bondarenko, O. I.

    2018-03-01

    The possibility of using current software products implementing CAD/CAE technologies for the assessment of passenger safety in railway emergencies has been analyzed. On the basis of the developed solid computer model of an anthropometric dummy, the authors carried out an analysis of possible levels of passenger injury during an accidental collision of a train with an obstacle.

  15. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 2: An Assessment of the Current State-of-the-Art

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Results of a state-of-the-art assessment of technology areas which affect the Earth Resources Program are presented along with a functional description of the basic earth resources system. Major areas discussed include: spacecraft flight hardware, remote sensors, data processing techniques and hardware, user models, user interfaces, and operations technology.

  16. Assessing the Utility of Item Response Theory Models: Differential Item Functioning.

    ERIC Educational Resources Information Center

    Scheuneman, Janice Dowd

    The current status of item response theory (IRT) is discussed. Several IRT methods exist for assessing whether an item is biased. Focus is on methods proposed by L. M. Rudner (1975), F. M. Lord (1977), D. Thissen et al. (1988) and R. L. Linn and D. Harnisch (1981). Rudner suggested a measure of the area lying between the two item characteristic…

  17. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  18. Annual Research Review: Embracing Not Erasing Contextual Variability in Children's Behavior--Theory and Utility in the Selection and Use of Methods and Informants in Developmental Psychopathology

    ERIC Educational Resources Information Center

    Dirks, Melanie A.; De Los Reyes, Andres; Briggs-Gowan, Margaret; Cella, David; Wakschlag, Lauren S.

    2012-01-01

    This paper examines the selection and use of multiple methods and informants for the assessment of disruptive behavior syndromes and attention deficit/hyperactivity disorder, providing a critical discussion of (a) the bidirectional linkages between theoretical models of childhood psychopathology and current assessment techniques; and (b) current…

  19. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. 
For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
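    In the same spirit, a crude global sensitivity screen can be obtained by sampling the uncertain inputs, running the model, and ranking inputs by squared correlation with the output. This is only a stand-in for the variance-based tools named above (MARS, FAST), and the toy model function below is invented.

```python
import random

# Sampling-based sensitivity screening: rank inputs by squared Pearson
# correlation with the model output. A crude surrogate for variance-based
# global sensitivity analysis; the "model" is a toy function.

def model(x1, x2, x3):
    return 5 * x1 + 0.5 * x2 ** 2 + 0.1 * x3   # x1 dominates by construction

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(42)
samples = [(rng.random(), rng.random(), rng.random()) for _ in range(2000)]
outputs = [model(*s) for s in samples]

r2 = lambda i: pearson([s[i] for s in samples], outputs) ** 2
ranking = sorted(range(3), key=r2, reverse=True)
print(ranking)   # input 0 (x1) is expected to rank first
```

    Correlation-based rankings miss interactions and strong non-linearities, which is exactly why methods such as MARS and FAST are brought in for the high-dimensional, non-monotonic cases the paragraph describes.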

  20. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    PubMed Central

    Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-01-01

    In this study, current geographic information system (GIS)-based methods and their application to the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. Each type of mining-induced hazard was classified into two or three subtopics according to the steps involved in the reclamation procedure or the elements of the hazard of interest. Because GIS is well suited to handling geospatial data in relation to mining-induced hazards, the application of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further. PMID:29186922
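    As one concrete example of the soil-erosion modeling such reviews cover, GIS assessments frequently evaluate the (Revised) Universal Soil Loss Equation, A = R · K · LS · C · P, per raster cell. The factor values below are invented for illustration, not taken from the review.

```python
# (R)USLE screening estimate of average annual soil loss, as commonly
# computed per grid cell in GIS-based erosion assessments.

def rusle(R, K, LS, C, P):
    """A = R*K*LS*C*P: rainfall erosivity, soil erodibility, slope
    length-steepness, cover management, and support practice factors."""
    return R * K * LS * C * P

A = rusle(R=800, K=0.3, LS=1.2, C=0.2, P=0.9)
print(round(A, 1))   # -> 51.8 (e.g., t/ha/yr, units follow the factors)
```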
